# PaLM - Pytorch

Implementation of the specific Transformer architecture from [PaLM: Scaling Language Modeling with Pathways](https://arxiv.org/abs/2204.02311), in less than 200 lines of code.

At the time of publication, PaLM achieved state-of-the-art results across a broad range of language benchmarks.

This implementation obviously will not scale to the published model sizes; it is for educational purposes, to show the public how simple the architecture really is. Its most distinctive piece, the parallel layer formulation, is sketched below.
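In PaLM's parallel formulation, the attention and feed-forward branches both read from the same normalized input and are summed into the residual, rather than being applied one after the other. Here is a minimal sketch of that layer structure, using stock PyTorch multi-head attention and a GELU MLP standing in for PaLM's multi-query attention and SwiGLU (a hypothetical class, not the code in `palm_pytorch`):

```python
import torch
from torch import nn

class ParallelBlock(nn.Module):
    """Parallel formulation from the PaLM paper:
        y = x + Attn(LayerNorm(x)) + MLP(LayerNorm(x))
    versus the usual sequential
        y = h + MLP(LayerNorm(h)),  h = x + Attn(LayerNorm(x))
    """
    def __init__(self, dim: int, heads: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * 4),
            nn.GELU(),  # PaLM itself uses SwiGLU; GELU keeps the sketch short
            nn.Linear(dim * 4, dim),
        )

    def forward(self, x):
        h = self.norm(x)  # one shared LayerNorm feeds both branches
        attn_out, _ = self.attn(h, h, h, need_weights=False)  # causal mask omitted for brevity
        return x + attn_out + self.mlp(h)
```

The paper reports that this formulation trains roughly 15% faster at large scale, since the attention and MLP input projections can be fused into single matrix multiplications.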

## Install

```bash
$ pip install PaLM-pytorch
```

## Usage

```python
import torch
from palm_pytorch import PaLM

palm = PaLM(
    num_tokens = 20000,
    dim = 512,
    depth = 12,
    heads = 8,
    dim_head = 64,
)

tokens = torch.randint(0, 20000, (1, 2048))
logits = palm(tokens)  # (1, 2048, 20000)
```
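The logits drop straight into a standard next-token cross-entropy loss. A minimal sketch, reusing the `palm` and `tokens` from above (targets are the inputs shifted by one position):

```python
import torch.nn.functional as F

inp, target = tokens[:, :-1], tokens[:, 1:]   # predict each following token
logits = palm(inp)                            # (1, 2047, 20000)

loss = F.cross_entropy(
    logits.reshape(-1, logits.size(-1)),      # flatten to (batch * seq, vocab)
    target.reshape(-1),
)
loss.backward()
```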

The 540-billion-parameter PaLM from the paper would be:

```python
palm = PaLM(
    num_tokens = 256000,
    dim = 18432,
    depth = 118,
    heads = 48,
    dim_head = 256,
)
```
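As a sanity check on the 540B figure, a back-of-the-envelope parameter count from this configuration lands almost exactly on the paper's number. The breakdown below assumes the components the paper describes (multi-query attention, SwiGLU with a 4x expansion, embeddings shared with the output head); exact totals depend on details such as biases that this sketch ignores:

```python
dim, depth, heads, dim_head, vocab = 18432, 118, 48, 256, 256_000
inner = heads * dim_head                      # 12288, the attention inner dimension
ffn = 4 * dim                                 # feed-forward width

attn = 2 * dim * inner + 2 * dim * dim_head   # Q/out at full width; K/V single-head (multi-query)
mlp = 3 * dim * ffn                           # SwiGLU: two up-projections plus one down-projection
embed = vocab * dim                           # input embedding, shared with the output projection

total = depth * (attn + mlp) + embed
print(f"{total / 1e9:.0f}B parameters")       # prints 540B
```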

## Test on Enwik8

```bash
$ python train.py
```
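For orientation, the core of byte-level enwik8 training reduces to sampling random contiguous byte windows and minimizing next-byte cross-entropy. The standalone sketch below is illustrative only; the model size, data path, optimizer, and hyperparameters are all assumptions, not what `train.py` actually does:

```python
import torch
import torch.nn.functional as F
from palm_pytorch import PaLM

SEQ_LEN, BATCH, STEPS = 1024, 4, 1000                # hypothetical hyperparameters
device = "cuda" if torch.cuda.is_available() else "cpu"

# raw bytes double as token ids, so the vocabulary is just 256
model = PaLM(num_tokens=256, dim=512, depth=8, heads=8, dim_head=64).to(device)
optim = torch.optim.Adam(model.parameters(), lr=2e-4)

with open("data/enwik8", "rb") as f:                 # hypothetical path to the raw dump
    data = torch.frombuffer(bytearray(f.read()), dtype=torch.uint8).long()

for step in range(STEPS):
    # sample a random batch of contiguous byte sequences
    ix = torch.randint(len(data) - SEQ_LEN - 1, (BATCH,))
    batch = torch.stack([data[i : i + SEQ_LEN + 1] for i in ix.tolist()]).to(device)
    inp, target = batch[:, :-1], batch[:, 1:]        # next-byte prediction

    loss = F.cross_entropy(model(inp).reshape(-1, 256), target.reshape(-1))
    loss.backward()
    optim.step()
    optim.zero_grad()
```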

## Todo

## Citations

```bibtex
@article{chowdhery2022PaLM,
  title   = {PaLM: Scaling Language Modeling with Pathways},
  author  = {Chowdhery, Aakanksha and others},
  journal = {arXiv preprint arXiv:2204.02311},
  year    = {2022}
}
```