ColossalAI/examples/language/llama2/requirements.txt

colossalai>=0.3.2
datasets
numpy
torch>=1.12.0,<=2.0.0
tqdm
transformers
flash-attn>=2.0.0,<=2.0.5
SentencePiece==0.1.99
tensorboard==2.14.0