# ColossalAI/applications/ColossalChat/requirements.txt

transformers==4.34.1
huggingface_hub==0.17.3
tqdm
datasets
loralib
colossalai>=0.3.6
torch>=1.12.1
langchain
tokenizers
fastapi
sse_starlette
wandb
gpustat
packaging
autoflake==2.2.1
black==23.9.1
tensorboard
six==1.16.0
ninja==1.11.1
sentencepiece==0.1.99
flash-attn
tiktoken