# ColossalAI/applications/ColossalChat/requirements.txt
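# Install with: pip install -r requirements.txt
# Note: flash-attn generally requires a CUDA toolchain and may need to be built separately.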

transformers==4.39.3
tqdm
datasets==2.14.7
loralib
colossalai==0.4.0
torch>=2.1.0
langchain
tokenizers
fastapi
sse_starlette
wandb
gpustat
packaging
autoflake==2.2.1
black==23.9.1
tensorboard
six==1.16.0
ninja==1.11.1
sentencepiece==0.1.99
flash-attn
tiktoken