# mirror of https://github.com/hpcaitech/ColossalAI
transformers==4.34.1
huggingface_hub==0.17.3
tqdm
datasets
loralib
colossalai>=0.3.6
torch>=1.12.1
langchain
tokenizers
fastapi
sse_starlette
wandb
sentencepiece
gpustat
packaging
autoflake==2.2.1
black==23.9.1
tensorboard
six==1.16.0
datasets
ninja==1.11.1
sentencepiece==0.1.99
flash-attn
tiktoken
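These dependencies can be installed with pip in the usual way (a sketch; note that `flash-attn` typically requires a CUDA toolchain to be available at build time, and `torch` is often installed separately first so that `flash-attn` can compile against it):

```shell
# Run from the directory containing this requirements file.
pip install -r requirements.txt
```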