mirror of https://github.com/hpcaitech/ColossalAI
torch<2.0.0, >=1.12.1
packaging==23.1
colossalai==0.3.2
autoflake==2.2.1
black==23.9.1
transformers
tensorboard==2.14.0
six==1.16.0
datasets
ninja==1.11.1
flash-attn>=2.0.0,<=2.0.5
tqdm
sentencepiece==0.1.99
protobuf<=3.20.0
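If this pin list is saved as a pip requirements file (for example as requirements.txt, an assumed filename since the file name is not shown above), the dependencies would typically be installed into an existing Python environment with:

    pip install -r requirements.txt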