mirror of https://github.com/hpcaitech/ColossalAI
Contents of the pinned dependency file (requirements):

torch==2.1.2
huggingface-hub
packaging==24.0
colossalai==0.3.6
autoflake==2.2.1
black==23.9.1
transformers==4.34.1
tensorboard==2.14.0
six==1.16.0
datasets
ninja==1.11.1
flash-attn>=2.0.0,<=2.0.5
tqdm
sentencepiece==0.1.99
protobuf<=3.20.0
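Most entries above pin an exact version with `==`, while `flash-attn>=2.0.0,<=2.0.5` constrains a range and `datasets`/`tqdm`/`huggingface-hub` are unpinned. As a minimal sketch of how such lines decompose, the hypothetical helper below (not part of the repository) splits a requirement line into its package name and version-specifier parts using only the standard library:

```python
import re

def parse_requirement(line):
    """Split a requirements line like 'flash-attn>=2.0.0,<=2.0.5'
    into (name, specifier). The specifier is '' for unpinned entries.
    Hypothetical helper for illustration; real tooling should use a
    full PEP 508 parser."""
    match = re.match(r"^([A-Za-z0-9._-]+)\s*(.*)$", line.strip())
    return match.group(1), match.group(2)

for line in ["torch==2.1.2", "flash-attn>=2.0.0,<=2.0.5", "datasets"]:
    print(parse_requirement(line))
```

For anything beyond illustration (environment markers, extras, URLs), prefer the `packaging.requirements.Requirement` class from the `packaging` library, which is itself one of the pinned dependencies above.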