mirror of https://github.com/hpcaitech/ColossalAI
Dependencies (requirements.txt):

colossalai>=0.3.6
datasets
numpy
tqdm
transformers
flash-attn>=2.0.0
SentencePiece==0.1.99
tensorboard==2.14.0
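Since some of these dependencies are pinned exactly (SentencePiece, tensorboard) while others only set a floor (colossalai, flash-attn), a quick environment check can catch mismatches before training. The sketch below uses only the standard library; `check_pins` is a hypothetical helper, not part of ColossalAI.

```python
# Minimal sketch: report the installed version of each pinned dependency.
# check_pins and PINS are illustrative names, not ColossalAI APIs.
from importlib import metadata

PINS = {
    "colossalai": ">=0.3.6",
    "flash-attn": ">=2.0.0",
    "sentencepiece": "==0.1.99",
    "tensorboard": "==2.14.0",
}

def check_pins(pins=PINS):
    """Return one status line per pinned package."""
    lines = []
    for name, spec in pins.items():
        try:
            # metadata.version looks up the installed distribution's version
            lines.append(f"{name} {metadata.version(name)} (required {spec})")
        except metadata.PackageNotFoundError:
            lines.append(f"{name} not installed (required {spec})")
    return lines

for line in check_pins():
    print(line)
```

The unpinned packages (datasets, numpy, tqdm, transformers) are resolved freely by pip, so they are omitted from the check.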