mirror of https://github.com/hpcaitech/ColossalAI
Topics: ai, big-model, data-parallelism, deep-learning, distributed-computing, foundation-models, heterogeneous-training, hpc, inference, large-scale, model-parallelism, pipeline-parallelism
Latest commit: 2c45efc398 by Liang Bowen, 3 years ago

| File | Last commit |
|---|---|
| Colossalai Homepage.rst | 3 years ago |
| Colossalai benchmarks.rst | 3 years ago |
| Colossalai examples.rst | 3 years ago |
| Colossalai tutorial.rst | 3 years ago |