Colossal-AI Benchmarks
======================

*If you are interested in the performance or the features of Colossal-AI, please check*
`Colossal-AI Benchmark <https://github.com/hpcaitech/ColossalAI-Benchmark>`_
*to get more details about our performance on CIFAR10, ImageNet1K, or GPT-2 with ZeRO.*