Making large AI models cheaper, faster and more accessible

Colossal-AI Benchmarks
==================================
*If you are interested in the performance or the features of Colossal-AI, please check*
`Colossal-AI Benchmark <https://github.com/hpcaitech/ColossalAI-Benchmark>`_
*to get more details about our performance on CIFAR10, ImageNet1K, or GPT2 with ZeRO.*