# Basic MNIST Example with Optional FP8 via TransformerEngine
TransformerEngine is a library for accelerating Transformer models on NVIDIA GPUs. It includes support for 8-bit floating point (FP8) precision on Hopper GPUs, which provides better performance and lower memory utilization in both training and inference.
Thanks to NVIDIA for contributing this tutorial.
```bash
python main.py             # baseline with torch.nn.Linear
python main.py --use-te    # Linear layers from TransformerEngine
python main.py --use-fp8   # FP8 + TransformerEngine for Linear layers
```
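For context, the sketch below shows what these flags typically toggle in a script like `main.py`: `--use-te` swaps `torch.nn.Linear` for `te.Linear` as a drop-in replacement, and `--use-fp8` additionally runs the forward pass inside `te.fp8_autocast`. The model shape and FP8 recipe here are illustrative assumptions, not the actual contents of `main.py`.

```python
# Minimal sketch, assuming a simple MLP classifier (not the real main.py model).
import torch
import torch.nn as nn
import transformer_engine.pytorch as te
from transformer_engine.common.recipe import DelayedScaling, Format


class Net(nn.Module):
    def __init__(self, use_te: bool = False):
        super().__init__()
        # te.Linear is a drop-in replacement for torch.nn.Linear.
        Linear = te.Linear if use_te else nn.Linear
        # Assumption: dimensions are kept as multiples of 16 to satisfy
        # FP8 GEMM shape constraints (hence 16 outputs instead of 10;
        # for MNIST only the first 10 logits would be used).
        self.fc1 = Linear(784, 128)
        self.fc2 = Linear(128, 16)

    def forward(self, x):
        x = torch.relu(self.fc1(x.view(x.size(0), -1)))
        return self.fc2(x)


model = Net(use_te=True).cuda()
x = torch.randn(32, 1, 28, 28, device="cuda")

# FP8 execution requires a Hopper (or newer) GPU.
# DelayedScaling with the HYBRID format (E4M3 forward, E5M2 backward)
# is TransformerEngine's standard FP8 recipe.
fp8_recipe = DelayedScaling(fp8_format=Format.HYBRID)
with te.fp8_autocast(enabled=True, fp8_recipe=fp8_recipe):
    out = model(x)
```

Without `--use-fp8`, the same `te.Linear` layers simply run in the usual higher precision, so the FP8 path is purely opt-in.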
We are working on integrating TransformerEngine with Colossal-AI and will complete the integration soon.