[tutorial] update notes for TransformerEngine (#3098)

binmakeswell 2023-03-10 16:30:52 +08:00 committed by GitHub
parent 65a4dbda6c
commit 018936a3f3
1 changed file with 7 additions and 1 deletion


@@ -1,7 +1,13 @@
# Basic MNIST Example with optional FP8
# Basic MNIST Example with optional FP8 using TransformerEngine
[TransformerEngine](https://github.com/NVIDIA/TransformerEngine) is a library for accelerating Transformer models on NVIDIA GPUs. It includes support for 8-bit floating point (FP8) precision on Hopper GPUs, which provides better performance with lower memory utilization in both training and inference.
Thanks to NVIDIA for contributing this tutorial. A minimal sketch of the TransformerEngine API is shown after the commands below.
```bash
python main.py           # Plain PyTorch baseline
python main.py --use-te # Linear layers from TransformerEngine
python main.py --use-fp8 # FP8 + TransformerEngine for Linear layers
```
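
The snippet below is a minimal, illustrative sketch of how TransformerEngine's PyTorch API is typically used; it is not the code in `main.py`, and the layer sizes, batch size, and recipe settings are assumptions chosen to satisfy FP8's dimension requirements (multiples of 16).

```python
import torch
import transformer_engine.pytorch as te
from transformer_engine.common import recipe

# Illustrative model: TE Linear layers in place of torch.nn.Linear.
# 784 = 28*28 (a flattened MNIST image); all dims are multiples of 16.
model = torch.nn.Sequential(
    te.Linear(784, 256, bias=True),
    torch.nn.ReLU(),
    te.Linear(256, 128, bias=True),
).cuda()

# Delayed-scaling FP8 recipe; the settings here are illustrative.
fp8_recipe = recipe.DelayedScaling(margin=0, fp8_format=recipe.Format.HYBRID)

inp = torch.randn(32, 784, device="cuda")

# Eligible TE modules execute their GEMMs in FP8 inside this context.
with te.fp8_autocast(enabled=True, fp8_recipe=fp8_recipe):
    out = model(inp)

out.float().sum().backward()
```

This is the kind of code path the `--use-te` and `--use-fp8` flags switch between in the commands above.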
> We are working on integrating TransformerEngine with Colossal-AI and will complete the integration soon.