ColossalAI/colossalai/booster/mixed_precision
Latest commit: b366f1d99f by Jianghai, [NFC] Fix format for mixed precision (#4253), 2023-07-26 14:12:57 +08:00
* [NFC] polish colossalai/booster/mixed_precision/mixed_precision_base.py code style
__init__.py              [amp] Add naive amp demo (#3774)                                        2023-05-18 16:33:14 +08:00
bf16.py                  [booster] implemented mixed precision class (#3151)                     2023-03-17 11:00:15 +08:00
fp8.py                   [booster] implemented mixed precision class (#3151)                     2023-03-17 11:00:15 +08:00
fp16_apex.py             [API] add docstrings and initialization to apex amp, naive amp (#3783)  2023-05-23 15:17:24 +08:00
fp16_naive.py            [API] add docstrings and initialization to apex amp, naive amp (#3783)  2023-05-23 15:17:24 +08:00
fp16_torch.py            [booster] make optimizer argument optional for boost (#3993)            2023-06-15 17:38:42 +08:00
mixed_precision_base.py  [NFC] Fix format for mixed precision (#4253)                            2023-07-26 14:12:57 +08:00
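The layout suggests mixed_precision_base.py defines a common interface that the dtype-specific modules (bf16.py, fp8.py, fp16_apex.py, fp16_naive.py, fp16_torch.py) implement, with the optimizer argument made optional in #3993. A minimal sketch of that pattern; the class and method names below are assumptions inferred from the file names, not ColossalAI's actual API:

```python
from abc import ABC, abstractmethod

# Hypothetical sketch: names and signatures are inferred from the file
# listing above, not taken from ColossalAI's real implementation.

class MixedPrecision(ABC):
    """Base interface a dtype-specific policy (bf16, fp8, fp16) implements."""

    @abstractmethod
    def configure(self, model, optimizer=None):
        """Wrap a model (and optionally an optimizer) for low-precision training."""
        raise NotImplementedError

class BF16MixedPrecision(MixedPrecision):
    """Toy policy: records the target dtype instead of casting real tensors."""

    def configure(self, model, optimizer=None):
        # A real policy would cast parameters and wrap the optimizer step;
        # here the model is a plain dict so the sketch stays self-contained.
        model = dict(model, dtype="bf16")
        return model, optimizer

model, optimizer = BF16MixedPrecision().configure({"layers": 2})
print(model)  # {'layers': 2, 'dtype': 'bf16'}
```

Keeping `optimizer` optional lets inference-only callers configure a model without constructing an optimizer, which matches the intent of #3993.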