ColossalAI/colossalai/booster/mixed_precision
Latest commit 725af3eeeb by Wenhao Chen, 2023-06-15 17:38:42 +08:00
[booster] make optimizer argument optional for boost (#3993)
* feat: make optimizer optional in Booster.boost
* test: skip unet test if diffusers version > 0.10.2
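The headline change in this commit is that Booster.boost no longer requires an optimizer, which matters for inference-only boosting. A minimal single-process sketch under that assumption; the toy model and the launch host/port are illustrative, not taken from the listing:

    import torch.nn as nn
    import colossalai
    from colossalai.booster import Booster

    # Single-process launch for illustration; host/port are assumed values.
    colossalai.launch(config={}, rank=0, world_size=1, host='localhost', port=29500)

    # 'fp16' selects the torch.cuda.amp implementation in fp16_torch.py.
    booster = Booster(mixed_precision='fp16')

    # Toy model standing in for a real network.
    model = nn.Linear(8, 2).cuda()

    # Before #3993 boost() required an optimizer; after it, an inference-only
    # model can be boosted on its own. boost() returns the boosted
    # (model, optimizer, criterion, dataloader, lr_scheduler) tuple.
    model, *_ = booster.boost(model)

For training, the optimizer is still passed as before so it can be wrapped for loss scaling; only the inference-only path is new.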
__init__.py              [amp] Add naive amp demo (#3774)                                         2023-05-18 16:33:14 +08:00
bf16.py
fp8.py
fp16_apex.py             [API] add docstrings and initialization to apex amp, naive amp (#3783)   2023-05-23 15:17:24 +08:00
fp16_naive.py            [API] add docstrings and initialization to apex amp, naive amp (#3783)   2023-05-23 15:17:24 +08:00
fp16_torch.py            [booster] make optimizer argument optional for boost (#3993)             2023-06-15 17:38:42 +08:00
mixed_precision_base.py  [booster] make optimizer argument optional for boost (#3993)             2023-06-15 17:38:42 +08:00
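Each module in this listing implements one mixed precision mode behind the common base class in mixed_precision_base.py. A short sketch of how a mode is selected through Booster, assuming the package exports mirror the file names (the FP16TorchMixedPrecision class and the 'fp16' string mapping are assumptions based on fp16_torch.py):

    from colossalai.booster import Booster
    from colossalai.booster.mixed_precision import FP16TorchMixedPrecision

    # Two presumably equivalent ways to pick the torch AMP mode; the string
    # names ('fp16', 'bf16', 'fp8', ...) are assumed to map to the modules above.
    booster_from_str = Booster(mixed_precision='fp16')
    booster_from_obj = Booster(mixed_precision=FP16TorchMixedPrecision())

Passing the class instance rather than the string would allow tuning plugin-specific options such as loss-scaling parameters, while the string form is the convenient default.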