ColossalAI/colossalai/amp
Latest commit 261eab02fb by Hongxin Liu, 2023-08-15 23:25:14 +08:00:
[plugin] add 3d parallel plugin (#4295)

* [amp] add mixed precision optimizer
* [plugin] add 3d parallel plugin
* [booster] support pipeline
* [plugin] 3d parallel plugin support clip grad norm
* [shardformer] fix sharder and add plugin test
* [plugin] rename 3d parallel plugin
* [ci] support testmon core pkg change detection (#4305)
* [hotfix] debug testmon
* [hotfix] fix llama
* [hotfix] fix p2p bugs
* [hotfix] fix requirements
apex_amp/    [test] fixed the triton version for testing (#2608)                 2023-02-07 13:49:38 +08:00
naive_amp/   [plugin] add 3d parallel plugin (#4295)                             2023-08-15 23:25:14 +08:00
torch_amp/   [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756)       2023-05-19 13:50:00 +08:00
__init__.py  [NFC] polish colossalai/amp/__init__.py code style (#3272)          2023-03-29 15:22:21 +08:00
amp_type.py  Develop/experiments (#59)                                           2021-12-09 15:08:29 +08:00
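The listing shows three AMP backends (apex_amp, naive_amp, torch_amp) selected through the mode enum in amp_type.py. A minimal self-contained sketch of that dispatch pattern is below; the enum member names mirror the subdirectories above, but the `convert_to_amp` signature and registry are illustrative assumptions, not ColossalAI's exact API.

```python
# Sketch of an AMP-mode dispatch keyed on an enum, in the spirit of
# amp_type.py; the converter registry here is a hypothetical illustration.
from enum import Enum


class AMP_TYPE(Enum):
    # One member per backend subdirectory in this package.
    APEX = "apex"
    TORCH = "torch"
    NAIVE = "naive"


def convert_to_amp(model, optimizer, mode: AMP_TYPE):
    """Wrap model/optimizer for the chosen AMP backend (placeholder logic)."""
    converters = {
        AMP_TYPE.APEX: lambda m, o: (m, o),   # would call apex_amp here
        AMP_TYPE.TORCH: lambda m, o: (m, o),  # would call torch_amp here
        AMP_TYPE.NAIVE: lambda m, o: (m, o),  # would call naive_amp here
    }
    if mode not in converters:
        raise ValueError(f"unsupported AMP mode: {mode}")
    return converters[mode](model, optimizer)
```

The enum keeps the public entry point backend-agnostic: callers pass `AMP_TYPE.TORCH` (or another member) and the package routes to the matching subdirectory's implementation.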