ColossalAI/colossalai/amp

Latest commit 0ceec8f9a9 by Baizhou Zhang: [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354)
* add naive optimizer for 3DPlugin/refactor gpt2 shardformer test

* merge tests of PP/DP/TP combinations into one test file

* fix bug when sync grad for dp in HybridPlugin

* update supported precisions for 3DPlugin/fix bug when shifting tp_degree

* improve the passing of lazy_init

* modify lazy_init/use sync_shared_params
Committed: 2023-08-15 23:25:14 +08:00
Name         Last commit                                                                                                   Date
apex_amp/    [test] fixed the triton version for testing (#2608)                                                           2023-02-07 13:49:38 +08:00
naive_amp/   [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354)       2023-08-15 23:25:14 +08:00
torch_amp/   [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756)                                                 2023-05-19 13:50:00 +08:00
__init__.py  [NFC] polish colossalai/amp/__init__.py code style (#3272)                                                    2023-03-29 15:22:21 +08:00
amp_type.py  Develop/experiments (#59)                                                                                     2021-12-09 15:08:29 +08:00