mirror of https://github.com/hpcaitech/ColossalAI
0ceec8f9a9
* add naive optimizer for 3DPlugin / refactor GPT-2 shardformer test
* merge tests of PP/DP/TP combinations into one test file
* fix bug when syncing gradients for DP in HybridPlugin
* update supported precisions for 3DPlugin / fix bug when shifting tp_degree
* improve the passing of lazy_init
* modify lazy_init / use sync_shared_params
grad_scaler/
mixed_precision_mixin/
__init__.py
_fp16_optimizer.py
_utils.py
mixed_precision_optimizer.py
naive_amp.py
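
The files above make up a naive AMP (automatic mixed precision) package: a gradient scaler, a mixin for mixed-precision behavior, and fp16 optimizer wrappers. As a hedged illustration of the core idea only (the class name, parameters, and methods below are hypothetical, not ColossalAI's actual API), a dynamic loss scaler multiplies the loss by a scale factor before backward so small fp16 gradients do not underflow, unscales gradients before the optimizer step, skips the step and backs off the scale on overflow, and cautiously grows the scale after a run of clean steps:

```python
import math


class NaiveGradScaler:
    """Minimal dynamic loss-scaling sketch (hypothetical API, for illustration).

    Not ColossalAI's implementation; it shows only the standard scale /
    unscale / backoff-and-grow cycle used by fp16 training loops.
    """

    def __init__(self, init_scale=2.0**16, growth_factor=2.0,
                 backoff_factor=0.5, growth_interval=2000):
        self.scale = init_scale
        self.growth_factor = growth_factor      # multiply scale after clean run
        self.backoff_factor = backoff_factor    # multiply scale on overflow
        self.growth_interval = growth_interval  # clean steps before growing
        self._good_steps = 0

    def scale_loss(self, loss):
        # Scale the loss up so fp16 gradients stay representable.
        return loss * self.scale

    def unscale(self, grads):
        # Undo the scaling before the optimizer consumes the gradients.
        inv = 1.0 / self.scale
        return [g * inv for g in grads]

    def found_overflow(self, grads):
        # Inf/NaN gradients mean the current scale is too aggressive.
        return any(math.isinf(g) or math.isnan(g) for g in grads)

    def update(self, overflow):
        # Back off immediately on overflow; grow slowly when stable.
        if overflow:
            self.scale *= self.backoff_factor
            self._good_steps = 0
        else:
            self._good_steps += 1
            if self._good_steps >= self.growth_interval:
                self.scale *= self.growth_factor
                self._good_steps = 0
```

A training step would scale the loss, run backward, check `found_overflow` on the gradients, call `unscale` and step only if no overflow occurred, and finally call `update` to adjust the scale.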