Mirror of https://github.com/hpcaitech/ColossalAI
Recent changes:

* add test
* fix no_sync bug in low level zero plugin
* fix test
* add argument for grad accum
* add grad accum in backward hook for gemini
* finish implementation, rewrite tests
* fix test
* skip stuck model in low level zero test
* update doc
* optimize communication & fix gradient checkpoint
* modify doc
* clean up code
* update cpu adam fp16 case
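The commits above mention gradient accumulation (stepping the optimizer only after several micro-batches). A minimal, framework-free sketch of that idea — the function name and structure here are hypothetical illustrations, not ColossalAI's actual API:

```python
# Sketch of gradient accumulation: gradients from several micro-batches
# are summed, and the "optimizer step" fires only every `accum_steps`
# micro-batches (with the accumulator zeroed afterwards).

def train_with_grad_accum(micro_batch_grads, accum_steps):
    """Return the list of accumulated gradients that would be applied."""
    applied = []
    accum = 0.0
    for i, grad in enumerate(micro_batch_grads, start=1):
        accum += grad                # accumulate instead of stepping
        if i % accum_steps == 0:
            applied.append(accum)    # optimizer.step() equivalent
            accum = 0.0              # zero_grad() equivalent
    return applied
```

In a real distributed setup (e.g. PyTorch DDP), the non-final micro-batches are typically wrapped in `no_sync()` so gradients are not all-reduced until the step that actually updates the parameters — which is the communication behavior the no_sync fix above concerns.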
Files:

* test_3d_plugin.py
* test_dp_plugin_base.py
* test_gemini_plugin.py
* test_low_level_zero_plugin.py
* test_torch_ddp_plugin.py
* test_torch_fsdp_plugin.py