mirror of https://github.com/hpcaitech/ColossalAI
Commit `21ba89cab6`:

- add test
- fix no_sync bug in low level zero plugin
- fix test
- add argument for grad accum
- add grad accum in backward hook for gemini
- finish implementation, rewrite tests
- fix test
- skip stuck model in low level zero test
- update doc
- optimize communication & fix gradient checkpoint
- modify doc
- clean up code
- update cpu adam fp16 case
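The commit above adds gradient accumulation support. As a rough illustration of the underlying idea only (this is not ColossalAI's actual API; the function and variable names below are hypothetical), per-micro-batch gradients are accumulated in a buffer and the optimizer "steps" once every `accum_steps` micro-batches, which is the boundary at which gradient synchronization would also occur:

```python
def train_with_grad_accum(micro_batch_grads, accum_steps):
    """Sketch of gradient accumulation over micro-batches.

    `micro_batch_grads` is a list of scalar gradients, one per micro-batch
    (a stand-in for real tensors). Returns the averaged gradient applied
    at each optimizer step.
    """
    applied = []
    buffer = 0.0
    for i, g in enumerate(micro_batch_grads, start=1):
        buffer += g                        # accumulate instead of stepping
        if i % accum_steps == 0:           # sync/step boundary
            applied.append(buffer / accum_steps)
            buffer = 0.0                   # reset for the next window
    return applied
```

In a distributed setting, skipping the all-reduce on non-boundary iterations (e.g. via a `no_sync`-style context) avoids redundant communication, which is the kind of optimization the commit message alludes to.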
Files:

- test_mixed_precision
- test_plugin
- test_accelerator.py