mirror of https://github.com/hpcaitech/ColossalAI
Latest commit:

* add test
* fix no_sync bug in low level zero plugin
* fix test
* add argument for grad accum
* add grad accum in backward hook for gemini (see the sketch below)
* finish implementation, rewrite tests
* fix test
* skip stuck model in low level zero test
* update doc
* optimize communication & fix gradient checkpoint
* modify doc
* clean up code
* update cpu adam fp16 case
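The commit item "add grad accum in backward hook for gemini" refers to accumulating gradients as they are produced during the backward pass rather than only at optimizer-step time. The snippet below is a minimal, hypothetical sketch of that idea in plain PyTorch; it is not ColossalAI's Gemini implementation, and the helper `attach_accum_hooks` and its side buffers are illustrative names introduced here.

```python
# Hypothetical sketch: gradient accumulation via per-parameter backward hooks.
# Not ColossalAI/Gemini code; `attach_accum_hooks` and `grad_buffers` are made up.
import torch
import torch.nn as nn


def attach_accum_hooks(model: nn.Module):
    """Register a hook on each parameter that adds every micro-batch gradient
    into a side buffer, so a single optimizer step can be taken after
    several backward passes."""
    buffers = {}
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        buffers[name] = torch.zeros_like(param)

        def make_hook(buf):
            def hook(grad):
                buf.add_(grad)                 # accumulate this micro-batch's gradient
                return torch.zeros_like(grad)  # keep param.grad itself at zero
            return hook

        param.register_hook(make_hook(buffers[name]))
    return buffers


model = nn.Linear(8, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
grad_buffers = attach_accum_hooks(model)

accum_steps = 4
for _ in range(accum_steps):
    x = torch.randn(2, 8)
    # scale the loss so the accumulated gradient approximates a large-batch average
    loss = model(x).sum() / accum_steps
    loss.backward()

# copy the accumulated gradients back and take one optimizer step
for name, param in model.named_parameters():
    if name in grad_buffers:
        param.grad = grad_buffers[name].clone()
        grad_buffers[name].zero_()
optimizer.step()
optimizer.zero_grad()
```

In a ZeRO/Gemini-style setup the hook is also where gradients can be reduced or offloaded as soon as they are computed, which is why accumulation is done there instead of after the full backward pass.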
Test files in this directory:

* test_chunk_mgrv2.py
* test_chunkv2.py
* test_fwd_bwd.py
* test_gemini_use_rmt.py
* test_grad_accum.py
* test_grad_clip.py
* test_inference.py
* test_optim.py
* test_runtime_mem_tracer.py
* test_search.py
* test_zeroddp_state_dict.py
* test_zerooptim_state_dict.py