ColossalAI/tests/test_zero/test_gemini

Latest commit: Baizhou Zhang — [gemini] support gradient accumulation (#4869), 1 year ago
File                            Last commit message                                                           Age
test_chunk_mgrv2.py             [misc] update pre-commit and run all files (#4752)                            1 year ago
test_chunkv2.py                 [misc] update pre-commit and run all files (#4752)                            1 year ago
test_fwd_bwd.py                 [gemini] support amp o3 for gemini (#4872)                                    1 year ago
test_gemini_use_rmt.py          [misc] update pre-commit and run all files (#4752)                            1 year ago
test_grad_accum.py              [gemini] support gradient accumulation (#4869)                                1 year ago
test_grad_clip.py               [kernel] support pure fp16 for cpu adam and update gemini optim tests (#4921) 1 year ago
test_inference.py               [misc] update pre-commit and run all files (#4752)                            1 year ago
test_optim.py                   [kernel] support pure fp16 for cpu adam and update gemini optim tests (#4921) 1 year ago
test_runtime_mem_tracer.py      [misc] update pre-commit and run all files (#4752)                            1 year ago
test_search.py                  [misc] update pre-commit and run all files (#4752)                            1 year ago
test_zeroddp_state_dict.py      [gemini] support amp o3 for gemini (#4872)                                    1 year ago
test_zerooptim_state_dict.py    [hotfix] fix lr scheduler bug in torch 2.0 (#4864)                            1 year ago