ColossalAI/tests
Latest commit: de46450461 by LuGY, "Added activation offload (#331)", 3 years ago
Directory/File                          Latest commit message (age)
components_to_test                      fix bert unit test (3 years ago)
test_comm                               Hotfix/Colossalai layers (#92) (3 years ago)
test_config                             [profiler] primary memory tracer (3 years ago)
test_context                            Optimize pipeline schedule (#94) (3 years ago)
test_data                               added CI for unit testing (#69) (3 years ago)
test_data_pipeline_tensor_parallel      Optimize pipeline schedule (#94) (3 years ago)
test_engine                             skip bert in test engine (3 years ago)
test_layers                             fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial (3 years ago)
test_moe                                Added TPExpert for special situation (3 years ago)
test_optimizer                          [zero] cpu adam kernel (#288) (3 years ago)
test_trainer                            pipeline last stage supports multi output (#151) (3 years ago)
test_utils                              Added activation offload (#331) (3 years ago)
test_zero_data_parallel                 [zero] zero init context collect numel of model (#375) (3 years ago)
test_zero_tensor_parallel               Feature/zero (#279) (3 years ago)
__init__.py                             [zero] Update sharded model v2 using sharded param v2 (#323) (3 years ago)