ColossalAI/tests

Latest commit: Jiarui Fang, e17e92c54d, "Polish sharded parameter (#297)", 3 years ago
Directory                              Last commit                                                       Updated
test_comm                              Hotfix/Colossalai layers (#92)                                    3 years ago
test_config                            add pytorch hooks (#179)                                          3 years ago
test_context                           Optimize pipeline schedule (#94)                                  3 years ago
test_data                              added CI for unit testing (#69)                                   3 years ago
test_data_pipeline_tensor_parallel     Optimize pipeline schedule (#94)                                  3 years ago
test_engine/test_engine                add a common util for hooks registered on parameter. (#292)      3 years ago
test_layers                            fixed padding index issue for vocab parallel embedding layers;   3 years ago
                                       updated 3D linear to be compatible with examples in the tutorial
test_moe                               Added TPExpert for special situation                              3 years ago
test_trainer                           pipeline last stage supports multi output (#151)                  3 years ago
test_utils                             Feature/zero (#279)                                               3 years ago
test_zero_data_parallel                Polish sharded parameter (#297)                                   3 years ago
test_zero_tensor_parallel              Feature/zero (#279)                                               3 years ago