ColossalAI/tests

Latest commit 8106d7b8c7 by ver217, 2022-06-21 16:35:23 +08:00:
[ddp] refactor ColoDDP and ZeroDDP (#1146)

* ColoDDP supports overwriting the default process group
* rename ColoDDPV2 to ZeroDDP
* add a docstring for ZeroDDP
* polish the docstring
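
To make the first change above concrete, here is a minimal sketch of wrapping a model with ColoDDP while passing an explicit process group instead of relying on the default one. The `colossalai.nn.parallel.ColoDDP` import path and the `process_group` keyword are assumptions inferred from the commit message and are not verified against revision 8106d7b8c7.

```python
# Hedged sketch: wrap a model with ColoDDP using a custom process group.
# Assumptions (not confirmed for this revision): ColoDDP is importable from
# colossalai.nn.parallel and accepts a `process_group` keyword argument, as the
# commit message "ColoDDP supports overwriting default process group" suggests.
import torch
import torch.distributed as dist
from colossalai.nn.parallel import ColoDDP


def wrap_with_custom_group(model: torch.nn.Module) -> ColoDDP:
    # Build a process group spanning all ranks; any subset of ranks would work the same way.
    pg = dist.new_group(ranks=list(range(dist.get_world_size())))
    # Overwrite the default (global) process group used for gradient reduction.
    return ColoDDP(model, process_group=pg)
```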
Name | Last commit message | Last commit date
components_to_test | [test] ignore 8 gpu test (#1080) | 2022-06-08 23:14:18 +08:00
test_amp | [test] refactored with the new rerun decorator (#763) | 2022-04-15 00:33:04 +08:00
test_comm | [p2p]add object list send/recv (#1024) | 2022-05-26 14:28:46 +08:00
test_config | [pipeline] refactor the pipeline module (#1087) | 2022-06-10 11:27:38 +08:00
test_context | [test] refactored with the new rerun decorator (#763) | 2022-04-15 00:33:04 +08:00
test_data | [unittest] refactored unit tests for change in dependency (#838) | 2022-04-22 15:39:07 +08:00
test_data_pipeline_tensor_parallel | [test] fixed hybrid parallel test case on 8 GPUs (#1106) | 2022-06-14 10:30:54 +08:00
test_ddp | [ddp] refactor ColoDDP and ZeroDDP (#1146) | 2022-06-21 16:35:23 +08:00
test_engine | [refactor] moving grad acc logic to engine (#804) | 2022-04-19 14:03:21 +08:00
test_fx | [fx]add autoparallel passes (#1121) | 2022-06-15 16:36:46 +08:00
test_gemini | [gemini] accelerate adjust_layout() (#878) | 2022-04-26 18:08:31 +08:00
test_layers | [test] skip tests when not enough GPUs are detected (#1090) | 2022-06-09 17:19:13 +08:00
test_moe | [test] refactored with the new rerun decorator (#763) | 2022-04-15 00:33:04 +08:00
test_optimizer | [zero] added hybrid adam, removed loss scale in adam (#527) | 2022-03-25 18:03:54 +08:00
test_pipeline | [pipeline] refactor the pipeline module (#1087) | 2022-06-10 11:27:38 +08:00
test_tensor | [ddp] refactor ColoDDP and ZeroDDP (#1146) | 2022-06-21 16:35:23 +08:00
test_trainer | [pipeline] refactor the pipeline module (#1087) | 2022-06-10 11:27:38 +08:00
test_utils | [ddp] add save/load state dict for ColoDDP (#1127) | 2022-06-20 10:51:47 +08:00
test_zero | [test] ignore 8 gpu test (#1080) | 2022-06-08 23:14:18 +08:00
__init__.py | [zero] Update sharded model v2 using sharded param v2 (#323) | 2022-03-11 15:50:28 +08:00