ColossalAI/tests
Latest commit: 1f894e033f by ver217 (2022-06-10 14:48:28 +08:00)
[gemini] zero supports gemini (#1093)

* add placement policy
* add gemini mgr
* update mem stats collector
* update zero
* update zero optim
* fix bugs
* zero optim monitor os
* polish unit test
* polish unit test
* add assert
Name                                  Last commit                                                        Date
components_to_test                    [test] ignore 8 gpu test (#1080)                                   2022-06-08 23:14:18 +08:00
test_amp                              [test] refactored with the new rerun decorator (#763)              2022-04-15 00:33:04 +08:00
test_comm                             [p2p] add object list send/recv (#1024)                            2022-05-26 14:28:46 +08:00
test_config                           [pipeline] refactor the pipeline module (#1087)                    2022-06-10 11:27:38 +08:00
test_context                          [test] refactored with the new rerun decorator (#763)              2022-04-15 00:33:04 +08:00
test_data                             [unittest] refactored unit tests for change in dependency (#838)   2022-04-22 15:39:07 +08:00
test_data_pipeline_tensor_parallel    [pipeline] refactor the pipeline module (#1087)                    2022-06-10 11:27:38 +08:00
test_engine                           [refactor] moving grad acc logic to engine (#804)                  2022-04-19 14:03:21 +08:00
test_gemini                           [gemini] accelerate adjust_layout() (#878)                         2022-04-26 18:08:31 +08:00
test_layers                           [test] skip tests when not enough GPUs are detected (#1090)        2022-06-09 17:19:13 +08:00
test_moe                              [test] refactored with the new rerun decorator (#763)              2022-04-15 00:33:04 +08:00
test_optimizer                        [zero] added hybrid adam, removed loss scale in adam (#527)        2022-03-25 18:03:54 +08:00
test_pipeline                         [pipeline] refactor the pipeline module (#1087)                    2022-06-10 11:27:38 +08:00
test_tensor                           [gemini] zero supports gemini (#1093)                              2022-06-10 14:48:28 +08:00
test_trainer                          [pipeline] refactor the pipeline module (#1087)                    2022-06-10 11:27:38 +08:00
test_utils                            [pipeline] refactor the pipeline module (#1087)                    2022-06-10 11:27:38 +08:00
test_zero                             [test] ignore 8 gpu test (#1080)                                   2022-06-08 23:14:18 +08:00
__init__.py                           [zero] Update sharded model v2 using sharded param v2 (#323)       2022-03-11 15:50:28 +08:00