ColossalAI/tests

Latest commit: 65ee6dcc20 by Frank Lee, 2022-06-08 23:14:18 +08:00
[test] ignore 8 gpu test (#1080)

* [test] ignore 8 gpu test
* polish code
* polish workflow
| Name                                | Latest commit                                                     | Date                       |
|-------------------------------------|-------------------------------------------------------------------|----------------------------|
| components_to_test                  | [test] ignore 8 gpu test (#1080)                                  | 2022-06-08 23:14:18 +08:00 |
| test_amp                            | [test] refactored with the new rerun decorator (#763)             | 2022-04-15 00:33:04 +08:00 |
| test_comm                           | [p2p] add object list send/recv (#1024)                           | 2022-05-26 14:28:46 +08:00 |
| test_config                         | [test] removed trivial outdated test                              | 2022-04-12 11:08:15 +08:00 |
| test_context                        | [test] refactored with the new rerun decorator (#763)             | 2022-04-15 00:33:04 +08:00 |
| test_data                           | [unittest] refactored unit tests for change in dependency (#838)  | 2022-04-22 15:39:07 +08:00 |
| test_data_pipeline_tensor_parallel  | [test] ignore 8 gpu test (#1080)                                  | 2022-06-08 23:14:18 +08:00 |
| test_engine                         | [refactor] moving grad acc logic to engine (#804)                 | 2022-04-19 14:03:21 +08:00 |
| test_gemini                         | [gemini] accelerate adjust_layout() (#878)                        | 2022-04-26 18:08:31 +08:00 |
| test_layers                         | [test] ignore 8 gpu test (#1080)                                  | 2022-06-08 23:14:18 +08:00 |
| test_moe                            | [test] refactored with the new rerun decorator (#763)             | 2022-04-15 00:33:04 +08:00 |
| test_optimizer                      | [zero] added hybrid adam, removed loss scale in adam (#527)       | 2022-03-25 18:03:54 +08:00 |
| test_tensor                         | [test] ignore 8 gpu test (#1080)                                  | 2022-06-08 23:14:18 +08:00 |
| test_trainer                        | [pipeline] refactor ppschedule to support tensor list (#1050)     | 2022-06-02 13:48:59 +08:00 |
| test_utils                          | [test] ignore 8 gpu test (#1080)                                  | 2022-06-08 23:14:18 +08:00 |
| test_zero                           | [test] ignore 8 gpu test (#1080)                                  | 2022-06-08 23:14:18 +08:00 |
| __init__.py                         | [zero] Update sharded model v2 using sharded param v2 (#323)      | 2022-03-11 15:50:28 +08:00 |