ColossalAI/tests
Latest commit: 126ba573a8 by Jiarui Fang — [Tensor] add layer norm Op (#852), 2022-04-25 11:49:20 +08:00
Name                                 Last commit                                                         Date
components_to_test                   [ci] cache cuda extension (#860)                                    2022-04-25 10:03:47 +08:00
test_amp                             [test] refactored with the new rerun decorator (#763)               2022-04-15 00:33:04 +08:00
test_comm                            [test] refactored with the new rerun decorator (#763)               2022-04-15 00:33:04 +08:00
test_config                          [test] removed trivial outdated test                                2022-04-12 11:08:15 +08:00
test_context                         [test] refactored with the new rerun decorator (#763)               2022-04-15 00:33:04 +08:00
test_data                            [unittest] refactored unit tests for change in dependency (#838)    2022-04-22 15:39:07 +08:00
test_data_pipeline_tensor_parallel   [refactor] moving grad acc logic to engine (#804)                   2022-04-19 14:03:21 +08:00
test_engine                          [refactor] moving grad acc logic to engine (#804)                   2022-04-19 14:03:21 +08:00
test_gemini                          [gemini] add GeminiMemoryManger (#832)                              2022-04-24 13:08:48 +08:00
test_layers                          [test] refactored with the new rerun decorator (#763)               2022-04-15 00:33:04 +08:00
test_moe                             [test] refactored with the new rerun decorator (#763)               2022-04-15 00:33:04 +08:00
test_optimizer                       [zero] added hybrid adam, removed loss scale in adam (#527)         2022-03-25 18:03:54 +08:00
test_tensor                          [Tensor] add layer norm Op (#852)                                   2022-04-25 11:49:20 +08:00
test_trainer                         [test] refactored with the new rerun decorator (#763)               2022-04-15 00:33:04 +08:00
test_utils                           [gemini] add GeminiMemoryManger (#832)                              2022-04-24 13:08:48 +08:00
test_zero                            [gemini] add GeminiMemoryManger (#832)                              2022-04-24 13:08:48 +08:00
__init__.py                          [zero] Update sharded model v2 using sharded param v2 (#323)        2022-03-11 15:50:28 +08:00