ColossalAI/tests

Latest commit: 7711bd524a by Baizhou Zhang, 2023-08-15 23:25:14 +08:00
[shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395)

* rewrite opt tests
* rewrite llama tests
* rewrite bloom & vit tests
* rewrite chatglm tests
* fix LinearCol for classifiers
* add checks for other TP layers, fix lazy init in util
Name | Last commit | Date
components_to_test | |
kit | [shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395) | 2023-08-15 23:25:14 +08:00
test_amp | |
test_analyzer | |
test_auto_parallel | [gemini] fix argument naming during chunk configuration searching | 2023-06-25 13:34:15 +08:00
test_autochunk | |
test_booster | [Shardformer] Merge flash attention branch to pipeline branch (#4362) | 2023-08-15 23:25:14 +08:00
test_checkpoint_io | |
test_cluster | [cluster] add process group mesh (#4039) | 2023-08-15 23:25:14 +08:00
test_comm | [test] refactor tests with spawn (#3452) | 2023-04-06 14:51:35 +08:00
test_config | |
test_context | |
test_data | [test] refactor tests with spawn (#3452) | 2023-04-06 14:51:35 +08:00
test_data_pipeline_tensor_parallel | |
test_ddp | |
test_device | |
test_engine | |
test_fx | [bugs] hot fix some testing bugs for new models (#4268) | 2023-08-15 23:25:14 +08:00
test_kernels | |
test_layers | |
test_lazy | [test] skip some not compatible models | 2023-08-15 23:25:14 +08:00
test_moe | |
test_ops | [test] refactor tests with spawn (#3452) | 2023-04-06 14:51:35 +08:00
test_optimizer | |
test_pipeline | [pipeline] add chatglm (#4363) | 2023-08-15 23:25:14 +08:00
test_shardformer | [shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395) | 2023-08-15 23:25:14 +08:00
test_tensor | |
test_trainer | |
test_utils | [shardformer] update shardformer to use flash attention 2 (#4392) | 2023-08-15 23:25:14 +08:00
test_zero | |
__init__.py | |
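Several entries above point to the spawn-based test refactor ("[test] refactor tests with spawn (#3452)"). As a rough illustration only, the sketch below shows the shape such a test typically takes; the helpers used here (colossalai.launch, colossalai.testing.spawn, rerun_if_address_is_in_use) and their signatures reflect the mid-2023 ColossalAI testing utilities as I understand them and are assumptions, not taken from this listing.

import torch.distributed as dist

import colossalai
from colossalai.testing import rerun_if_address_is_in_use, spawn


def check_something(rank, world_size, port):
    # Each spawned worker initializes the distributed environment on its own rank.
    # NCCL assumes GPUs are available; this is only an illustrative sketch.
    colossalai.launch(config={}, rank=rank, world_size=world_size,
                      host="localhost", port=port, backend="nccl")
    assert dist.get_world_size() == world_size
    # ... per-rank assertions for the feature under test would go here ...


@rerun_if_address_is_in_use()
def test_something():
    # spawn() forks the requested number of workers and calls check_something
    # on each of them with its rank, the world size, and a free port.
    spawn(check_something, 2)


if __name__ == "__main__":
    test_something()

Tests written in this style can then be collected and run with pytest like any other test in these directories.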