ColossalAI/tests

Latest commit: YuliangLiu0306 `2c4c7b3618` [autoparallel] add getattr handler (#1767), 2 years ago
| Name | Latest commit | Age |
|------|---------------|-----|
| components_to_test | [NFC] polish test component gpt code style (#1567) | 2 years ago |
| test_amp | | |
| test_auto_parallel | [autoparallel] add getattr handler (#1767) | 2 years ago |
| test_comm | | |
| test_config | | |
| test_context | | |
| test_data | | |
| test_data_pipeline_tensor_parallel | | |
| test_ddp | [zero] add chunk init function for users (#1729) | 2 years ago |
| test_device | [tensor] support runtime ShardingSpec apply (#1453) | 2 years ago |
| test_engine | | |
| test_fx | skip torchrec unittests if not installed (#1790) | 2 years ago |
| test_gemini | [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786) | 2 years ago |
| test_layers | updated tp layers | 2 years ago |
| test_moe | [moe] initialize MoE groups by ProcessGroup (#1640) | 2 years ago |
| test_ops | | |
| test_optimizer | | |
| test_pipeline | [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710) | 2 years ago |
| test_tensor | [autoparallel] shard param and buffer as expected (#1753) | 2 years ago |
| test_trainer | | |
| test_utils | [feat] add flash attention (#1762) | 2 years ago |
| test_zero | [feature] A new ZeRO implementation (#1644) | 2 years ago |
| __init__.py | | |