ColossalAI/tests
Latest commit: 63ee6fffe6 by ver217, "Merge branch 'main' into exp/mixtral" (11 months ago)
Name                    Last commit (age)
kit                     [gemini] hotfix NaN loss while using Gemini + tensor_parallel (#5150) (12 months ago)
test_analyzer
test_auto_parallel
test_autochunk
test_booster            [plugin]fix 3d checkpoint load when booster boost without optimizer. (#5135) (1 year ago)
test_checkpoint_io      [gemini] gemini support extra-dp (#5043) (1 year ago)
test_cluster
test_config
test_device
test_fx
test_gptq
test_infer              [Hotfix] Fix model policy matching strategy in ShardFormer (#5064) (1 year ago)
test_infer_ops/triton   [inference] Refactor inference architecture (#5057) (1 year ago)
test_lazy
test_legacy             [npu] add npu support for gemini and zero (#5067) (1 year ago)
test_moe                fix optim (11 months ago)
test_optimizer          [test] merge old components to test to model zoo (#4945) (1 year ago)
test_pipeline           [pipeline] A more general _communicate in p2p (#5062) (11 months ago)
test_shardformer        [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) (11 months ago)
test_smoothquant        [inference] Add smmoothquant for llama (#4904) (1 year ago)
test_tensor
test_utils
test_zero               [npu] add npu support for gemini and zero (#5067) (1 year ago)
__init__.py