ColossalAI/tests
Latest commit: 63ee6fffe6 by ver217, "Merge branch 'main' into exp/mixtral" (11 months ago)
| Name | Last commit | Age |
| --- | --- | --- |
| kit | [gemini] hotfix NaN loss while using Gemini + tensor_parallel (#5150) | 12 months ago |
| test_analyzer | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_auto_parallel | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_autochunk | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_booster | [plugin]fix 3d checkpoint load when booster boost without optimizer. (#5135) | 1 year ago |
| test_checkpoint_io | [gemini] gemini support extra-dp (#5043) | 1 year ago |
| test_cluster | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_config | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_device | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_fx | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_gptq | [feature] add gptq for inference (#4754) | 1 year ago |
| test_infer | [Hotfix] Fix model policy matching strategy in ShardFormer (#5064) | 1 year ago |
| test_infer_ops/triton | [inference] Refactor inference architecture (#5057) | 1 year ago |
| test_lazy | [lazy] support from_pretrained (#4801) | 1 year ago |
| test_legacy | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| test_moe | fix optim | 11 months ago |
| test_optimizer | [test] merge old components to test to model zoo (#4945) | 1 year ago |
| test_pipeline | [pipeline] A more general _communicate in p2p (#5062) | 11 months ago |
| test_shardformer | [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) | 11 months ago |
| test_smoothquant | [inference] Add smmoothquant for llama (#4904) | 1 year ago |
| test_tensor | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_utils | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| test_zero | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| __init__.py | [zero] Update sharded model v2 using sharded param v2 (#323) | 3 years ago |