ColossalAI/tests

Latest commit: [shardformer] Sequence Parallelism Optimization (#5533) by Zhongkai Zhao (8e412a548e), 8 months ago
Name                  Last commit                                                                                                   Age
kit                   [shardformer] Sequence Parallelism Optimization (#5533)                                                      8 months ago
test_analyzer
test_auto_parallel    [npu] change device to accelerator api (#5239)                                                               11 months ago
test_autochunk
test_booster          [shardformer] fix pipeline forward error if custom layer distribution is used (#5189)                        8 months ago
test_checkpoint_io    [shardformer] Sequence Parallelism Optimization (#5533)                                                      8 months ago
test_cluster          [shardformer] Sequence Parallelism Optimization (#5533)                                                      8 months ago
test_config
test_device
test_fx
test_gptq
test_infer            [Hotfix] Fix model policy matching strategy in ShardFormer (#5064)                                           1 year ago
test_lazy             [example]add gpt2 benchmark example script. (#5295)                                                          9 months ago
test_legacy           [npu] change device to accelerator api (#5239)                                                               11 months ago
test_moe              [hotfix] set return_outputs=False in examples and polish code (#5404)                                        8 months ago
test_optimizer        [shardformer]Fix lm parallel. (#5480)                                                                        8 months ago
test_pipeline         [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508)   8 months ago
test_shardformer      [shardformer] Sequence Parallelism Optimization (#5533)                                                      8 months ago
test_smoothquant
test_tensor           fixed layout converter caching and updated tester                                                            8 months ago
test_zero             [npu] change device to accelerator api (#5239)                                                               11 months ago
__init__.py