ColossalAI/tests/test_shardformer

Latest commit: Wenhao Chen, 1810b9100f, 11 months ago
[pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134)
Name                                  Last commit                                                                             Age
test_hybrid_parallel_grad_clip_norm   [gemini] gemini support extra-dp (#5043)                                                1 year ago
test_layer                            [shardformer] llama support DistCrossEntropy (#5176)                                    12 months ago
test_model                            [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134)   11 months ago
__init__.py                           [shardformer] adapted T5 and LLaMa test to use kit (#4049)                              1 year ago
test_shard_utils.py                   [misc] update pre-commit and run all files (#4752)                                      1 year ago
test_with_torch_ddp.py                [misc] update pre-commit and run all files (#4752)                                      1 year ago