ColossalAI/tests/test_shardformer/test_model

Latest commit: f2e8b9ef9f by Hongxin Liu, 2024-03-13 15:24:13 +08:00
[devops] fix compatibility (#5444)

* [devops] fix compatibility
* [hotfix] update compatibility test on pr
* [devops] fix compatibility
* [devops] record duration during comp test
* [test] decrease test duration
* fix falcon
File | Last commit | Date
__init__.py | [shardformer] adapted T5 and LLaMa test to use kit (#4049) | 2023-07-04 16:05:01 +08:00
_utils.py | [hotfix] Fix ShardFormer test execution path when using sequence parallelism (#5230) | 2024-01-17 17:42:29 +08:00
test_shard_bert.py | [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) | 2023-12-22 10:44:00 +08:00
test_shard_blip2.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
test_shard_bloom.py | [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) | 2023-11-03 13:32:43 +08:00
test_shard_chatglm2.py | [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) | 2023-11-03 13:32:43 +08:00
test_shard_falcon.py | [devops] fix compatibility (#5444) | 2024-03-13 15:24:13 +08:00
test_shard_gpt2.py | [ci] fix shardformer tests. (#5255) | 2024-01-11 19:07:45 +08:00
test_shard_gptj.py | [shardformer] llama support DistCrossEntropy (#5176) | 2023-12-13 01:39:14 +08:00
test_shard_llama.py | [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) | 2023-12-22 10:44:00 +08:00
test_shard_mistral.py | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 2023-11-28 16:54:42 +08:00
test_shard_opt.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
test_shard_sam.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
test_shard_t5.py | [ci] fix shardformer tests. (#5255) | 2024-01-11 19:07:45 +08:00
test_shard_vit.py | [hotfix] fix torch 2.0 compatibility (#4936) | 2023-10-18 11:05:25 +08:00
test_shard_whisper.py | [ci] fix shardformer tests. (#5255) | 2024-01-11 19:07:45 +08:00