ColossalAI/tests/test_shardformer/test_model
Latest commit: [misc] resolve code factor issues (#4433) by Hongxin Liu, 172f7fa3cf, 2023-08-15 23:25:14 +08:00
Name | Last commit | Last commit date
__init__.py | [shardformer] adapted T5 and LLaMa test to use kit (#4049) | 2023-07-04 16:05:01 +08:00
_utils.py | [shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395) | 2023-08-15 23:25:14 +08:00
test_shard_bert.py | [shardformer] update tests for all optimization (#4413) | 2023-08-15 23:25:14 +08:00
test_shard_blip2.py | [Shardformer] Merge flash attention branch to pipeline branch (#4362) | 2023-08-15 23:25:14 +08:00
test_shard_bloom.py | [misc] resolve code factor issues (#4433) | 2023-08-15 23:25:14 +08:00
test_shard_chatglm.py | [misc] resolve code factor issues (#4433) | 2023-08-15 23:25:14 +08:00
test_shard_gpt2.py | [misc] resolve code factor issues (#4433) | 2023-08-15 23:25:14 +08:00
test_shard_llama.py | [misc] resolve code factor issues (#4433) | 2023-08-15 23:25:14 +08:00
test_shard_opt.py | [misc] resolve code factor issues (#4433) | 2023-08-15 23:25:14 +08:00
test_shard_sam.py | [Shardformer] Merge flash attention branch to pipeline branch (#4362) | 2023-08-15 23:25:14 +08:00
test_shard_t5.py | [misc] resolve code factor issues (#4433) | 2023-08-15 23:25:14 +08:00
test_shard_vit.py | [misc] resolve code factor issues (#4433) | 2023-08-15 23:25:14 +08:00
test_shard_whisper.py | [Shardformer] Merge flash attention branch to pipeline branch (#4362) | 2023-08-15 23:25:14 +08:00
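Each file named test_shard_*.py above is a standard pytest module, so a single model's sharding tests can be run in isolation. Below is a minimal sketch of such an invocation; the chosen file, the flags, and the assumption that it runs from the repository root on a machine with enough GPUs for the distributed fixtures are illustrative, not prescribed by the listing.

```python
# Minimal sketch: run one shardformer model test module programmatically.
# The target path and pytest flags are illustrative assumptions; the tests
# spawn their own distributed workers, so a multi-GPU machine is expected.
import sys

import pytest

if __name__ == "__main__":
    exit_code = pytest.main([
        "-v",  # verbose: show each collected test case
        "-s",  # do not capture stdout from the spawned workers
        "tests/test_shardformer/test_model/test_shard_llama.py",
    ])
    sys.exit(exit_code)
```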