ColossalAI/tests/test_shardformer
Latest commit: 7c8be77081 by Bin Jia, 2023-08-18 11:21:53 +08:00
[shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460)
* support gpt2 seq parallel with pp/dp/tp
* fix a bug when waiting for a stream to finish
* delete the unused gpt2_seq file
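
For context on what the headline commit enables: in ColossalAI of this period, stacking sequence parallelism on top of tensor, pipeline, and data parallelism is configured through the HybridParallelPlugin. The sketch below is illustrative only, not this directory's test code; the enable_sequence_parallelism flag and the other parameter names are assumed from the plugin's public API in ColossalAI releases of this era.

```python
# Minimal sketch, NOT the repo's test code: combining GPT-2 sequence
# parallelism with tensor/pipeline/data parallelism via ColossalAI's
# HybridParallelPlugin. Parameter names (tp_size, pp_size,
# enable_sequence_parallelism, ...) are assumed from the plugin's
# public API in ColossalAI releases of this period.
#
# Launch with a distributed launcher, e.g.:
#   colossalai run --nproc_per_node 8 sketch.py
# With tp_size=2 and pp_size=2, 8 ranks yield a data-parallel size of 2.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin
from transformers import GPT2Config, GPT2LMHeadModel

colossalai.launch_from_torch(config={})

plugin = HybridParallelPlugin(
    tp_size=2,                          # tensor parallel degree
    pp_size=2,                          # pipeline parallel degree
    num_microbatches=4,                 # pipeline schedule granularity
    enable_sequence_parallelism=True,   # the feature exercised by PR #4460
    precision="fp16",
)
booster = Booster(plugin=plugin)

model = GPT2LMHeadModel(GPT2Config())
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# criterion maps model outputs (and the input batch) to a scalar loss;
# GPT2LMHeadModel computes the LM loss itself when labels are supplied.
def criterion(outputs, inputs):
    return outputs.loss

# boost() wraps model/optimizer for the chosen parallel layout; with
# pp_size > 1, training steps then go through booster.execute_pipeline.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion=criterion)
```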
Name                    Last commit message                                                               Last commit date
test_layer/             [shardformer/sequence parallel] Cherry pick commit to new branch (#4450)         2023-08-16 15:41:20 +08:00
test_model/             [shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460)  2023-08-18 11:21:53 +08:00
__init__.py             [shardformer] adapted T5 and LLaMa test to use kit (#4049)                       2023-07-04 16:05:01 +08:00
test_shard_utils.py     [test] add shard util tests                                                      2023-08-15 23:25:14 +08:00
test_with_torch_ddp.py  [shardformer] support lazy init (#4202)                                          2023-08-15 23:25:14 +08:00