mirror of https://github.com/hpcaitech/ColossalAI
Latest commit:
* support gpt2 seq parallel with pp/dp/tp
* fix a bug when waiting for stream done
* delete unused gpt2_seq file
test_layer
test_model
__init__.py
test_shard_utils.py
test_with_torch_ddp.py