ColossalAI/colossalai/booster

Latest commit: 7c8be77081 by Bin Jia (2023-08-18 11:21:53 +08:00)
[shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460)
* support gpt2 seq parallel with pp/dp/tp
* fix a bug when waiting for stream done
* delete unused gpt2_seq file
mixed_precision/   [NFC] Fix format for mixed precision (#4253)                                     2023-07-26 14:12:57 +08:00
plugin/            [shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460)  2023-08-18 11:21:53 +08:00
__init__.py        [booster] implemented the torch ddd + resnet example (#3232)                     2023-03-27 10:24:14 +08:00
accelerator.py     [booster] added the accelerator implementation (#3159)                           2023-03-20 13:59:24 +08:00
booster.py         [misc] resolve code factor issues (#4433)                                        2023-08-15 23:25:14 +08:00