ColossalAI/colossalai/booster/plugin
Latest commit 7c8be77081 by Bin Jia on 2023-08-18 11:21:53 +08:00
[shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460)
* support gpt2 seq parallel with pp/dp/tp
* fix a bug when waiting for stream done
* delete unused gpt2_seq file
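The commit above lands GPT-2 sequence parallelism in hybrid_parallel_plugin.py. Below is a minimal, hedged sketch of how that plugin is typically configured together with pipeline/tensor/data parallelism; the exact keyword arguments, in particular enable_sequence_parallelism, are assumptions inferred from the commit message and the Booster API of this era, not verified against this exact revision.

```python
import colossalai
from torch.optim import Adam
from transformers import GPT2Config, GPT2LMHeadModel
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# Initialize the distributed environment (torchrun-style launch);
# the world size must cover tp_size * pp_size (here at least 4 ranks).
colossalai.launch_from_torch(config={})

plugin = HybridParallelPlugin(
    tp_size=2,
    pp_size=2,
    enable_sequence_parallelism=True,  # assumed flag name for the gpt2 seq-parallel switch added in #4460
)
booster = Booster(plugin=plugin)

model = GPT2LMHeadModel(GPT2Config(n_layer=4))
optimizer = Adam(model.parameters(), lr=1e-4)

def criterion(outputs, inputs):
    # GPT2LMHeadModel returns the loss when labels are supplied.
    return outputs.loss

# boost() wraps the model/optimizer/criterion for the chosen parallel layout;
# with pp_size > 1, training typically goes through booster.execute_pipeline(...)
# rather than a plain forward/backward loop.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion=criterion)
```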
__init__.py [plugin] add 3d parallel plugin (#4295) 2023-08-15 23:25:14 +08:00
dp_plugin_base.py [booster] update prepare dataloader method for plugin (#3706) 2023-05-08 15:44:03 +08:00
gemini_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
hybrid_parallel_plugin.py [shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460) 2023-08-18 11:21:53 +08:00
low_level_zero_plugin.py [zero] support shard optimizer state dict of zero (#4194) 2023-07-31 22:13:29 +08:00
plugin_base.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
pp_plugin_base.py [plugin] add 3d parallel plugin (#4295) 2023-08-15 23:25:14 +08:00
torch_ddp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
torch_fsdp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
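Each file listed above implements one Booster plugin behind the common interface defined in plugin_base.py. The following is a minimal, hedged sketch of the shared usage pattern, assuming the standard Booster/GeminiPlugin API of this period; prepare_dataloader is assumed to come from the DP plugin base class touched in #3706, and any plugin from the listing could be swapped in for GeminiPlugin here.

```python
import colossalai
import torch
from torch.utils.data import TensorDataset
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin
from colossalai.nn.optimizer import HybridAdam

colossalai.launch_from_torch(config={})

# GeminiPlugin is used purely as an example of the shared plugin interface.
plugin = GeminiPlugin()
booster = Booster(plugin=plugin)

model = torch.nn.Linear(32, 2).cuda()
optimizer = HybridAdam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()

# DP-based plugins expose prepare_dataloader (see dp_plugin_base.py) so that
# every rank receives a correctly sharded sampler.
dataset = TensorDataset(torch.randn(128, 32), torch.randint(0, 2, (128,)))
dataloader = plugin.prepare_dataloader(dataset, batch_size=16, shuffle=True)

model, optimizer, criterion, dataloader, _ = booster.boost(
    model, optimizer, criterion, dataloader
)

for inputs, labels in dataloader:
    outputs = model(inputs.cuda())
    loss = criterion(outputs, labels.cuda())
    booster.backward(loss, optimizer)  # plugin-aware backward (e.g. ZeRO/Gemini)
    optimizer.step()
    optimizer.zero_grad()
```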