Directory listing: ColossalAI/colossalai/booster/plugin
Latest commit: 1b387ca9fe — [shardformer] refactor pipeline grad ckpt config (#5646) by Hongxin Liu, 2024-04-25 15:19:30 +08:00

Commit body:
* [shardformer] refactor pipeline grad ckpt config
* [pipeline] fix stage manager
File                          | Last commit                                              | Date
__init__.py                   | [misc] update pre-commit and run all files (#4752)       | 2023-09-19 14:20:26 +08:00
dp_plugin_base.py             | [llama] fix dataloader for hybrid parallel (#5358)       | 2024-02-05 15:14:56 +08:00
gemini_plugin.py              | [exampe] update llama example (#5626)                    | 2024-04-23 14:12:20 +08:00
hybrid_parallel_plugin.py     | [shardformer] refactor pipeline grad ckpt config (#5646) | 2024-04-25 15:19:30 +08:00
low_level_zero_plugin.py      | [npu] change device to accelerator api (#5239)           | 2024-01-09 10:20:05 +08:00
moe_hybrid_parallel_plugin.py | [shardformer] Sequence Parallelism Optimization (#5533)  | 2024-04-03 17:15:47 +08:00
plugin_base.py                | [misc] update pre-commit and run all files (#4752)       | 2023-09-19 14:20:26 +08:00
pp_plugin_base.py             | [misc] update pre-commit and run all files (#4752)       | 2023-09-19 14:20:26 +08:00
torch_ddp_plugin.py           | [doc] polish shardformer doc (#4779)                     | 2023-09-26 10:57:47 +08:00
torch_fsdp_plugin.py          | [devops] remove post commit ci (#5566)                   | 2024-04-08 15:09:40 +08:00