mirror of https://github.com/hpcaitech/ColossalAI
Latest commit:

* test: add more p2p tests
* fix: remove send_forward_recv_forward as p2p op list need to use the same group
* fix: make send and receive atomic
* feat: update P2PComm fn
* feat: add metadata cache in 1f1b
* feat: add metadata cache in interleaved pp
* feat: modify is_xx_stage fn
* revert: add _broadcast_object_list
* feat: add interleaved pp in llama policy
* feat: set NCCL_BUFFSIZE in HybridParallelPlugin
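The "metadata cache" items in the commit log refer to a common pipeline-parallel optimization: before a stage can receive an activation tensor over p2p, it must learn the tensor's shape and dtype to allocate a receive buffer, and since every microbatch in a 1F1B or interleaved schedule usually shares the same shapes, that metadata exchange only needs to happen once. The sketch below is a minimal, hypothetical illustration of the caching idea only (class and method names are invented, not ColossalAI's actual API):

```python
# Minimal sketch (assumed, not ColossalAI's implementation) of caching
# p2p tensor metadata so repeated microbatches skip the metadata exchange.

class P2PMetadataCache:
    """Caches the tensor metadata (name, shape, dtype) seen on the first exchange."""

    def __init__(self):
        self._metadata = None

    def match(self, metadata):
        """Return True if this metadata is already cached (exchange can be skipped)."""
        return self._metadata == metadata

    def update(self, metadata):
        self._metadata = metadata


# Usage: only the first microbatch pays for the metadata exchange.
cache = P2PMetadataCache()
sends_needed = 0
for _ in range(4):  # four microbatches with identical activation shapes
    meta = (("hidden_states", (2, 1024, 768), "float16"),)
    if not cache.match(meta):
        sends_needed += 1  # a real implementation would send metadata over p2p here
        cache.update(meta)
print(sends_needed)
```

With identical shapes across all four microbatches, only the first iteration triggers a metadata send; the cache short-circuits the rest.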
Files in this directory:

* __init__.py
* dp_plugin_base.py
* gemini_plugin.py
* hybrid_parallel_plugin.py
* low_level_zero_plugin.py
* moe_hybrid_parallel_plugin.py
* plugin_base.py
* pp_plugin_base.py
* torch_ddp_plugin.py
* torch_fsdp_plugin.py