mirror of https://github.com/hpcaitech/ColossalAI
Latest commit:

* test: add more p2p tests
* fix: remove send_forward_recv_forward, as a p2p op list needs to use the same group
* fix: make send and receive atomic
* feat: update P2PComm fn
* feat: add metadata cache in 1f1b
* feat: add metadata cache in interleaved pp
* feat: modify is_xx_stage fn
* revert: add _broadcast_object_list
* feat: add interleaved pp in llama policy
* feat: set NCCL_BUFFSIZE in HybridParallelPlugin
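The "same group" constraint above comes from `torch.distributed.batch_isend_irecv`, which rejects a list of `P2POp`s that mix process groups. Below is a minimal sketch, not ColossalAI's actual p2p communication code: the function name `fused_send_recv` and the peer/tensor arguments are illustrative, but the batching pattern is the standard PyTorch API.

```python
import torch
import torch.distributed as dist

def fused_send_recv(send_tensor: torch.Tensor,
                    recv_tensor: torch.Tensor,
                    send_peer: int,
                    recv_peer: int,
                    group: dist.ProcessGroup):
    """Batch a send and a receive into one atomic p2p call.

    All P2POps passed to batch_isend_irecv must use the same
    process group, which is why a fused op that spans two
    different groups has to be split instead.
    """
    ops = [
        dist.P2POp(dist.isend, send_tensor, send_peer, group=group),
        dist.P2POp(dist.irecv, recv_tensor, recv_peer, group=group),
    ]
    # Issued as one batch: the underlying NCCL calls are grouped,
    # so the send and receive complete together rather than
    # interleaving with other p2p traffic.
    reqs = dist.batch_isend_irecv(ops)
    for req in reqs:
        req.wait()
```

`NCCL_BUFFSIZE` in the last item is a standard NCCL environment variable controlling the size of NCCL's internal communication buffers; it is read when communicators are created, so the plugin has to set it before the process groups are initialized.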
Contents:

* test_hybrid_parallel_grad_clip_norm
* test_layer
* test_model
* __init__.py
* test_shard_utils.py
* test_with_torch_ddp.py