ColossalAI/examples/language
Latest commit 4fa689fca1 by Wenhao Chen, 2023-12-22 10:44 +08:00:
[pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134)

* test: add more p2p tests
* fix: remove send_forward_recv_forward, as a p2p op list needs to use the same group
* fix: make send and receive atomic
* feat: update P2PComm fn
* feat: add metadata cache in 1f1b
* feat: add metadata cache in interleaved pp
* feat: modify is_xx_stage fn
* revert: add _broadcast_object_list
* feat: add interleaved pp in llama policy
* feat: set NCCL_BUFFSIZE in HybridParallelPlugin
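The last two items concern the user-facing plugin: interleaved pipeline scheduling for the llama policy and NCCL buffer sizing in HybridParallelPlugin. Below is a minimal sketch of how an example script might opt into interleaved pp. The argument names pp_style and num_model_chunks and the NCCL_BUFFSIZE value are assumptions not stated in this listing; verify them against the installed ColossalAI version.

```python
import os

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# NCCL_BUFFSIZE is a standard NCCL environment variable (in bytes). Per the
# commit above, HybridParallelPlugin may also set it internally; exporting it
# here is only a fallback. 128 MiB is an assumed value, not taken from the PR.
os.environ.setdefault("NCCL_BUFFSIZE", str(128 * 1024 * 1024))

# Initialize torch.distributed from torchrun-provided environment variables.
# Older ColossalAI releases expect the (possibly empty) config dict.
colossalai.launch_from_torch(config={})

plugin = HybridParallelPlugin(
    tp_size=2,
    pp_size=2,
    pp_style="interleaved",  # assumed flag selecting the interleaved schedule
    num_model_chunks=2,      # assumed: virtual pipeline chunks per rank
    num_microbatches=8,
    precision="bf16",
)
booster = Booster(plugin=plugin)

# A llama model, optimizer, criterion, and dataloader would then be wrapped via
# model, optimizer, criterion, dataloader, _ = booster.boost(
#     model, optimizer, criterion, dataloader
# )
```

Launch with, for example, `torchrun --nproc_per_node 4 train.py` so that tp_size * pp_size matches the number of ranks.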
bert [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) 2023-12-22 10:44:00 +08:00
commons [example] make gpt example directory more clear (#2353) 2023-01-06 11:11:26 +08:00
gpt [bug] fix get_default_parser in examples (#4764) 2023-09-21 10:42:25 +08:00
llama2 [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) 2023-12-22 10:44:00 +08:00
openmoe [doc] add moe news (#5128) 2023-11-28 17:44:06 +08:00
opt [bug] fix get_default_parser in examples (#4764) 2023-09-21 10:42:25 +08:00
palm [nfc] fix minor typo in README (#4846) 2023-10-07 17:51:11 +08:00