ColossalAI/colossalai
Edenzzzz 2a25a2aff7
[Feature] optimize PP overlap (#5735)
* update to fully overlap, still debugging

* improve interface

* fixed deadlock bug

* debug NaN loss

* (experimental) use one comm group for send_fw_recv_fw to fix NaN

* cleaned up interfaces; use one batch p2p for all

* clean up; removed the double p2p batch case

* p2p test passed

* improve overlap: send fwd before backward

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* tentatively use 2 p2p batches

* remove two p2p batches

* fix typos

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* remove pp.sh
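
The "send fwd before backward" commit describes the core overlap idea: within a 1F1B pipeline step, launch the (asynchronous) send of the forward output before running the local backward pass, then wait on the communication handle afterwards, so the p2p transfer proceeds while backward compute runs. A minimal sketch of that ordering, with all names (`run_forward`, `send_forward_async`, `run_backward`) hypothetical and the async send reduced to an event log rather than a real `torch.distributed` call:

```python
# Hedged sketch (not ColossalAI's actual code): illustrates the scheduling
# change from this PR -- issue the forward-output send *before* the backward
# pass so communication overlaps compute. All function names are hypothetical.

events = []

def run_forward(microbatch):
    events.append(f"fwd:{microbatch}")
    return f"act{microbatch}"

def send_forward_async(act):
    # Stands in for an async p2p send (e.g. an isend handle); here we only
    # log, and return a callable playing the role of handle.wait().
    events.append(f"send_fwd:{act}")
    return lambda: events.append(f"wait:{act}")

def run_backward(microbatch):
    events.append(f"bwd:{microbatch}")

# Overlapped ordering: forward -> launch send -> backward -> wait on send.
act = run_forward(0)
handle = send_forward_async(act)
run_backward(0)   # backward compute proceeds while the send is in flight
handle()          # drain the communication handle at the end of the step

print(events)
```

In a real implementation the send would be one op in a single batched p2p call (the PR's "one batch p2p for all"), which avoids the send/recv ordering deadlocks mentioned in the earlier commits.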

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: root <root@notebook-c55824c0-7742-45e8-9591-c855bb77ad29-0.notebook-c55824c0-7742-45e8-9591-c855bb77ad29.colossal-ai.svc.cluster.local>
2024-06-26 14:48:02 +08:00
_C Clean up 2024-06-07 09:09:29 +00:00
_analyzer [test] Fix/fix testcase (#5770) 2024-06-03 15:26:01 +08:00
accelerator [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 2024-03-05 21:52:30 +08:00
amp [npu] change device to accelerator api (#5239) 2024-01-09 10:20:05 +08:00
auto_parallel [misc] refactor launch API and tensor constructor (#5666) 2024-04-29 10:40:11 +08:00
autochunk [hotfix] Fix examples no pad token & auto parallel codegen bug; (#5606) 2024-04-18 18:15:50 +08:00
booster [Feature] optimize PP overlap (#5735) 2024-06-26 14:48:02 +08:00
checkpoint_io [shardformer] upgrade transformers to 4.39.3 (#5815) 2024-06-14 10:59:33 +08:00
cli [devops] fix extention building (#5427) 2024-03-05 15:35:54 +08:00
cluster [Feature] optimize PP overlap (#5735) 2024-06-26 14:48:02 +08:00
context [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625) 2024-04-25 14:45:52 +08:00
device [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) 2024-05-14 13:52:45 +08:00
fx [test] Fix/fix testcase (#5770) 2024-06-03 15:26:01 +08:00
inference [Fix] Fix spec-dec Glide LlamaModel for compatibility with transformers (#5837) 2024-06-19 15:37:53 +08:00
interface [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) 2024-05-14 13:52:45 +08:00
kernel [NFC] Fix code factors on inference triton kernels (#5743) 2024-05-21 22:12:15 +08:00
lazy [lazy] fix lazy cls init (#5720) 2024-05-17 18:18:59 +08:00
legacy [sync] Sync feature/colossal-infer with main 2024-05-20 15:50:53 +00:00
logging [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
moe [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) 2024-03-05 21:52:30 +08:00
nn [misc] fix dist logger (#5782) 2024-06-05 15:04:22 +08:00
pipeline [Feature] optimize PP overlap (#5735) 2024-06-26 14:48:02 +08:00
quantization [Feature] qlora support (#5586) 2024-04-28 10:51:27 +08:00
shardformer change 'xxx if xxx else None' to 'xxx or None' 2024-06-18 03:32:42 +00:00
tensor [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) 2024-05-14 13:52:45 +08:00
testing [misc] Accelerate CI for zero and dist optim (#5758) 2024-06-05 11:25:19 +08:00
utils Merge pull request #5310 from hpcaitech/feature/npu 2024-01-29 13:49:39 +08:00
zero [gemini] fix missing return (#5845) 2024-06-21 11:38:40 +08:00
__init__.py [devops] remove post commit ci (#5566) 2024-04-08 15:09:40 +08:00
initialize.py [launch] Support IPv4 host initialization in launch (#5822) 2024-06-18 19:18:29 +08:00