Commit Graph

3698 Commits (f9546ba0bee31cbc00a5266b2d9efbacad8fafa8)

Author SHA1 Message Date
wangbluo 698c8b9804 fix 2024-08-21 03:58:21 +00:00
wangbluo 6aface9316 fix 2024-08-21 03:51:25 +00:00
wangbluo 193030f696 fix 2024-08-21 03:21:49 +00:00
wangbluo eb5ba40def fix the merge 2024-08-21 02:58:23 +00:00
Tong Li 39e2597426
[ColossalChat] Add PP support (#6001)
* support pp training

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update rm

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* refactor

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update test case

* fix

* change to 4

* fix eval

* test

* add pp

* hotfix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update

* skip pp eval

* update all reduce

* update sft

* update ignore

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update no cache

* add eval

* remove fi

* remove debug

* remove parentheses to avoid warning

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Revert "add eval"

This reverts commit 3ab2f6fa32.

* add all reduce

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-21 10:47:39 +08:00
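
The "skip pp eval" and "update all reduce" bullets above reflect a standard pipeline-parallel constraint: only the last stage holds a loss, so metrics are averaged across data-parallel ranks and the other stages skip reporting. A minimal sketch of that pattern (the group handle and helper name are illustrative, not ColossalChat's actual code):

```python
from typing import Optional

import torch
import torch.distributed as dist

def log_step_loss(loss: Optional[torch.Tensor], dp_group, is_pp_last_stage: bool) -> Optional[float]:
    """Average the loss across data-parallel ranks for logging; intermediate
    pipeline stages hold no loss tensor and simply skip reporting."""
    if not is_pp_last_stage or loss is None:
        return None
    loss = loss.detach().clone()
    dist.all_reduce(loss, op=dist.ReduceOp.AVG, group=dp_group)  # AVG requires NCCL
    return loss.item()
```
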
Hongxin Liu 0d3b0bd864
[plugin] add cast inputs option for zero (#6003) (#6022) 2024-08-21 10:21:26 +08:00
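
The "cast inputs" option in #6003 toggles whether the zero plugin's mixed-precision wrapper downcasts fp32 inputs to the compute dtype. A minimal sketch of the mechanism, with illustrative names rather than ColossalAI's actual classes:

```python
import torch
import torch.nn as nn

class MixedPrecisionWrapper(nn.Module):
    """Mixed-precision wrapper with an opt-out for input casting (sketch).

    cast_inputs=True downcasts floating-point inputs to the compute dtype;
    cast_inputs=False passes them through, which matters when a module must
    see fp32 activations. Names are illustrative, not ColossalAI's API.
    """

    def __init__(self, module: nn.Module, dtype=torch.float16, cast_inputs: bool = True):
        super().__init__()
        self.module = module.to(dtype)
        self.dtype = dtype
        self.cast_inputs = cast_inputs

    def forward(self, *args):
        if self.cast_inputs:
            args = tuple(
                a.to(self.dtype) if torch.is_tensor(a) and a.is_floating_point() else a
                for a in args
            )
        return self.module(*args)
```
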
wangbluo 2d362ac090 fix merge 2024-08-20 09:26:04 +00:00
wangbluo 2e4cbe3a2d fix 2024-08-20 09:11:02 +00:00
wangbluo 2ee6235cfa fix 2024-08-20 06:48:16 +00:00
wangbluo f7acfa1bd5 fix 2024-08-20 05:07:58 +00:00
wangbluo 53823118f2 fix 2024-08-20 03:20:13 +00:00
Edenzzzz dcc44aab8d
[misc] Use dist logger in plugins (#6011)
* use dist logger in plugins

* remove trash

* print on rank 0

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-20 10:32:41 +08:00
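
The "print on rank 0" bullet is the usual guard for distributed logging; a generic sketch of the pattern (ColossalAI ships its own distributed logger, so this is illustrative only):

```python
import logging
import torch.distributed as dist

logger = logging.getLogger("plugin")

def log_rank0(msg: str) -> None:
    # Emit only on rank 0 (or when not distributed) to avoid N duplicate lines.
    if not dist.is_initialized() or dist.get_rank() == 0:
        logger.info(msg)
```
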
wangbluo 1f703e0ef4 fix 2024-08-19 10:15:16 +00:00
wangbluo 88b3f0698c fix the merge 2024-08-19 10:11:27 +00:00
wangbluo 2eb36839c6 fix 2024-08-19 09:23:10 +00:00
wangbluo 12b44012d9 fix 2024-08-19 09:02:16 +00:00
wangbluo 0d8e82a024 Merge branch 'fp8_merge' of https://github.com/wangbluo/ColossalAI into fp8_merge 2024-08-19 08:10:27 +00:00
wangbluo 4c82bfcc54 fix the merge 2024-08-19 08:09:34 +00:00
pre-commit-ci[bot] 64aad96723 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-08-19 08:08:46 +00:00
wangbluo 3353042525 fix the merge 2024-08-19 08:07:51 +00:00
Edenzzzz f1c3266a94
overlap kv comm with output rescale (#6017)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-19 14:08:17 +08:00
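
#6017 hides the ring-attention k/v exchange behind the output-rescale compute. A sketch of the overlap pattern using torch.distributed batched P2P ops; attn_fn and the rescale math are schematic stand-ins for the real kernels, with out assumed [..., S, D] and lse [..., S]:

```python
import torch
import torch.distributed as dist

def ring_attn_step(q, k, v, out, lse, attn_fn, group=None):
    """One ring step (sketch): post the next k/v exchange first, then rescale
    and accumulate the current partial outputs while the transfer is in flight.
    attn_fn returns (block_out, block_lse)."""
    rank = dist.get_rank(group)
    world = dist.get_world_size(group)
    send_to, recv_from = (rank + 1) % world, (rank - 1) % world
    k_next, v_next = torch.empty_like(k), torch.empty_like(v)
    reqs = dist.batch_isend_irecv([
        dist.P2POp(dist.isend, k, send_to, group=group),
        dist.P2POp(dist.irecv, k_next, recv_from, group=group),
        dist.P2POp(dist.isend, v, send_to, group=group),
        dist.P2POp(dist.irecv, v_next, recv_from, group=group),
    ])  # communication is now in flight...
    block_out, block_lse = attn_fn(q, k, v)        # ...overlapped with compute
    new_lse = torch.logaddexp(lse, block_lse)      # online-softmax bookkeeping
    out = (out * torch.exp(lse - new_lse).unsqueeze(-1)
           + block_out * torch.exp(block_lse - new_lse).unsqueeze(-1))
    for r in reqs:
        r.wait()
    return out, new_lse, k_next, v_next
```
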
wangbluo 1a5847e6d1 fix the merge 2024-08-19 03:28:29 +00:00
wangbluo 52289e4c63 Merge branch 'fp8_merge' of https://github.com/wangbluo/ColossalAI into fp8_merge 2024-08-19 02:27:30 +00:00
wangbluo 02636c5bef fix the merge 2024-08-19 02:26:52 +00:00
pre-commit-ci[bot] 81272e9d00 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-08-17 09:37:39 +00:00
wangbluo 4cf79fa275 merge 2024-08-17 09:34:18 +00:00
Hongxin Liu 26493b97d3
[misc] update compatibility (#6008)
* [misc] update compatibility

* [misc] update requirements

* [devops] disable requirements cache

* [test] fix torch ddp test

* [test] fix rerun on address in use

* [test] fix lazy init
2024-08-16 18:49:14 +08:00
Edenzzzz f5c84af0b0
[Feature] Zigzag Ring attention (#5905)
* halfway

* fix cross-PP-stage position id length diff bug

* fix typo

* fix typo

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* unified cross entropy func for all shardformer models

* remove redundant lines

* add basic ring attn; debug cross entropy

* fwd bwd logic complete

* fwd bwd logic complete; add experimental triton rescale

* precision tests passed

* precision tests passed

* fix typos and remove misc files

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* add sp_mode to benchmark; fix varlen interface

* update softmax_lse shape by new interface

* change tester name

* remove buffer clone; support packed seq layout

* add varlen tests

* fix typo

* all tests passed

* add dkv_group; fix mask

* remove debug statements

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-16 13:56:38 +08:00
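
The zigzag layout in #5905 balances causal attention across ranks by pairing an early chunk of the sequence with its mirrored late chunk, so no rank is stuck with only the expensive tail. A minimal sketch of the split:

```python
import torch

def zigzag_shard(seq: torch.Tensor, rank: int, world_size: int, dim: int = 1) -> torch.Tensor:
    """Zigzag split for causal ring attention (sketch): cut the sequence into
    2 * world_size chunks and give rank i chunk i plus its mirror chunk
    (2 * world_size - 1 - i), so each rank gets an equal mix of cheap early
    positions and expensive late positions under the causal mask."""
    chunks = seq.chunk(2 * world_size, dim=dim)
    return torch.cat([chunks[rank], chunks[2 * world_size - 1 - rank]], dim=dim)
```
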
flybird11111 0a51319113
[fp8] zero support fp8 linear. (#6006)
* fix

* fix

* fix

* zero fp8

* zero fp8

* Update requirements.txt
2024-08-16 10:13:07 +08:00
Wang Binluo 3f09a6145f
[fp8] add use_fp8 option for MoeHybridParallelPlugin (#6009) 2024-08-16 10:12:50 +08:00
flybird11111 20722a8c93
[fp8] update reduce-scatter test (#6002)
* fix

* fix

* fix

* fix
2024-08-15 14:40:54 +08:00
Haze188 887d2d579b
[misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) 2024-08-15 14:40:26 +08:00
pre-commit-ci[bot] 4dd03999ec
[pre-commit.ci] pre-commit autoupdate (#5995)
updates:
- [github.com/psf/black-pre-commit-mirror: 24.4.2 → 24.8.0](https://github.com/psf/black-pre-commit-mirror/compare/24.4.2...24.8.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-15 14:40:03 +08:00
botbw 1a2e90dcc1 [fp8] linear perf enhancement 2024-08-15 13:43:08 +08:00
Hongxin Liu 406f984063
[plugin] add cast inputs option for zero (#6003) 2024-08-15 10:41:22 +08:00
botbw 88fa096d78
[fp8] update torch.compile for linear_fp8 to >= 2.4.0 (#6004) 2024-08-15 10:14:42 +08:00
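
#5979 and #6004 gate torch.compile on the installed torch version because torch._scaled_mm's interface changed in 2.4 (the extra amax return was dropped). A hedged sketch of the pattern; scale handling and the required column-major weight layout are simplified:

```python
import torch
from packaging import version  # common dependency, assumed available

def _fp8_matmul(x_fp8: torch.Tensor, w_fp8_t: torch.Tensor,
                scale_x: torch.Tensor, scale_w: torch.Tensor) -> torch.Tensor:
    # torch._scaled_mm performs the fp8 GEMM; w_fp8_t must be column-major,
    # and from torch 2.4 the call returns a single tensor, which is what the
    # version gate below protects.
    return torch._scaled_mm(
        x_fp8, w_fp8_t, scale_a=scale_x, scale_b=scale_w,
        out_dtype=torch.bfloat16, use_fast_accum=True,
    )

# The "formal version check" from #5979, sketched: compile only when supported.
if version.parse(torch.__version__) >= version.parse("2.4.0"):
    _fp8_matmul = torch.compile(_fp8_matmul, mode="max-autotune")
```
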
flybird11111 597b206001
[fp8] support asynchronous FP8 communication (#5997)
* fix

* fix

* fix

* support async all2all

* support async op for all gather

* fix

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-14 14:08:19 +08:00
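
The async collectives in #5997 return the communication handle so the fp8 payload can travel while compute proceeds. A sketch of the idea for all-gather; the per-rank scale exchange and the uint8 view are simplifications of how fp8 payloads are actually shipped:

```python
import torch
import torch.distributed as dist

def all_gather_fp8_async(out_list, tensor, group=None):
    """Cast to fp8, launch the all-gather with async_op=True, and hand the
    caller a finalizer to run once the overlapped compute is done."""
    fmax = torch.finfo(torch.float8_e4m3fn).max
    scale = (tensor.abs().amax().float() / fmax).clamp(min=1e-12).reshape(1)
    payload = (tensor / scale).to(torch.float8_e4m3fn).view(torch.uint8)  # ship raw bytes
    world = dist.get_world_size(group)
    bufs = [torch.empty_like(payload) for _ in range(world)]
    scales = [torch.empty_like(scale) for _ in range(world)]
    dist.all_gather(scales, scale, group=group)              # tiny, done up front
    handle = dist.all_gather(bufs, payload, group=group, async_op=True)

    def finalize():
        handle.wait()
        for out, chunk, s in zip(out_list, bufs, scales):
            out.copy_(chunk.view(torch.float8_e4m3fn).to(out.dtype) * s)

    return finalize
```
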
Tong Li ceb1e262e7
fix sync condition (#6000) 2024-08-14 11:22:39 +08:00
Hongxin Liu 0978080a69
[fp8] refactor fp8 linear with compile (#5993)
* [fp8] refactor fp8 linear with compile

* [fp8] fix linear test

* [fp8] fix linear test
2024-08-13 16:07:26 +08:00
Wang Binluo b2483c8e31
[fp8] support hybrid parallel plugin (#5982)
* support fp8 comm for qwen2 model

* support fp8 comm for qwen2 model

* support fp8 comm for qwen2 model

* fp8

* fix

* bert and bloom

* chatglm and command

* gpt2,gptj,bert, falcon,blip2

* mistral, opt, sam, t5, vit, whisper

* fix

* fix

* fix
2024-08-12 18:17:05 +08:00
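
At the user level the fp8 path is a plugin switch. An illustrative sketch; the flag name follows the "fp8_comm flag" / "use_fp8 option" wording in the PRs on this page and may not match the released ColossalAI signature exactly:

```python
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

colossalai.launch_from_torch()  # newer releases need no config argument
plugin = HybridParallelPlugin(
    tp_size=2,
    pp_size=1,
    precision="bf16",
    fp8_communication=True,  # assumed flag: ship collectives' payloads as fp8
)
booster = Booster(plugin=plugin)
# model, optimizer, ... = booster.boost(model, optimizer, ...)
```
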
YeAnbang ed97d3a5d3
[Chat] fix readme (#5989)
* fix readme

* fix readme, tokenization fully tested

* fix readme, tokenization fully tested

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: root <root@notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9-0.notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9.colossal-ai.svc.cluster.local>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-12 14:55:17 +08:00
flybird11111 f1a3a326c4
[fp8] MoE support fp8 communication (#5977)
* fix

* support moe fp8

* fix

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix

* fix

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix

* fix

* fix

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-09 18:26:02 +08:00
Edenzzzz b4d2377d4c
[Hotfix] Avoid fused RMSnorm import error without apex (#5985)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-09 18:17:09 +08:00
botbw e4aadeee20
[fp8] use torch compile (torch >= 2.3.0) (#5979)
* [fp8] use torch compile (torch >= 2.4.0)

* [fp8] set use_fast_accum in linear

* [chore] formal version check

* [chore] fix sig
2024-08-09 15:51:06 +08:00
Hongxin Liu 8241c0c054
[fp8] support gemini plugin (#5978)
* [fp8] refactor hook

* [fp8] support gemini plugin

* [example] add fp8 option for llama benchmark
2024-08-09 14:09:48 +08:00
Tong Li ad3fa4f49c
[Hotfix] README link (#5966)
* update ignore

* update readme

* run style

* update readme
2024-08-08 18:04:47 +08:00
flybird11111 4b9bec8176
[test ci] Feature/fp8 comm (#5981)
* fix

* fix

* fix
2024-08-08 17:19:21 +08:00
Hanks b480eec738
[Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)
* support fp8_communication in the Torch DDP grad comm, FSDP grad comm, and FSDP params comm

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* implement communication hook for FSDP params all-gather

* added unit test for fp8 operators

* support fp8 communication in GeminiPlugin

* update training scripts to support fsdp and fp8 communication

* fixed some minor bugs observed in unit test

* add all_gather_into_tensor_flat_fp8

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* skip the test if torch < 2.2.0

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* skip the test if torch < 2.2.0

* skip the test if torch < 2.2.0

* add fp8_comm flag

* rebase latest fp8 operators

* rebase latest fp8 operators

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-08 15:55:01 +08:00
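
For DDP, #5928 plugs compression into gradient communication via a communication hook, the mechanism behind torch's built-in fp16_compress_hook. A sketch swapped to fp8 with simplified scale handling (the real implementation's quantization details will differ); collectives cannot sum fp8 payloads directly, so the sketch gathers and reduces locally:

```python
import torch
import torch.distributed as dist

def fp8_compress_hook(process_group, bucket: dist.GradBucket) -> torch.futures.Future:
    """DDP gradient comm hook (sketch): all-gather fp8-quantized buckets and
    average after decompression."""
    group = process_group if process_group is not None else dist.group.WORLD
    world_size = dist.get_world_size(group)
    grad = bucket.buffer()
    fmax = torch.finfo(torch.float8_e4m3fn).max
    scale = (grad.abs().amax().float() / fmax).clamp(min=1e-12).reshape(1)
    payload = (grad / scale).to(torch.float8_e4m3fn).view(torch.uint8)
    bufs = [torch.empty_like(payload) for _ in range(world_size)]
    scales = [torch.empty_like(scale) for _ in range(world_size)]
    dist.all_gather(scales, scale, group=group)
    fut = dist.all_gather(bufs, payload, group=group, async_op=True).get_future()

    def decompress(_):
        acc = torch.zeros_like(grad)
        for chunk, s in zip(bufs, scales):
            acc += chunk.view(torch.float8_e4m3fn).to(grad.dtype) * s
        return acc.div_(world_size)  # DDP expects the averaged bucket gradient

    return fut.then(decompress)
```

Such a hook would be registered with ddp_model.register_comm_hook(None, fp8_compress_hook).
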
flybird11111 7739629b9d
fix (#5976) 2024-08-07 18:58:39 +08:00
Hongxin Liu ccabcf6485
[fp8] support fp8 amp for hybrid parallel plugin (#5975)
* [fp8] support fp8 amp for hybrid parallel plugin

* [test] add fp8 hook test

* [fp8] fix fp8 linear compatibility
2024-08-07 18:21:08 +08:00