3dc08c8a5a  fix  (wangbluo, 1 month ago)
8ff7d0c780  fix  (wangbluo, 2 months ago)
fe9208feac  fix  (wangbluo, 2 months ago)
3201377e94  fix  (wangbluo, 2 months ago)
23199e34cc  fix  (wangbluo, 2 months ago)
d891e50617  fix  (wangbluo, 2 months ago)
e1e86f9f1f  fix  (wangbluo, 2 months ago)
703bb5c18d  fix the test  (wangbluo, 2 months ago)
4e0e99bb6a  fix the test  (wangbluo, 2 months ago)
1507a7528f  fix  (wangbluo, 2 months ago)
0002ae5956  fix  (wangbluo, 2 months ago)
efe3042bb2  fix  (wangbluo, 2 months ago)
5ecc27e150  fix  (wangbluo, 2 months ago)
f98384aef6  fix  (wangbluo, 2 months ago)
b635dd0669  fix  (wangbluo, 2 months ago)
3532f77b90  fix  (wangbluo, 2 months ago)
3fab92166e  fix  (wangbluo, 2 months ago)
6705dad41b  fix  (wangbluo, 2 months ago)
91ed32c256  fix  (wangbluo, 2 months ago)
6fb1322db1  fix  (wangbluo, 2 months ago)
65c8297710  fix the attn  (wangbluo, 2 months ago)
cfd9eda628  fix the ring attn  (wangbluo, 2 months ago)
10e4f7da72  fix  (wangbluo, 2 months ago)
37e35230ff  Merge pull request #6061 from wangbluo/sp_fix  (Wang Binluo, 3 months ago)
    [sp]: fix the attention kernel for sp
827ef3ee9a  fix  (wangbluo, 3 months ago)
bdb125f83f  [doc] FP8 training and communication document (#6050)  (Guangyao Zhang, 3 months ago)
    * Add FP8 training and communication document
    * add fp8 docstring for plugins
    * fix typo
    * fix typo
f20b066c59  [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059)  (Guangyao Zhang, 3 months ago)
    * all_gather only internode, fix pytest
    * fix cuda arch <89 compile pytest error
    * fix pytest failure
    * disable all_gather_into_tensor_flat_fp8
    * fix fp8 format
    * fix pytest
    * fix conversations
    * fix chunk tuple to list
b582319273  fix  (wangbluo, 3 months ago)
0ad3129cb9  fix  (wangbluo, 3 months ago)
0b14a5512e  fix  (wangbluo, 3 months ago)
696fced0d7  [fp8] fix missing fp8_comm flag in mixtral (#6057)  (botbw, 3 months ago)
dc032172c3  fix  (wangbluo, 3 months ago)
f393867cff  fix  (wangbluo, 3 months ago)
6eb8832366  fix  (wangbluo, 3 months ago)
683179cefd  fix  (wangbluo, 3 months ago)
0a01e2a453  fix the attn  (wangbluo, 3 months ago)
216d54e374  [pre-commit.ci] auto fixes from pre-commit.com hooks  (pre-commit-ci[bot], 3 months ago)
    for more information, see https://pre-commit.ci
fdd84b9087  fix the sp  (wangbluo, 3 months ago)
a35a078f08  [doc] update sp doc (#6055)  (flybird11111, 3 months ago)
    * update sp doc
    * fix
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * fix
    * fix
    * fix
    ---------
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
13946c4448  [fp8] hotfix backward hook (#6053)  (Hongxin Liu, 3 months ago)
    * [fp8] hotfix backward hook
    * [fp8] hotfix pipeline loss accumulation
c54c4fcd15  [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)  (botbw, 3 months ago)
    * [example] pass use_fp8_comm flag to all plugins
    * [example] add mixtral benchmark
    * [moe] refine assertion and check
    * [moe] fix mixtral & add more tests
    * [moe] consider checking dp * sp group and moe_dp_group
    * [mixtral] remove gate tp & add more tests
    * [deepseek] fix tp & sp for deepseek
    * [mixtral] minor fix
    * [deepseek] add deepseek benchmark
8fd25d6e09  [Feature] Split cross-entropy computation in SP (#5959)  (Wenxuan Tan, 3 months ago)
    * halfway
    * fix cross-PP-stage position id length diff bug
    * fix typo
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * unified cross entropy func for all shardformer models
    * remove redundant lines
    * add basic ring attn; debug cross entropy
    * fwd bwd logic complete
    * fwd bwd logic complete; add experimental triton rescale
    * precision tests passed
    * precision tests passed
    * fix typos and remove misc files
    * update softmax_lse shape by new interface
    * change tester name
    * remove buffer clone; support packed seq layout
    * add varlen tests
    * fix typo
    * all tests passed
    * add dkv_group; fix mask
    * remove debug statements
    * adapt chatglm, command-R, qwen
    * debug
    * halfway
    * fix cross-PP-stage position id length diff bug
    * fix typo
    * fix typo
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * unified cross entropy func for all shardformer models
    * remove redundant lines
    * add basic ring attn; debug cross entropy
    * fwd bwd logic complete
    * fwd bwd logic complete; add experimental triton rescale
    * precision tests passed
    * precision tests passed
    * fix typos and remove misc files
    * add sp_mode to benchmark; fix varlen interface
    * update softmax_lse shape by new interface
    * add varlen tests
    * fix typo
    * all tests passed
    * add dkv_group; fix mask
    * remove debug statements
    * add comments
    * q1 index only once
    * remove events to simplify stream sync
    * simplify forward/backward logic
    * 2d ring forward passed
    * 2d ring backward passed
    * fixes
    * fix ring attn loss
    * 2D ring backward + llama passed
    * merge
    * update logger
    * fix typo
    * rebase
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * fix typo
    * remove typos
    * fixes
    * support GPT
    ---------
    Co-authored-by: Edenzzzz <wtan45@wisc.edu>
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
b3db1058ec  [release] update version (#6041)  (Hongxin Liu, 3 months ago)
    * [release] update version
    * [devops] update comp test
    * [devops] update comp test debug
    * [devops] debug comp test
    * [devops] debug comp test
    * [devops] debug comp test
    * [devops] debug comp test
    * [devops] debug comp test
5ce6dd75bf  [fp8] disable all_to_all_fp8 in intranode (#6045)  (Hanks, 3 months ago)
    * enhance all_to_all_fp8 with internode comm control
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * disable some fp8 ops due to performance issue
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    ---------
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
26e553937b  [fp8] fix linear hook (#6046)  (Hongxin Liu, 3 months ago)
c3b5caff0e  [fp8] optimize all-gather (#6043)  (Hongxin Liu, 3 months ago)
    * [fp8] optimize all-gather
    * [fp8] fix all gather fp8 ring
    * [fp8] enable compile
    * [fp8] fix all gather fp8 ring
c650a906db  [Hotfix] Remove deprecated install (#6042)  (Tong Li, 3 months ago)
    * remove deprecated install
    * remove unused folder
e9032fb0b2  [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020)  (Gao, Ruiyuan, 3 months ago)
    * fix bug in load_state_dict_into_model; format error msg
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * Update utils.py to support checking missing_keys
    * Update general_checkpoint_io.py to fix bug in missing_keys error message
    * retrigger tests
    ---------
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
e96a0761ea  [FP8] unsqueeze scale to make it compatible with torch.compile (#6040)  (Guangyao Zhang, 3 months ago)
0d3a85d04f  add fused norm (#6038)  (Tong Li, 3 months ago)