Commit Graph

3877 Commits (feature/zerobubble)
Author SHA1 Message Date
wangbluo 683179cefd fix (2 months ago)
wangbluo 0a01e2a453 fix the attn (2 months ago)
pre-commit-ci[bot] 216d54e374 [pre-commit.ci] auto fixes from pre-commit.com hooks (2 months ago)
wangbluo fdd84b9087 fix the sp (2 months ago)
duanjunwen 9bc3b6e220 [feat] moehybrid support zerobubble; (2 months ago)
flybird11111 a35a078f08 [doc] update sp doc (#6055) (2 months ago)
Hongxin Liu 13946c4448 [fp8] hotfix backward hook (#6053) (2 months ago)
duanjunwen 11ae6848c6 [zerobubble] Support ZeroBubble Pipeline (#6034) (3 months ago)
botbw c54c4fcd15 [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) (3 months ago)
Wenxuan Tan 8fd25d6e09 [Feature] Split cross-entropy computation in SP (#5959) (3 months ago)
Hongxin Liu b3db1058ec [release] update version (#6041) (3 months ago)
duanjunwen 6c2a120bed [fix] add testcase with microbatch 4; (3 months ago)
duanjunwen 8366a7855f [fix] update optim state dict assert (include param group & state); fix mem assert after add optim; (3 months ago)
duanjunwen ce58d8e8bf [fix] add output_obj_grad assert None at bwd b step; replace input_obj.require_grad_ with treemap; (3 months ago)
duanjunwen 7568b34626 [fix] fix redundant detach & clone; add buffer assertation in the end; (3 months ago)
duanjunwen fed8b1587d [fix] fix model zoo import; (3 months ago)
duanjunwen a5ec3d4285 [fix] fix mem; use a new model shape; only assert mem less and equal than theo; (3 months ago)
Hanks 5ce6dd75bf [fp8] disable all_to_all_fp8 in intranode (#6045) (3 months ago)
duanjunwen 35a7b636b3 [fix] fix mem assertation (3 months ago)
duanjunwen 400e5e5b23 [fix] mem assertation' (3 months ago)
duanjunwen 4a358348c7 [fix] fix mem check; (3 months ago)
duanjunwen 2f09c374f3 [feat] add memory assertation; (3 months ago)
duanjunwen e6e1a97a6d [fix] fix requir grad position and detach position and input&output local buffer append position; (3 months ago)
duanjunwen 20503cdfdf [fix] rm requir_grad for output; (3 months ago)
duanjunwen b4103f125c [fix] fix detach output & release output; (3 months ago)
duanjunwen 4c1f81c683 [fix] fix bwd step if condition; remove useless comments and format info; (3 months ago)
Hongxin Liu 26e553937b [fp8] fix linear hook (#6046) (3 months ago)
Hongxin Liu c3b5caff0e [fp8] optimize all-gather (#6043) (3 months ago)
duanjunwen ab643c9af7 [fix] rm output.data after send fwd; (3 months ago)
duanjunwen a48afc4a66 [fix] fix optim bwd; (3 months ago)
Tong Li c650a906db [Hotfix] Remove deprecated install (#6042) (3 months ago)
duanjunwen 591a13bf7e [fix] fix optim bwd; (3 months ago)
duanjunwen 77fe44286c [fix] rm zbv in hybridplugin (3 months ago)
duanjunwen 6d18d38d5c [feat] update test; rm comments; (3 months ago)
Gao, Ruiyuan e9032fb0b2 [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020) (3 months ago)
duanjunwen a7b767b071 [fix] fix communication_map; (3 months ago)
duanjunwen 8eb6eac225 [fix] fix optim bwd; add license for v_schedule; remove redundant attributes; fix schedule loop "while"--> "for"; add communication dict; (3 months ago)
duanjunwen 6af81d8c0d [feat] add fwd_bwd_step, run_fwd_only; (3 months ago)
duanjunwen 48ba22dbfd [feat] fix optimizer bwd b & w; support return accum loss & output (3 months ago)
Guangyao Zhang e96a0761ea [FP8] unsqueeze scale to make it compatible with torch.compile (#6040) (3 months ago)
duanjunwen 4c4b01b859 [feat] add optim backward_b_by_grad (3 months ago)
Tong Li 0d3a85d04f add fused norm (#6038) (3 months ago)
Tong Li 4a68efb7da [Colossal-LLaMA] Refactor latest APIs (#6030) (3 months ago)
duanjunwen b1419ef76a [fix] fix poc test; add comments in poc; (3 months ago)
duanjunwen 582ba0d6ff [feat] fix func name & ci; add comments; (3 months ago)
duanjunwen b5f7b4d228 [feat] fix poc format (3 months ago)
duanjunwen d6e3d7d2a3 [feat] fix ci; add assert; (3 months ago)
duanjunwen 29383b2de0 [fix] update (3 months ago)
Hongxin Liu cc1b0efc17 [plugin] hotfix zero plugin (#6036) (3 months ago)
duanjunwen fe209164f1 [feat] add apply v_schedule graph; p & p.grad assert err exist; (3 months ago)