Commit Graph

292 Commits (f40b718959b496c797da8dfa17194b63858fc2b1)

Author SHA1 Message Date
Frank Lee 015af592f8 [shardformer] integrated linear 1D with dtensor (#3996)
1 year ago
FoolPlayer ab8a47f830 [shardformer] add Dropout layer support different dropout pattern (#3856)
1 year ago
FoolPlayer 8cc11235c0 [shardformer]: Feature/shardformer, add some docstring and readme (#3816)
1 year ago
github-actions[bot] a52f62082d [format] applied code formatting on changed files in pull request 4021 (#4022)
1 year ago
Frank Lee ddcf58cacf Revert "[sync] sync feature/shardformer with develop"
2 years ago
FoolPlayer 21a3915c98 [shardformer] add Dropout layer support different dropout pattern (#3856)
2 years ago
FoolPlayer 58f6432416 [shardformer]: Feature/shardformer, add some docstring and readme (#3816)
2 years ago
digger yu 0e484e6201 [nfc]fix typo colossalai/pipeline tensor nn (#3899)
2 years ago
digger yu 1878749753 [nfc] fix typo colossalai/nn (#3887)
2 years ago
Hongxin Liu ae02d4e4f7 [bf16] add bf16 support (#3882)
2 years ago
digger yu 9265f2d4d7 [NFC]fix typo colossalai/auto_parallel nn utils etc. (#3779)
2 years ago
digger-yu b9a8dff7e5 [doc] Fix typo under colossalai and doc(#3618)
2 years ago
Hongxin Liu 152239bbfa [gemini] gemini supports lazy init (#3379)
2 years ago
ver217 26b7aac0be [zero] reorganize zero/gemini folder structure (#3424)
2 years ago
HELSON 1a1d68b053 [moe] add checkpoint for moe models (#3354)
2 years ago
Tong Li 196d4696d0 [NFC] polish colossalai/nn/_ops/addmm.py code style (#3274)
2 years ago
Yuanchen d58fa705b2 [NFC] polish code style (#3268)
2 years ago
github-actions[bot] 82503a96f2 [format] applied code formatting on changed files in pull request 2997 (#3008)
2 years ago
binmakeswell 52a5078988 [doc] add ISC tutorial (#2997)
2 years ago
ver217 823f3b9cf4 [doc] add deepspeed citation and copyright (#2996)
2 years ago
zbian 61e687831d fixed using zero with tp cannot access weight correctly
2 years ago
Jiatong (Julius) Han 8c8a39be95 [hotfix]: Remove math.prod dependency (#2837)
2 years ago
junxu c52edcf0eb Rename class method of ZeroDDP (#2692)
2 years ago
HELSON 56ddc9ca7a [hotfix] add correct device for fake_param (#2796)
2 years ago
HELSON 8213f89fd2 [gemini] add fake_release_chunk for keep-gathered chunk in the inference mode (#2671)
2 years ago
binmakeswell 9ab14b20b5 [doc] add CVPR tutorial (#2666)
2 years ago
ver217 5b1854309a [hotfix] fix zero ddp warmup check (#2545)
2 years ago
HELSON a4ed9125ac [hotfix] fix lightning error (#2529)
2 years ago
HELSON 66dfcf5281 [gemini] update the gpt example (#2527)
2 years ago
HELSON b528eea0f0 [zero] add zero wrappers (#2523)
2 years ago
HELSON 707b11d4a0 [gemini] update ddp strict mode (#2518)
2 years ago
HELSON 2d1a7dfe5f [zero] add strict ddp mode (#2508)
2 years ago
HELSON 2bfeb24308 [zero] add warning for ignored parameters (#2446)
2 years ago
HELSON 5521af7877 [zero] fix state_dict and load_state_dict for ddp ignored parameters (#2443)
2 years ago
HELSON 7829aa094e [ddp] add is_ddp_ignored (#2434)
2 years ago
HELSON bb4e9a311a [zero] add inference mode and its unit test (#2418)
2 years ago
HELSON dddacd2d2c [hotfix] add norm clearing for the overflow step (#2416)
2 years ago
HELSON ea13a201bb [polish] polish code for get_static_torch_model (#2405)
2 years ago
Frank Lee 551cafec14 [doc] updated kernel-related optimisers' docstring (#2385)
2 years ago
eric8607242 9880fd2cd8 Fix state_dict key missing issue of the ZeroDDP (#2363)
2 years ago
Frank Lee 40d376c566 [setup] support pre-build and jit-build of cuda kernels (#2374)
2 years ago
HELSON 48d33b1b17 [gemini] add get static torch model (#2356)
2 years ago
Jiarui Fang 16cc8e6aa7 [builder] MOE builder (#2277)
2 years ago
zbian e94c79f15b improved allgather & reducescatter for 3d
2 years ago
Jiarui Fang af32022f74 [Gemini] fix the convert_to_torch_module bug (#2269)
2 years ago
HELSON 2458659919 [zero] fix error for BEiT models (#2169)
2 years ago
Jiarui Fang 355ffb386e [builder] unified cpu_optim fused_optim inferface (#2190)
2 years ago
Jiarui Fang 9587b080ba [builder] use runtime builder for fused_optim (#2189)
2 years ago
Jiarui Fang d42afd30f8 [builder] runtime adam and fused_optim builder (#2184)
2 years ago
Tongping Liu ab54fed292 [hotfix] add kwargs for colo_addmm (#2171)
2 years ago