Commit Graph

35 Commits (80a8ca916a740e913cbedf60caeadc0bab5cb4fa)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Frank Lee | 7cfed5f076 | [feat] refactored extension module (#5298) | 10 months ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| Hongxin Liu | b5f9e37c70 | [legacy] clean up legacy code (#4743) | 1 year ago |
| Hongxin Liu | 554aa9592e | [legacy] move communication and nn to legacy and refactor logger (#4671) | 1 year ago |
| digger yu | 9265f2d4d7 | [NFC]fix typo colossalai/auto_parallel nn utils etc. (#3779) | 2 years ago |
| Frank Lee | 80eba05b0a | [test] refactor tests with spawn (#3452) | 2 years ago |
| YH | a848091141 | Fix port exception type (#2925) | 2 years ago |
| Nikita Shulga | 01066152f1 | Don't use `torch._six` (#2775) | 2 years ago |
| HELSON | 7829aa094e | [ddp] add is_ddp_ignored (#2434) | 2 years ago |
| Frank Lee | 40d376c566 | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2 years ago |
| Jiarui Fang | 355ffb386e | [builder] unified cpu_optim fused_optim inferface (#2190) | 2 years ago |
| Jiarui Fang | 9587b080ba | [builder] use runtime builder for fused_optim (#2189) | 2 years ago |
| ver217 | f8a7148dec | [kernel] move all symlinks of kernel to `colossalai._C` (#1971) | 2 years ago |
| Frank Lee | 5a52e21fe3 | [test] fixed the activation codegen test (#1447) | 2 years ago |
| ver217 | 821c6172e2 | [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442) | 2 years ago |
| ver217 | a45ddf2d5f | [hotfix] fix sharded optim step and clip_grad_norm (#1226) | 2 years ago |
| YuliangLiu0306 | e27645376d | [hotfix]different overflow status lead to communication stuck. (#1175) | 2 years ago |
| ver217 | ab8c6b4a0e | [zero] refactor memstats collector (#706) | 3 years ago |
| アマデウス | 54e688b623 | moved ensure_path_exists to utils.common (#591) | 3 years ago |
| Liang Bowen | ec5086c49c | Refactored docstring to google style | 3 years ago |
| HELSON | f24b5ed201 | [MOE] remove old MoE legacy (#493) | 3 years ago |
| HELSON | 7544347145 | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 3 years ago |
| Frank Lee | b72b8445c6 | optimized context test time consumption (#446) | 3 years ago |
| Jiarui Fang | 5a560a060a | Feature/zero (#279) | 3 years ago |
| アマデウス | 9ee197d0e9 | moved env variables to global variables; (#215) | 3 years ago |
| HELSON | 0f8c7f9804 | Fixed docstring in colossalai (#171) | 3 years ago |
| Frank Lee | e2089c5c15 | adapted for sequence parallel (#163) | 3 years ago |
| HELSON | dceae85195 | Added MoE parallel (#127) | 3 years ago |
| ver217 | 96780e6ee4 | Optimize pipeline schedule (#94) | 3 years ago |
| アマデウス | 01a80cd86d | Hotfix/Colossalai layers (#92) | 3 years ago |
| ver217 | 8f02a88db2 | add interleaved pipeline, fix naive amp and update pipeline model initializer (#80) | 3 years ago |
| Frank Lee | 35813ed3c4 | update examples and sphnix docs for the new api (#63) | 3 years ago |
| Frank Lee | da01c234e1 | Develop/experiments (#59) | 3 years ago |
| Frank Lee | 3defa32aee | Support TP-compatible Torch AMP and Update trainer API (#27) | 3 years ago |
| zbian | 404ecbdcc6 | Migrated project | 3 years ago |