Commit Graph

18 Commits (8432dc70802951227aee31da28b14f356aa3fe4c)

Author                 SHA1        Message                                                                        Date
ver217                 104cbbb313  [hotfix] add hybrid adam to __init__ (#584)                                    3 years ago
LuGY                   c44d797072  [docs] updatad docs of hybrid adam and cpu adam (#552)                         3 years ago
LuGY                   105c5301c3  [zero]added hybrid adam, removed loss scale in adam (#527)                     3 years ago
LuGY                   6a3f9fda83  [cuda] modify the fused adam, support hybrid of fp16 and fp32 (#497)          3 years ago
ver217                 9ec1ce6ab1  [zero] sharded model support the reuse of fp16 shard (#495)                   3 years ago
ver217                 62b0a8d644  [zero] sharded optim support hybrid cpu adam (#486)                            3 years ago
HELSON                 7544347145  [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)  3 years ago
Jiarui Fang            0fcfb1e00d  [test] make zero engine test really work (#447)                                3 years ago
Jiarui Fang            237d08e7ee  [zero] hybrid cpu adam (#445)                                                  3 years ago
Kai Wang (Victor Kai)  53bb3bcc0a  fix format (#362)                                                              3 years ago
LuGY                   a3269de5c9  [zero] cpu adam kernel (#288)                                                  3 years ago
HELSON                 0f8c7f9804  Fixed docstring in colossalai (#171)                                           3 years ago
ver217                 f68eddfb3d  refactor kernel (#142)                                                         3 years ago
Frank Lee              da01c234e1  Develop/experiments (#59)                                                      3 years ago
ver217                 dbe62c67b8  add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)            3 years ago
Frank Lee              3defa32aee  Support TP-compatible Torch AMP and Update trainer API (#27)                   3 years ago
ver217                 3c7604ba30  update documentation                                                           3 years ago
zbian                  404ecbdcc6  Migrated project                                                               3 years ago
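
The recurring thread in this history is the HybridAdam optimizer: the CPU Adam kernel (#288), the fused CUDA Adam with mixed fp16/fp32 support (#497), the hybrid CPU Adam and its ZeRO integration (#445, #486, #527), and finally the public export (#584). Below is a minimal usage sketch, assuming ColossalAI is installed with its optimizer kernels built and a CUDA device is available; the keyword arguments follow the standard Adam interface and have not been verified against this exact revision (8432dc7).

    # A minimal sketch, not a definitive example from this repository.
    import torch
    from colossalai.nn.optimizer import HybridAdam  # export added in #584

    model = torch.nn.Linear(16, 4).cuda()

    # HybridAdam updates CPU-resident parameters with the CPU Adam kernel
    # (#288) and GPU-resident ones with the fused CUDA Adam (#497), so a
    # single optimizer covers a model whose parameters are offloaded by ZeRO.
    optimizer = HybridAdam(model.parameters(), lr=1e-3, weight_decay=0.0)

    loss = model(torch.randn(8, 16, device="cuda")).sum()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()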