Commit Graph

1346 Commits (26b7aac0be10fb83692e197ca326f8b67c1c990b)

Author SHA1 Message Date
Frank Lee 9afb5c8b2d fixed typo in ShardParam (#294)
3 years ago
Frank Lee e17e54e32a added buffer sync to naive amp model wrapper (#291)
3 years ago
Jiarui Fang 8d653af408 add a common util for hooks registered on parameter. (#292)
3 years ago
Jie Zhu f867365aba bug fix: pass hook_list to engine (#273)
3 years ago
Jiarui Fang 5a560a060a Feature/zero (#279)
3 years ago
1SAA 82023779bb Added TPExpert for special situations
3 years ago
HELSON 36b8477228 Fixed parameter initialization in FFNExpert (#251)
3 years ago
アマデウス e13293bb4c fixed CI dataset directory; fixed import error of 2.5d accuracy (#255)
3 years ago
1SAA 219df6e685 Optimized MoE layer and fixed some bugs;
3 years ago
zbian 3dba070580 fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial
3 years ago
Frank Lee f5ca88ec97 fixed apex import (#227)
3 years ago
Frank Lee 3a1a9820b0 fixed mkdir conflict and align yapf config with flake (#220)
3 years ago
アマデウス 9ee197d0e9 moved env variables to global variables; (#215)
3 years ago
Frank Lee 812357d63c fixed utils docstring and add example to readme (#200)
3 years ago
Frank Lee 765db512b5 fixed ddp bug on torch 1.8 (#194)
3 years ago
Jiarui Fang 569357fea0 add pytorch hooks (#179)
3 years ago
ver217 708404d5f8 fix pipeline forward return tensors (#176)
3 years ago
HELSON 0f8c7f9804 Fixed docstring in colossalai (#171)
3 years ago
Frank Lee e2089c5c15 adapted for sequence parallel (#163)
3 years ago
puck_WCR 9473a1b9c8 AMP docstring/markdown update (#160)
3 years ago
Frank Lee f3802d6b06 fixed jit default setting (#154)
3 years ago
ver217 7bf1e98b97 pipeline last stage supports multi output (#151)
3 years ago
ver217 f68eddfb3d refactor kernel (#142)
3 years ago
BoxiangW 4a3d3446b0 Update layer integration documentations (#108)
3 years ago
ver217 9ef05ed1fc try import deepspeed when using zero (#130)
3 years ago
HELSON dceae85195 Added MoE parallel (#127)
3 years ago
ver217 293fb40c42 add scatter/gather optim for pipeline (#123)
3 years ago
Jiarui Fang 2c0c85d3d3 fix a bug in timer (#114)
3 years ago
ver217 7904baf6e1 fix layers/schedule for hybrid parallelization (#111) (#112)
3 years ago
ver217 a951bc6089 update default logger (#100) (#101)
3 years ago
ver217 96780e6ee4 Optimize pipeline schedule (#94)
3 years ago
アマデウス 01a80cd86d Hotfix/Colossalai layers (#92)
3 years ago
アマデウス 0fedef4f3c Layer integration (#83)
3 years ago
shenggan 5c3843dc98 add colossalai kernel module (#55)
3 years ago
ver217 8f02a88db2 add interleaved pipeline, fix naive amp and update pipeline model initializer (#80)
3 years ago
Frank Lee 91c327cb44 fixed zero level 3 dtype bug (#76)
3 years ago
HELSON 632e622de8 overlap computation and communication in 2d operations (#75)
3 years ago
Frank Lee cd9c28e055 added CI for unit testing (#69)
3 years ago
Frank Lee 35813ed3c4 update examples and sphinx docs for the new api (#63)
3 years ago
ver217 7d3711058f fix zero3 fp16 and add zero3 model context (#62)
3 years ago
Frank Lee 9a0466534c update markdown docs (english) (#60)
3 years ago
Frank Lee da01c234e1 Develop/experiments (#59)
3 years ago
ver217 dbe62c67b8 add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)
3 years ago
Frank Lee 3defa32aee Support TP-compatible Torch AMP and Update trainer API (#27)
3 years ago
ver217 3c7604ba30 update documentation
3 years ago
zbian 404ecbdcc6 Migrated project
3 years ago