Commit Graph

265 Commits (b3348221633fa811650052111fccc3ed59d6be45)

Author | SHA1 | Message | Date
Jiarui Fang | b334822163 | [zero] polish sharded param name (#484) | 3 years ago
ver217 | 9caa8b6481 | docs get correct release version (#489) | 3 years ago
HELSON | d7ea63992b | [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480) | 3 years ago
github-actions[bot] | 353566c198 | Automated submodule synchronization (#483) | 3 years ago
Jiarui Fang | 65c0f380c2 | [format] polish name format for MOE (#481) | 3 years ago
ver217 | 8d3250d74b | [zero] ZeRO supports pipeline parallel (#477) | 3 years ago
Sze-qq | 7f5e4592eb | Update Experiment result about Colossal-AI with ZeRO (#479) | 3 years ago
Frank Lee | 83a847d058 | [test] added rerun on exception for testing (#475) | 3 years ago
ver217 | d70f43dd7a | embedding remove attn mask (#474) | 3 years ago
HELSON | 7544347145 | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 3 years ago
ver217 | 1559c0df41 | fix attn mask shape of gpt (#472) | 3 years ago
ver217 | 3cb3fc275e | zero init ctx receives a dp process group (#471) | 3 years ago
ver217 | 7e30068a22 | [doc] update rst (#470) | 3 years ago
HELSON | aff9d354f7 | [MOE] polish moe_env (#467) | 3 years ago
HELSON | bccbc15861 | [MOE] changed parallelmode to dist process group (#460) | 3 years ago
Frank Lee | 8f9617c313 | [release] update version (#465) | 3 years ago
Frank Lee | 2963565ff8 | [test] fixed release workflow step (#464) | 3 years ago
Frank Lee | 292590e0fa | [test] fixed release workflow condition (#463) | 3 years ago
Frank Lee | 90bd97b9c0 | [devops] fixed workflow bug (#462) | 3 years ago
ver217 | 304263c2ce | fix gpt attention mask (#461) | 3 years ago
ver217 | fc8e6db005 | [doc] Update docstring for ZeRO (#459) | 3 years ago
HELSON | 84fd7c1d4d | add moe context, moe utilities and refactor gradient handler (#455) | 3 years ago
Frank Lee | af185b5519 | [test] fixed amp convergence comparison test (#454) | 3 years ago
ver217 | a241f61b34 | [zero] Update initialize for ZeRO (#458) | 3 years ago
ver217 | 642846d6f9 | update sharded optim and fix zero init ctx (#457) | 3 years ago
Jiarui Fang | e2e9f82588 | Revert "[zero] update sharded optim and fix zero init ctx" (#456) | 3 years ago
ver217 | 8cf7ff08cf | polish code | 3 years ago
ver217 | e99af94ab8 | rename variables | 3 years ago
ver217 | 46add4a5c5 | remove surplus imports | 3 years ago
ver217 | 57567ee768 | update sharded optim and fix zero init ctx | 3 years ago
Frank Lee | f27d801a13 | [test] optimized zero data parallel test (#452) | 3 years ago
github-actions[bot] | cfcc8271f3 | [Bot] Automated submodule synchronization (#451) | 3 years ago
Frank Lee | ac4513c56e | [DevOps] remove unneeded dependency in build workflow (#449) | 3 years ago
Jiarui Fang | 0fcfb1e00d | [test] make zero engine test really work (#447) | 3 years ago
Frank Lee | bb2790cf0b | optimize engine and trainer test (#448) | 3 years ago
Jiarui Fang | 237d08e7ee | [zero] hybrid cpu adam (#445) | 3 years ago
Frank Lee | b72b8445c6 | optimized context test time consumption (#446) | 3 years ago
Jiarui Fang | 496cbb0760 | [hotfix] fix initialize bug with zero (#442) | 3 years ago
Frank Lee | 725a39f4bd | update github CI with the current workflow (#441) | 3 years ago
Frank Lee | 5a1e33b97f | update contributing.md with the current workflow (#440) | 3 years ago
Jiarui Fang | 17b8274f8a | [unitest] polish zero config in unittest (#438) | 3 years ago
Jiarui Fang | 640a6cd304 | [refactory] refactory the initialize method for new zero design (#431) | 3 years ago
Frank Lee | 4f85b687cf | [misc] replace codebeat with codefactor on readme (#436) | 3 years ago
Frank Lee | bffd85bf34 | added testing module (#435) | 3 years ago
HELSON | dbdc9a7783 | added Multiply Jitter and capacity factor eval for MOE (#434) | 3 years ago
Frank Lee | b03b3ae99c | fixed mem monitor device (#433) | 3 years ago
Frank Lee | 14a7094243 | fixed fp16 optimizer none grad bug (#432) | 3 years ago
ver217 | fce9432f08 | sync before creating empty grad | 3 years ago
ver217 | ea6905a898 | free param.grad | 3 years ago
ver217 | 9506a8beb2 | use double buffer to handle grad | 3 years ago