Commit Graph

2676 Commits (839847b7d78bce6af5dfe58d27b5ce2c74a3619b)

Author SHA1 Message Date
Jiarui Fang 0035b7be07
[memory] add model data tensor moving api (#503) 2022-03-24 14:29:41 +08:00
Frank Lee 65ad47c35c
[devops] remove tsinghua source for pip (#507) 2022-03-24 14:12:02 +08:00
Frank Lee 44f7bcb277
[devops] remove tsinghua source for pip (#505) 2022-03-24 14:03:05 +08:00
binmakeswell af56c1d024
fix discussion button in issue template (#504) 2022-03-24 12:25:00 +08:00
Jiarui Fang a445e118cf
[polish] polish singleton and global context (#500) 2022-03-23 18:03:39 +08:00
ver217 9ec1ce6ab1
[zero] sharded model support the reuse of fp16 shard (#495) 2022-03-23 14:59:59 +08:00
* sharded model supports reuse fp16 shard
* rename variable
* polish code
* polish code
* polish code
HELSON f24b5ed201
[MOE] remove old MoE legacy (#493) 2022-03-22 17:37:16 +08:00
ver217 c4c02424f3
[zero] sharded model manages ophooks individually (#492) 2022-03-22 17:33:20 +08:00
HELSON c9023d4078
[MOE] support PR-MOE (#488) 2022-03-22 16:48:22 +08:00
ver217 a9ecb4b244
[zero] polish sharded optimizer v2 (#490) 2022-03-22 15:53:48 +08:00
ver217 62b0a8d644
[zero] sharded optim support hybrid cpu adam (#486) 2022-03-22 14:56:59 +08:00
* sharded optim support hybrid cpu adam
* update unit test
* polish docstring
Jiarui Fang b334822163
[zero] polish sharded param name (#484) 2022-03-22 14:36:16 +08:00
* [zero] polish sharded param name
* polish code
* polish
* polish code
* polish
* polsih
* polish
ver217 9caa8b6481
docs get correct release version (#489) 2022-03-22 14:24:41 +08:00
HELSON d7ea63992b
[MOE] add FP32LinearGate for MOE in NaiveAMP context (#480) 2022-03-22 10:50:20 +08:00
github-actions[bot] 353566c198
Automated submodule synchronization (#483) 2022-03-22 09:34:26 +08:00
Co-authored-by: github-actions <github-actions@github.com>
Jiarui Fang 65c0f380c2
[format] polish name format for MOE (#481) 2022-03-21 23:19:47 +08:00
ver217 8d3250d74b
[zero] ZeRO supports pipeline parallel (#477) 2022-03-21 16:55:37 +08:00
Sze-qq 7f5e4592eb
Update Experiment result about Colossal-AI with ZeRO (#479) 2022-03-21 16:34:07 +08:00
* [readme] add experimental visualisation regarding ColossalAI with ZeRO (#476)
* Hotfix/readme (#478)
* add experimental visualisation regarding ColossalAI with ZeRO
* adjust newly-added figure size
Frank Lee 83a847d058
[test] added rerun on exception for testing (#475) 2022-03-21 15:51:57 +08:00
* [test] added rerun on exception function
* polish code
ver217 d70f43dd7a
embedding remove attn mask (#474) 2022-03-21 14:53:23 +08:00
HELSON 7544347145
[MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) 2022-03-21 13:35:04 +08:00
ver217 1559c0df41
fix attn mask shape of gpt (#472) 2022-03-21 12:01:31 +08:00
ver217 3cb3fc275e
zero init ctx receives a dp process group (#471) 2022-03-21 11:18:55 +08:00
ver217 7e30068a22
[doc] update rst (#470) 2022-03-21 10:52:45 +08:00
* update rst
* remove empty rst
HELSON aff9d354f7
[MOE] polish moe_env (#467) 2022-03-19 15:36:25 +08:00
HELSON bccbc15861
[MOE] changed parallelmode to dist process group (#460) 2022-03-19 13:46:29 +08:00
Frank Lee 8f9617c313
[release] update version (#465) 2022-03-18 19:26:07 +08:00
Frank Lee 2963565ff8
[test] fixed release workflow step (#464) 2022-03-18 19:17:13 +08:00
Frank Lee 292590e0fa
[test] fixed release workflow condition (#463) 2022-03-18 17:42:33 +08:00
Frank Lee 90bd97b9c0
[devops] fixed workflow bug (#462) 2022-03-18 17:26:24 +08:00
ver217 304263c2ce
fix gpt attention mask (#461) 2022-03-18 17:24:19 +08:00
ver217 fc8e6db005
[doc] Update docstring for ZeRO (#459) 2022-03-18 16:48:20 +08:00
* polish sharded model docstr
* polish sharded optim docstr
* polish zero docstr
* polish shard strategy docstr
HELSON 84fd7c1d4d
add moe context, moe utilities and refactor gradient handler (#455) 2022-03-18 16:38:32 +08:00
Frank Lee af185b5519
[test] fixed amp convergence comparison test (#454) 2022-03-18 16:28:16 +08:00
ver217 a241f61b34
[zero] Update initialize for ZeRO (#458) 2022-03-18 16:18:31 +08:00
* polish code
* shard strategy receive pg in shard() / gather()
* update zero engine
* polish code
ver217 642846d6f9
update sharded optim and fix zero init ctx (#457) 2022-03-18 15:44:47 +08:00
Jiarui Fang e2e9f82588
Revert "[zero] update sharded optim and fix zero init ctx" (#456) 2022-03-18 15:22:43 +08:00
* Revert "polish code" (reverts commit 8cf7ff08cf)
* Revert "rename variables" (reverts commit e99af94ab8)
* Revert "remove surplus imports" (reverts commit 46add4a5c5)
* Revert "update sharded optim and fix zero init ctx" (reverts commit 57567ee768)
ver217 8cf7ff08cf
polish code 2022-03-18 14:25:25 +08:00
ver217 e99af94ab8
rename variables 2022-03-18 14:25:25 +08:00
ver217 46add4a5c5
remove surplus imports 2022-03-18 14:25:25 +08:00
ver217 57567ee768
update sharded optim and fix zero init ctx 2022-03-18 14:25:25 +08:00
Frank Lee f27d801a13
[test] optimized zero data parallel test (#452) 2022-03-18 11:35:54 +08:00
github-actions[bot] cfcc8271f3
[Bot] Automated submodule synchronization (#451) 2022-03-18 09:51:43 +08:00
Co-authored-by: github-actions <github-actions@github.com>
Frank Lee ac4513c56e
[DevOps] remove unneeded dependency in build workflow (#449) 2022-03-17 17:29:02 +08:00
Jiarui Fang 0fcfb1e00d
[test] make zero engine test really work (#447) 2022-03-17 17:24:25 +08:00
Frank Lee bb2790cf0b
optimize engine and trainer test (#448) 2022-03-17 15:44:17 +08:00
Jiarui Fang 237d08e7ee
[zero] hybrid cpu adam (#445) 2022-03-17 15:05:41 +08:00
Frank Lee b72b8445c6
optimized context test time consumption (#446) 2022-03-17 14:40:52 +08:00
Jiarui Fang 496cbb0760
[hotfix] fix initialize bug with zero (#442) 2022-03-17 13:16:22 +08:00
Frank Lee 725a39f4bd
update github CI with the current workflow (#441) 2022-03-17 10:38:04 +08:00