Commit Graph

1049 Commits (46931e3c32e8ccb6bddc46273653eca9d85152ac)

Author SHA1 Message Date
Jiarui Fang 7487215b95
[ColoTensor] add independent process group (#1179) 2022-06-29 10:03:09 +08:00
YuliangLiu0306 26ba87272d
[hotfix]fixed p2p process send stuck (#1181)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [hotfix]fixed p2p process send stuck
2022-06-28 14:41:11 +08:00
Jiarui Fang 1b657f9ce1
[tensor] revert local view back (#1178) 2022-06-27 18:38:34 +08:00
Jiarui Fang 0dd4e2bbfb
[Tensor] rename some APIs in TensorSpec and Polish view unittest (#1176) 2022-06-27 15:56:11 +08:00
Ziyue Jiang dd0420909f
[Tensor] rename parallel_action (#1174)
* rename parallel_action

* polish
2022-06-27 10:04:45 +08:00
YuliangLiu0306 e27645376d
[hotfix]different overflow status lead to communication stuck. (#1175)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [hotfix]fix some bugs caused by refactored schedule.

* [hotfix]different overflow status lead to communication stuck.
2022-06-27 09:53:57 +08:00
Jiarui Fang aa7bef73d4
[Tensor] distributed view supports inter-process hybrid parallel (#1169) 2022-06-27 09:45:26 +08:00
ver217 9e1daa63d2
[zero] sharded optim supports loading local state dict (#1170)
* sharded optim supports loading local state dict

* polish code

* add unit test
2022-06-24 18:05:16 +08:00
ver217 561e90493f
[zero] zero optim supports loading local state dict (#1171)
* zero optim supports loading local state dict

* polish code

* add unit test
2022-06-24 17:25:57 +08:00
Jiarui Fang 4b9bba8116
[ColoTensor] rename APIs and add output_replicate to ComputeSpec (#1168) 2022-06-24 13:08:54 +08:00
Jiarui Fang f4ef224358
[Tensor] remove ParallelAction, use ComputeSpec instead (#1166) 2022-06-23 17:34:59 +08:00
Jiarui Fang 177c374401
remove gather out in parallel action (#1163) 2022-06-23 16:35:05 +08:00
Frank Lee 51f1ec96b0
[workflow] polish readme and dockerfile (#1165)
* [workflow] polish readme and dockerfile

* polish
2022-06-23 15:12:15 +08:00
Frank Lee ca73028a3a
[workflow] auto-publish docker image upon release (#1164) 2022-06-23 14:51:59 +08:00
ver217 634eecb98e
mark sanity_check of dist_spec_mgr as staticmethod (#1161) 2022-06-23 11:35:25 +08:00
Ziyue Jiang 955ac912de
remove log (#1160) 2022-06-23 10:32:42 +08:00
ver217 4e67b2a890
fix chunk move device (#1158) 2022-06-22 18:07:10 +08:00
Jiarui Fang 07f9c781f9
[graph] improve the graph building. (#1157) 2022-06-22 16:47:20 +08:00
ver217 22717a856f
[tensor] add embedding bag op (#1156) 2022-06-22 15:54:03 +08:00
ver217 ae86151968
[tensor] add more element-wise ops (#1155)
* add more element-wise ops

* update test_op

* polish unit test
2022-06-22 15:16:47 +08:00
github-actions[bot] e8c34eedfd
Automated submodule synchronization (#1129)
Co-authored-by: github-actions <github-actions@github.com>
2022-06-22 14:39:08 +08:00
Frank Lee d415d73286
[workflow] fixed release post workflow (#1154) 2022-06-22 11:55:21 +08:00
ver217 54aabb8da4
[gemini] refactor gemini mgr (#1151)
* refactor gemini mgr

* update __init__
2022-06-22 11:54:36 +08:00
Frank Lee f8eec98ff5
[tensor] fixed non-serializable colo parameter during model checkpointing (#1153) 2022-06-22 11:43:38 +08:00
ver217 ffa025e120
[tensor] dist spec s2s uses all-to-all (#1136)
* dist spec s2s uses all-to-all

* update unit test

* add sanity check

* polish unit test with titans

* add sanity check for DistMgr

* add sanity check

Co-authored-by: jiaruifang <fangjiarui123@gmail.com>
2022-06-22 11:32:38 +08:00
Frank Lee c77da0dc81
[workflow] fixed format error in yaml file (#1145) 2022-06-22 11:31:24 +08:00
Jiarui Fang ff644ee5e4
polish unit test with titans (#1152) 2022-06-22 09:58:02 +08:00
YuliangLiu0306 f1f51990b9
[hotfix]fix some bugs caused by refactored schedule. (#1148)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [hotfix]fix some bugs caused by refactored schedule.
2022-06-21 22:46:30 +08:00
Jiarui Fang 8cdce0399c
[ColoTensor] improves init functions. (#1150) 2022-06-21 18:28:38 +08:00
ver217 8106d7b8c7
[ddp] refactor ColoDDP and ZeroDDP (#1146)
* ColoDDP supports overwriting default process group

* rename ColoDDPV2 to ZeroDDP

* add docstr for ZeroDDP

* polish docstr
2022-06-21 16:35:23 +08:00
Frank Lee 0e4e62d30d
[tensor] added __repr__ to spec (#1147) 2022-06-21 15:38:05 +08:00
YuliangLiu0306 70dd88e2ee
[pipeline]add customized policy (#1139)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [pipeline]add customized policy
2022-06-21 15:23:41 +08:00
Frank Lee d1918304bb
[workflow] added workflow to auto draft the release post (#1144) 2022-06-21 14:43:25 +08:00
YuliangLiu0306 18091581c0
[pipeline]support more flexible pipeline (#1138)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [pipeline]support more flexible pipeline
2022-06-21 14:40:50 +08:00
ver217 ccf3c58c89
embedding op use gather_out (#1143) 2022-06-21 13:21:20 +08:00
Frank Lee e61dc31b05
[ci] added scripts to auto-generate release post text (#1142)
* [ci] added scripts to auto-generate release post text

* polish code
2022-06-21 12:22:53 +08:00
ver217 6690a61b4d
[hotfix] prevent nested ZeRO (#1140) 2022-06-21 11:33:53 +08:00
Frank Lee 15aab1476e
[zero] avoid zero hook spam by changing log to debug level (#1137) 2022-06-21 10:44:01 +08:00
Frank Lee 73ad05fc8c
[zero] added error message to handle on-the-fly import of torch Module class (#1135)
* [zero] added error message to handle on-the-fly import of torch Module class

* polish code
2022-06-20 11:24:27 +08:00
ver217 e4f555f29a
[optim] refactor fused sgd (#1134) 2022-06-20 11:19:38 +08:00
ver217 d26902645e
[ddp] add save/load state dict for ColoDDP (#1127)
* add save/load state dict for ColoDDP

* add unit test

* refactor unit test folder

* polish unit test

* rename unit test
2022-06-20 10:51:47 +08:00
YuliangLiu0306 946dbd629d
[hotfix]fix bugs caused by refactored pipeline (#1133)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [hotfix]fix bugs caused by refactored pipeline
2022-06-17 17:54:15 +08:00
ver217 789cad301b
[hotfix] fix param op hook (#1131)
* fix param op hook

* update zero tp test

* fix bugs
2022-06-17 16:12:05 +08:00
ver217 a1a7899cae
[hotfix] fix zero init ctx numel (#1128) 2022-06-16 17:17:27 +08:00
ver217 f0a954f16d
[ddp] add set_params_to_ignore for ColoDDP (#1122)
* add set_params_to_ignore for ColoDDP

* polish code

* fix zero hook v2

* add unit test

* polish docstr
2022-06-16 12:54:46 +08:00
YuliangLiu0306 3175bcb4d8
[pipeline]support List of Dict data (#1125)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [pipeline]support List of Dict data

* polish
2022-06-16 11:19:48 +08:00
Frank Lee 91a5999825
[ddp] supported customized torch ddp configuration (#1123) 2022-06-15 18:11:53 +08:00
YuliangLiu0306 fcf55777dd
[fx]add autoparallel passes (#1121)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* feature/add autoparallel passes
2022-06-15 16:36:46 +08:00
ver217 e127b4375b
cast colo ddp v2 inputs/outputs (#1120) 2022-06-15 15:57:04 +08:00
Frank Lee 16302a5359
[fx] added unit test for coloproxy (#1119)
* [fx] added unit test for coloproxy

* polish code

* polish code
2022-06-15 15:27:51 +08:00