HELSON | 943a96323e | [hotfix] fix no optimizer in save/load (#1363) | 2022-07-26 10:53:53 +08:00
Frank Lee | cd063ac37f | [fx] added activation checkpoint codegen support for torch < 1.12 (#1359) | 2022-07-25 23:35:31 +08:00
HELSON | 4417804129 | [unit test] add megatron init test in zero_optim (#1358) | 2022-07-25 11:18:08 +08:00
HELSON | 7a065dc9f6 | [hotfix] fix megatron_init in test_gpt2.py (#1357) | 2022-07-25 10:28:19 +08:00
Frank Lee | 644582eee9 | [fx] added activation checkpoint codegen (#1355) | 2022-07-25 09:39:10 +08:00
ver217 | 38fd8844c0 | [docker] add tensornvme in docker (#1354) | 2022-07-21 17:44:00 +08:00
    * add tensornvme in docker
    * fix dockerfile
    * fix dockerfile
ver217 | 6b43c789fd | fix zero optim backward_by_grad and save/load (#1353) | 2022-07-21 16:43:58 +08:00
ver217 | d068af81a3 | [doc] update rst and docstring (#1351) | 2022-07-21 15:54:53 +08:00
    * update rst
    * add zero docstr
    * fix docstr
    * remove fx.tracer.meta_patch
    * fix docstr
    * fix docstr
    * update fx rst
    * fix fx docstr
    * remove useless rst
Frank Lee | 274c1a3b5f | [fx] fixed apex normalization patch exception (#1352) | 2022-07-21 15:29:11 +08:00
ver217 | ce470ba37e | [checkpoint] sharded optim save/load grad scaler (#1350) | 2022-07-21 15:21:21 +08:00
Frank Lee | 05fae1fd56 | [fx] added activation checkpointing annotation (#1349) | 2022-07-21 11:14:28 +08:00
    * [fx] added activation checkpointing annotation
    * polish code
    * polish code
YuliangLiu0306 | 051592c64e | [fx] update MetaInforProp pass to process more complex node.meta (#1344) | 2022-07-21 10:57:52 +08:00
    * [CLI] add CLI launcher
    * Revert "[CLI] add CLI launcher"
      This reverts commit df7e6506d4.
    * [fx] update MetaInforProp pass to process more complex node.meta
HELSON | 7a8702c06d | [colotensor] add Tensor.view op and its unit test (#1343) | 2022-07-21 10:53:15 +08:00
    [colotensor] add megatron initialization for gpt2
github-actions[bot] | 6160a1d6a7 | Automated submodule synchronization (#1348) | 2022-07-21 10:50:27 +08:00
    Co-authored-by: github-actions <github-actions@github.com>
binmakeswell | 92b0b139eb | [NFC] add OPT (#1345) | 2022-07-20 15:02:07 +08:00
YuliangLiu0306 | 942c8cd1fb | [fx] refactor tracer to trace complete graph (#1342) | 2022-07-20 11:20:38 +08:00
    * [CLI] add CLI launcher
    * Revert "[CLI] add CLI launcher"
      This reverts commit df7e6506d4.
    * [fx] refactor tracer to trace complete graph
    * add comments and solve conflicts.
Frank Lee | 2cc1175c76 | [fx] tested the complete workflow for auto-parallel (#1336) | 2022-07-20 10:45:17 +08:00
    * [fx] tested the complete workflow for auto-parallel
    * polish code
    * polish code
    * polish code
YuliangLiu0306 | 4631fef8a0 | [fx]refactor tracer (#1335) | 2022-07-19 15:50:42 +08:00
HELSON | bf5066fba7 | [refactor] refactor ColoTensor's unit tests (#1340) | 2022-07-19 15:46:24 +08:00
HELSON | f92c100ddd | [checkpoint] use gather_tensor in checkpoint and update its unit test (#1339) | 2022-07-19 14:15:28 +08:00
Frank Lee | f3ce7b8336 | [fx] recovered skipped pipeline tests (#1338) | 2022-07-19 09:49:50 +08:00
ver217 | 0c51ff2c13 | [hotfix] ZeroDDP use new process group (#1333) | 2022-07-18 14:14:52 +08:00
    * process group supports getting ranks in group
    * chunk mgr receives a process group
    * update unit test
    * fix unit tests
Frank Lee | 11d1436a67 | [workflow] update docker build workflow to use proxy (#1334) | 2022-07-18 14:09:41 +08:00
Frank Lee | 75abc75c15 | [fx] fixed compatiblity issue with torch 1.10 (#1331) | 2022-07-18 11:41:27 +08:00
Frank Lee | 069d6fdc84 | [workflow] update 8-gpu test to use torch 1.11 (#1332) | 2022-07-18 11:41:13 +08:00
fastalgo | 7857fd7616 | Update README.md | 2022-07-16 19:00:59 -07:00
Frank Lee | 169954f87e | [test] removed outdated unit test for meta context (#1329) | 2022-07-15 23:16:23 +08:00
ver217 | 7a05367101 | [hotfix] shared model returns cpu state_dict (#1328) | 2022-07-15 22:11:37 +08:00
Frank Lee | b2475d8c5c | [fx] fixed unit tests for torch 1.12 (#1327) | 2022-07-15 18:22:15 +08:00
HELSON | d49708ae43 | [hotfix] fix ddp for unit test test_gpt2 (#1326) | 2022-07-15 18:19:52 +08:00
Frank Lee | 250be4d31e | [utils] integrated colotensor with lazy init context (#1324) | 2022-07-15 17:47:12 +08:00
    * [utils] integrated colotensor with lazy init context
    * polish code
    * polish code
    * polish code
Frank Lee | 659a740738 | [workflow] roll back to use torch 1.11 for unit testing (#1325) | 2022-07-15 17:20:17 +08:00
Frank Lee | 4d5dbf48a6 | [workflow] fixed trigger condition for 8-gpu unit test (#1323) | 2022-07-15 15:00:02 +08:00
YuliangLiu0306 | e8acf55e8b | [fx] add balanced policy v2 (#1251) | 2022-07-15 14:54:26 +08:00
    * [CLI] add CLI launcher
    * Revert "[CLI] add CLI launcher"
      This reverts commit df7e6506d4.
    * [fx] add balanced policy v2
    * add unittest
XYE | ca2d3f284f | [fx] Add unit test and fix bugs for transform_mlp_pass (#1299) | 2022-07-15 14:37:58 +08:00
    * add test and fix bugs
    * add functions back
    * add comments
HELSON | 1b41686461 | [hotfix] fix unit test test_module_spec (#1321) | 2022-07-15 14:02:32 +08:00
Jiarui Fang | 9e4c6449b0 | [checkpoint] add ColoOptimizer checkpointing (#1316) | 2022-07-15 09:52:55 +08:00
Frank Lee | 7c2634f4b3 | [workflow] updated release bdist workflow (#1318) | 2022-07-15 09:40:58 +08:00
    * [workflow] updated release bdist workflow
    * polish workflow
    * polish workflow
github-actions[bot] | 869cf3d3b8 | Automated submodule synchronization (#1319) | 2022-07-15 09:38:26 +08:00
    Co-authored-by: github-actions <github-actions@github.com>
Frank Lee | efdc240f1f | [workflow] disable SHM for compatibility CI on rtx3080 (#1315) | 2022-07-14 17:44:43 +08:00
ver217 | 7c70bfbefa | [hotfix] fix PipelineSharedModuleGradientHandler (#1314) | 2022-07-14 17:31:13 +08:00
Jiarui Fang | 85f933b58b | [Optimizer] Remove useless ColoOptimizer (#1312) | 2022-07-14 16:57:48 +08:00
Frank Lee | c9c37dcc4d | [workflow] updated pytorch compatibility test (#1311) | 2022-07-14 16:45:17 +08:00
Jiarui Fang | 9f10524313 | [Optimizer] polish the init method of ColoOptimizer (#1310) | 2022-07-14 16:37:33 +08:00
HELSON | 36086927e1 | [hotfix] fix ColoTensor GPT2 unitest (#1309) | 2022-07-14 16:37:20 +08:00
Jiarui Fang | 3ef3791a3b | [checkpoint] add test for bert and hotfix save bugs (#1297) | 2022-07-14 15:38:18 +08:00
Jiarui Fang | bd71e2a88b | [hotfix] add missing file (#1308) | 2022-07-14 14:43:15 +08:00
Frank Lee | 4f4d8c3656 | [fx] added apex normalization to patched modules (#1300) | 2022-07-14 14:24:13 +08:00
    * [fx] added apex normalization to patched modules
    * remove unused imports
Jiarui Fang | 4165eabb1e | [hotfix] remove potiential circle import (#1307) | 2022-07-14 13:44:26 +08:00
    * make it faster
    * [hotfix] remove circle import
github-actions[bot] | 6f2f9eb214 | Automated submodule synchronization (#1305) | 2022-07-14 13:40:54 +08:00
    Co-authored-by: github-actions <github-actions@github.com>