Commit Graph

1144 Commits (b1be5b88bdd37cd4c2338f0faf423abbbff36e57)

Author SHA1 Message Date
Jiarui Fang 30b4dd17c0
[FAW] export FAW in _ops (#1438) 2022-08-11 13:43:24 +08:00
HELSON 9056677b13
[zero] add chunk size searching algorithm for parameters in different groups (#1436) 2022-08-11 13:32:19 +08:00
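The chunk size search referenced in #1436 picks one chunk size that packs parameters from different groups with minimal padding. As an illustrative sketch only (not the repository's actual algorithm; all names are hypothetical), a search can score each candidate size by the padding it wastes per group and take the minimum:

```python
# Illustrative sketch of a chunk-size search: pick the candidate chunk
# size that wastes the least padding when packing each parameter group.
# NOT ColossalAI's actual algorithm; all names here are hypothetical.

def packing_waste(param_sizes, chunk_size):
    """Total padded-but-unused elements when packing params greedily."""
    waste, used = 0, 0
    for size in param_sizes:
        if size > chunk_size:
            # Oversized params span dedicated chunks; pad up to a multiple.
            waste += -size % chunk_size
        elif used + size > chunk_size:
            waste += chunk_size - used  # close current chunk, open a new one
            used = size
        else:
            used += size
    if used:
        waste += chunk_size - used  # padding left in the final open chunk
    return waste

def search_chunk_size(param_groups, candidates):
    """Return the candidate size minimizing total waste over all groups."""
    return min(candidates,
               key=lambda c: sum(packing_waste(g, c) for g in param_groups))

groups = [[1000, 800, 300], [4096, 512]]
best = search_chunk_size(groups, candidates=[1024, 2048, 4096])
```

The greedy packing here is deliberately simple; the real implementation must also respect per-group boundaries and data types.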
Jiarui Fang c9427a323f
hotfix #1434 (#1437) 2022-08-11 13:14:25 +08:00
HELSON 039b7ed3bc
[polish] add update directory in gemini; rename AgChunk to ChunkV2 (#1432) 2022-08-10 16:40:29 +08:00
Super Daniel f20cb4e893
[fx] modify the calculation of node_size in MetaInfoProp for activation checkpointing usages (#1425)
* [fx] modify the calculation of node_size in MetaInfoProp for activation checkpointing usages
2022-08-10 16:36:35 +08:00
Jiarui Fang 89c434a0a6
[polish] add test_ops directory (#1431) 2022-08-10 15:35:26 +08:00
Jiarui Fang 10b3df65c8
[FAW] move coloparam setting in test code. (#1429) 2022-08-10 14:31:53 +08:00
Jiarui Fang cb98cf5558
[FAW] parallel FreqAwareEmbedding (#1424) 2022-08-10 13:44:30 +08:00
HELSON 0d212183c4
[zero] add has_inf_or_nan in AgChunk; enhance the unit test of AgChunk (#1426) 2022-08-10 11:37:28 +08:00
YuliangLiu0306 33f0744d51
[tensor] add shape consistency feature to support auto spec transform (#1418)
* [tensor] add shape consistency feature to support auto sharding spec transform.

* [tensor] remove unused argument in simulator, add doc string for target pair.
2022-08-10 11:29:17 +08:00
HELSON 4fb3c52cf0
[zero] add unit test for AgChunk's append, close, access (#1423) 2022-08-09 18:03:10 +08:00
HELSON c577ed016e
[zero] add AgChunk (#1417) 2022-08-09 16:39:48 +08:00
Jiarui Fang d209aff684
Add FreqAwareEmbeddingBag (#1421) 2022-08-09 16:26:12 +08:00
ver217 6df3e19be9
[hotfix] zero optim prevents calling inner optim.zero_grad (#1422) 2022-08-09 16:08:12 +08:00
Jiarui Fang 504419d261
[FAW] add cache manager for the cached embedding (#1419) 2022-08-09 15:17:17 +08:00
Kirigaya Kazuto 44fd3c83ab
[communication] add p2p_v2.py to support communication with List[Any] (#1407)
* support p2p communication with any type of object | pass test

* reconstruct pipeline schedule with p2p_v2.py (support communication with List[Any]) | pass test

* [communication] add p2p_v2.py to support communication with List[Any]

* Delete _pipeline_schedule_v2.py

* Delete test_cifar_with_data_pipeline_tensor_v2.py

* [engine/schedule] use p2p_v2 to reconstruct pipeline_schedule

* [communication] remove print code
2022-08-09 11:40:04 +08:00
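Sending a `List[Any]` over a tensor-only p2p channel, as #1407 enables, generally means serializing the object list into a byte buffer and shipping a size header first so the receiver can pre-allocate. A minimal stdlib sketch of that pack/unpack step (hypothetical helper names; the actual p2p_v2.py wraps the bytes in a tensor and uses torch.distributed send/recv, which is not shown here):

```python
import pickle

# Sketch of packing a List[Any] into a flat byte buffer for a
# tensor-based p2p channel. Hypothetical helpers for illustration;
# the real p2p_v2 ships these bytes inside a torch tensor.

def pack_objects(objects):
    """Sender side: serialize objects, return (size_header, payload)."""
    payload = pickle.dumps(objects)
    return len(payload), payload

def unpack_objects(size, payload):
    """Receiver side: the size header lets it pre-allocate the buffer."""
    assert len(payload) == size
    return pickle.loads(payload)

size, buf = pack_objects([{"layer": 3}, "activations", (1, 2)])
restored = unpack_objects(size, buf)
```

The two-phase send (size first, payload second) is the standard way to move variable-length serialized objects over fixed-shape tensor transports.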
github-actions[bot] 1590f59908
Automated submodule synchronization (#1415)
Co-authored-by: github-actions <github-actions@github.com>
2022-08-09 10:07:04 +08:00
github-actions[bot] 9b442ecdc3
Automated submodule synchronization (#1404)
Co-authored-by: github-actions <github-actions@github.com>
2022-08-08 11:24:58 +08:00
YuliangLiu0306 7c96055c68
[tensor] build sharding spec to replace distspec in future. (#1405) 2022-08-08 11:15:57 +08:00
ver217 12b4887097
[hotfix] fix CPUAdam kernel nullptr (#1410) 2022-08-05 19:45:45 +08:00
github-actions[bot] 1e5eb0874c
Automated submodule synchronization (#1396)
Co-authored-by: github-actions <github-actions@github.com>
2022-08-03 09:18:45 +08:00
YuliangLiu0306 0442f940f0
[device] add DeviceMesh class to support logical device layout (#1394)
* [device] add DeviceMesh class to support logical device layout

* polish code

* add doc string
2022-08-02 19:23:48 +08:00
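A logical device mesh like the one #1394 introduces maps an N-D grid of logical coordinates onto a flat list of physical ranks, so parallel groups can be addressed by mesh row or column. A toy sketch of that mapping (hypothetical class, not ColossalAI's DeviceMesh API):

```python
# Toy 2-D device mesh: logical (row, col) coordinate -> physical rank.
# Hypothetical class for illustration; not ColossalAI's DeviceMesh API.

class ToyDeviceMesh:
    def __init__(self, physical_ranks, shape):
        rows, cols = shape
        assert len(physical_ranks) == rows * cols
        self.shape = shape
        self.ranks = physical_ranks

    def rank_at(self, row, col):
        """Row-major lookup of the physical rank at a logical coordinate."""
        return self.ranks[row * self.shape[1] + col]

    def row_group(self, row):
        """All ranks in one logical row (e.g. a tensor-parallel group)."""
        cols = self.shape[1]
        return self.ranks[row * cols:(row + 1) * cols]

# 8 physical devices laid out as a 2 x 4 logical mesh.
mesh = ToyDeviceMesh(physical_ranks=list(range(8)), shape=(2, 4))
```

Decoupling the logical layout from physical rank order is what lets the same model code target different cluster topologies.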
ver217 04c9a86af8
[zero] ZeroDDP supports controlling outputs' dtype (#1399) 2022-08-02 17:49:11 +08:00
HELSON 4e98e938ce
[zero] alleviate memory usage in ZeRODDP state_dict (#1398) 2022-08-02 15:49:13 +08:00
Jiarui Fang 4f5f8f77d1
update nvme on readme (#1397) 2022-08-02 11:39:37 +08:00
ver217 56b8863b87
[zero] chunk manager allows filtering ex-large params (#1393) 2022-08-02 10:40:27 +08:00
Frank Lee adf5054ff8
[fx] fixed torchaudio conformer tracing (#1392) 2022-08-01 16:08:28 +08:00
Frank Lee 7d6293927f
[fx] patched torch.max and data movement operator (#1391)
* [fx] patched torch.max and data movement operator

* polish code
2022-08-01 15:31:50 +08:00
fastalgo db89600cf2
Update README.md 2022-07-30 22:11:07 +08:00
Frank Lee 89e60d1505
[fx] fixed indentation error in checkpointing codegen (#1385) 2022-07-30 00:27:12 +08:00
HELSON c7221cb2d4
[hotfix] adapt ProcessGroup and Optimizer to ColoTensor (#1388) 2022-07-29 19:33:24 +08:00
Frank Lee ad678921db
[fx] patched torch.full for huggingface opt (#1386) 2022-07-29 17:56:28 +08:00
HELSON 527758b2ae
[hotfix] fix a running error in test_colo_checkpoint.py (#1387) 2022-07-29 15:58:06 +08:00
Jiarui Fang f792507ff3
[chunk] add PG check for tensor appending (#1383) 2022-07-29 13:27:05 +08:00
ver217 8dced41ad0
[zero] zero optim state_dict takes only_rank_0 (#1384)
* zero optim state_dict takes only_rank_0

* fix unit test
2022-07-29 13:22:50 +08:00
ver217 7d5d628e07
[DDP] test ddp state dict uses more strict threshold (#1382) 2022-07-28 17:29:04 +08:00
YuliangLiu0306 df54481473
[hotfix] fix some bugs during gpt2 testing (#1379) 2022-07-28 17:21:07 +08:00
ver217 828b9e5e0d
[hotfix] fix zero optim save/load state dict (#1381) 2022-07-28 17:19:39 +08:00
HELSON b6fd165f66
[checkpoint] add kwargs for load_state_dict (#1374) 2022-07-28 15:56:52 +08:00
github-actions[bot] 50dec605e1
Automated submodule synchronization (#1380)
Co-authored-by: github-actions <github-actions@github.com>
2022-07-28 11:12:52 +08:00
ver217 83328329dd
[hotfix] fix zero ddp buffer cast (#1376)
* fix zero ddp buffer cast

* fix zero ddp ignore params
2022-07-28 10:54:44 +08:00
ver217 5d5031e946
fix zero ddp state dict (#1378) 2022-07-28 09:31:42 +08:00
Frank Lee 0c1a16ea5b
[util] standard checkpoint function naming (#1377) 2022-07-28 09:29:30 +08:00
YuliangLiu0306 52bc2dc271
[fx] update split module pass and add customized policy (#1373)
* [CLI] add CLI launcher

* Revert "[CLI] add CLI launcher"

This reverts commit df7e6506d4.

* [fx] update split module pass and add customized policy
2022-07-27 13:40:54 +08:00
Super Daniel be229217ce
[fx] add torchaudio test (#1369)
* [fx] add torchaudio test

* [fx] add torchaudio test and test patches

* Delete ~

* [fx] add patches and patches test

* [fx] fix patches

* [fx] fix rnn patches

* [fx] merge upstream

* [fx] fix import errors
2022-07-27 11:03:14 +08:00
github-actions[bot] fb6f085907
Automated submodule synchronization (#1372)
Co-authored-by: github-actions <github-actions@github.com>
2022-07-27 09:25:03 +08:00
Boyuan Yao bb640ec728
[fx] Add colotracer compatibility test on torchrec (#1370) 2022-07-26 17:54:39 +08:00
ver217 c415240db6
[nvme] CPUAdam and HybridAdam support NVMe offload (#1360)
* impl nvme optimizer

* update cpu adam

* add unit test

* update hybrid adam

* update docstr

* add TODOs

* update CI

* fix CI

* fix CI path

* fix install tensornvme

* fix CI env variables

* test CI

* fix nvme optim __del__

* fix adam __del__

* fix nvme optim

* fix nvme optim import
2022-07-26 17:25:24 +08:00
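NVMe offload as in #1360 works by spilling optimizer states to fast disk between steps and reading them back on demand; the tensornvme dependency mentioned in the commit does this asynchronously. A deliberately simplified, synchronous sketch of the spill/fetch cycle using only stdlib file I/O (hypothetical names; no async prefetch or torch tensors):

```python
import array
import os
import tempfile

# Simplified, synchronous sketch of NVMe offload for optimizer state:
# spill a float32 buffer to disk after a step, read it back on demand.
# Real offload (tensornvme) is asynchronous; all names are hypothetical.

class DiskOffloadedState:
    def __init__(self, values):
        self._buf = array.array("f", values)  # in-memory optimizer state
        fd, self._path = tempfile.mkstemp(suffix=".bin")
        os.close(fd)

    def offload(self):
        """Write the state to disk and drop the in-memory copy."""
        with open(self._path, "wb") as f:
            self._buf.tofile(f)
        self._n, self._buf = len(self._buf), None

    def fetch(self):
        """Read the state back from disk before the next optimizer step."""
        buf = array.array("f")
        with open(self._path, "rb") as f:
            buf.fromfile(f, self._n)
        self._buf = buf
        return buf

state = DiskOffloadedState([0.5, 1.5, 2.5])
state.offload()          # CPU memory freed; state now lives on disk
restored = state.fetch()
```

Overlapping the disk reads with compute (prefetching the next parameter group's state) is what makes the real asynchronous version practical.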
HELSON 8463290642
[checkpoint] use args, kwargs in save_checkpoint, load_checkpoint (#1368) 2022-07-26 14:41:53 +08:00
github-actions[bot] c491c2a948
Automated submodule synchronization (#1364)
Co-authored-by: github-actions <github-actions@github.com>
2022-07-26 14:31:45 +08:00