Commit Graph

813 Commits (fee32a3b785327d63ff9cdaea4451b4cfe071d2a)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| github-actions[bot] | c77b3b19be | [format] applied code formatting on changed files in pull request 4152 (#4157) | 1 year ago |
| Frank Lee | 1fb0d95df0 | [shardformer] made tensor parallelism configurable (#4144) | 1 year ago |
| Frank Lee | 74257cb446 | [shardformer] refactored some doc and api (#4137) | 1 year ago |
| Frank Lee | ae035d305d | [shardformer] added embedding gradient check (#4124) | 1 year ago |
| Frank Lee | 6a88bae4ec | [shardformer] integrate with data parallelism (#4103) | 1 year ago |
| Frank Lee | f3b6aaa6b7 | [shardformer] supported fused normalization (#4112) | 1 year ago |
| Frank Lee | b1c2901530 | [shardformer] supported bloom model (#4098) | 1 year ago |
| Kun Lin | 8af29ee47a | [shardformer] support vision transformer (#4096) | 1 year ago |
| jiangmingyan | ac80937138 | [shardformer] shardformer support opt models (#4091) | 1 year ago |
| Frank Lee | d33a44e8c3 | [shardformer] refactored layernorm (#4086) | 1 year ago |
| Frank Lee | c4b1b65931 | [test] fixed tests failed due to dtensor change (#4082) | 1 year ago |
| FoolPlayer | 92f6791095 | [shardformer] Add layernorm (#4072) | 1 year ago |
| Frank Lee | 70c58cfd4f | [shardformer] supported fused qkv checkpoint (#4073) | 1 year ago |
| FoolPlayer | 0803a61412 | [shardformer] add linearconv1d test (#4067) | 1 year ago |
| Frank Lee | 8eb09a4c69 | [shardformer] support module saving and loading (#4062) | 1 year ago |
| FoolPlayer | 7740c55c55 | support kit use for bert/gpt test (#4055) | 1 year ago |
| Frank Lee | f22ddacef0 | [shardformer] refactored the shardformer layer structure (#4053) | 1 year ago |
| Frank Lee | 58df720570 | [shardformer] adapted T5 and LLaMa test to use kit (#4049) | 1 year ago |
| FoolPlayer | 4021b9a8a2 | [shardformer] add gpt2 test and layer class refactor (#4041) | 1 year ago |
| Frank Lee | d857f3dbba | [shardformer] supported T5 and its variants (#4045) | 1 year ago |
| Frank Lee | c1d5453e9f | [shardformer] adapted llama to the new API (#4036) | 1 year ago |
| FoolPlayer | 74d176c8d8 | [shardformer] fix bert and gpt downstream with new api (#4024) | 1 year ago |
| FoolPlayer | 507c0ad368 | add vocabembedding layer | 1 year ago |
| Frank Lee | 3893fa1a8d | [shardformer] refactored embedding and dropout to parallel module (#4013) | 1 year ago |
| FoolPlayer | dfca9678fa | integrate with dist layer (#4011) | 1 year ago |
| Frank Lee | 015af592f8 | [shardformer] integrated linear 1D with dtensor (#3996) | 1 year ago |
| Frank Lee | 611971248c | [device] support init device mesh from process group (#3990) | 1 year ago |
| FoolPlayer | f7774ec0f3 | [Shardformer] Downstream bert (#3979) | 1 year ago |
| wukong1992 | c1c672d0f0 | [shardformer] shardformer support t5 model (#3994) | 1 year ago |
| wukong1992 | 6b30dfb7ce | [shardformer] support llama model using shardformer (#3969) | 1 year ago |
| FoolPlayer | a73130482d | [shardformer] Unit test (#3928) | 1 year ago |
| FoolPlayer | f1cb5ac6bf | [shardformer] Align bert value (#3907) | 1 year ago |
| Baizhou Zhang | 0bb0b481b4 | [gemini] fix argument naming during chunk configuration searching | 1 year ago |
| github-actions[bot] | a52f62082d | [format] applied code formatting on changed files in pull request 4021 (#4022) | 1 year ago |
| Frank Lee | a5883aa790 | [test] fixed codefactor format report (#4026) | 1 year ago |
| Baizhou Zhang | 822c3d4d66 | [checkpointio] sharded optimizer checkpoint for DDP plugin (#4002) | 1 year ago |
| Wenhao Chen | 725af3eeeb | [booster] make optimizer argument optional for boost (#3993) | 1 year ago |
| Baizhou Zhang | c9cff7e7fa | [checkpointio] General Checkpointing of Sharded Optimizers (#3984) | 1 year ago |
| digger yu | e61ffc77c6 | fix typo tests/ (#3936) | 1 year ago |
| Frank Lee | ddcf58cacf | Revert "[sync] sync feature/shardformer with develop" | 1 year ago |
| Frank Lee | eb39154d40 | [dtensor] updated api and doc (#3845) | 1 year ago |
| Hongxin Liu | ae02d4e4f7 | [bf16] add bf16 support (#3882) | 2 years ago |
| Hongxin Liu | dbb32692d2 | [lazy] refactor lazy init (#3891) | 2 years ago |
| wukong1992 | 6b305a99d6 | [booster] torch fsdp fix ckpt (#3788) | 2 years ago |
| Frank Lee | 615e2e5fc1 | [test] fixed lazy init test import error (#3799) | 2 years ago |
| Hongxin Liu | 3c07a2846e | [plugin] a workaround for zero plugins' optimizer checkpoint (#3780) | 2 years ago |
| Hongxin Liu | 5452df63c5 | [plugin] torch ddp plugin supports sharded model checkpoint (#3775) | 2 years ago |
| wukong1992 | 6050f37776 | [booster] removed models that don't support fsdp (#3744) | 2 years ago |
| Hongxin Liu | afb239bbf8 | [devops] update torch version of CI (#3725) | 2 years ago |
| wukong1992 | b37797ed3d | [booster] support torch fsdp plugin in booster (#3697) | 2 years ago |