Commit Graph

48 Commits (feature/inference-refactor)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Jianghai | ef4c14a5e2 | [Inference] Fix bug in ChatGLM2 Tensor Parallelism (#5014) | 1 year ago |
| littsk | 1a3315e336 | [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| Bin Jia | 86d22581e4 | [shardformer] Add overlap optional for HybridParallelPlugin (#4615) | 1 year ago |
| Bin Jia | e241b74f24 | [shardformer] Add overlap support for gpt2 (#4535) | 1 year ago |
| Bin Jia | c554b7f559 | [shardformer/fix overlap bug] fix overlap bug, add overlap as an option in shardco… (#4516) | 1 year ago |
| Baizhou Zhang | 44eab2b27f | [shardformer] support sharded checkpoint IO for models of HybridParallelPlugin (#4506) | 1 year ago |
| flybird11111 | 59e252ecdb | [shardformer] chatglm support sequence parallel (#4482) | 1 year ago |
| flybird11111 | a27e0bb494 | [shardformer] bert support sequence parallel. (#4455) | 1 year ago |
| Bin Jia | 7c8be77081 | [shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460) | 1 year ago |
| Bin Jia | 424629fea0 | [shardformer/sequence parallel] Cherry pick commit to new branch (#4450) | 1 year ago |
| ver217 | 5d4efdf58f | [shardformer] fix import | 1 year ago |
| ver217 | 73a4144b91 | [shardformer] fix embedding | 1 year ago |
| Hongxin Liu | 172f7fa3cf | [misc] resolve code factor issues (#4433) | 1 year ago |
| Baizhou Zhang | 7711bd524a | [shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395) | 1 year ago |
| Baizhou Zhang | ed4c448488 | [pipeline] rewrite t5 tests & support multi-tensor transmitting in pipeline (#4388) | 1 year ago |
| FoolPlayer | 726541afe2 | update some module with new api version | 1 year ago |
| FoolPlayer | 9ee4ebea83 | [shardformer] support whisper (#4212) | 1 year ago |
| FoolPlayer | dd2bf02679 | [shardformer] support SAM (#4231) | 1 year ago |
| Baizhou Zhang | 0ceec8f9a9 | [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) | 1 year ago |
| Hongxin Liu | d921ce8391 | [shardformer] support inplace sharding (#4251) | 1 year ago |
| Baizhou Zhang | 208ac8f2ba | [pipeline] Add Pipeline Forward for GPT2Model Shardformer (#4224) | 1 year ago |
| Hongxin Liu | 890774b2fb | [shardformer] support lazy init (#4202) | 1 year ago |
| digger yu | 2ac24040eb | fix some typo colossalai/shardformer (#4160) | 1 year ago |
| github-actions[bot] | c77b3b19be | [format] applied code formatting on changed files in pull request 4152 (#4157) | 1 year ago |
| Frank Lee | f3b6aaa6b7 | [shardformer] supported fused normalization (#4112) | 1 year ago |
| Frank Lee | b1c2901530 | [shardformer] supported bloom model (#4098) | 1 year ago |
| Kun Lin | 8af29ee47a | [shardformer] support vision transformer (#4096) | 1 year ago |
| Frank Lee | d33a44e8c3 | [shardformer] refactored layernorm (#4086) | 1 year ago |
| FoolPlayer | 92f6791095 | [shardformer] Add layernorm (#4072) | 1 year ago |
| Frank Lee | 70c58cfd4f | [shardformer] supported fused qkv checkpoint (#4073) | 1 year ago |
| FoolPlayer | 0803a61412 | [shardformer] add linearconv1d test (#4067) | 1 year ago |
| Frank Lee | 8eb09a4c69 | [shardformer] support module saving and loading (#4062) | 1 year ago |
| Frank Lee | f22ddacef0 | [shardformer] refactored the shardformer layer structure (#4053) | 1 year ago |
| Frank Lee | 58df720570 | [shardformer] adapted T5 and LLaMa test to use kit (#4049) | 1 year ago |
| FoolPlayer | 4021b9a8a2 | [shardformer] add gpt2 test and layer class refactor (#4041) | 1 year ago |
| Frank Lee | d857f3dbba | [shardformer] supported T5 and its variants (#4045) | 1 year ago |
| FoolPlayer | 507c0ad368 | add vocabembedding layer | 1 year ago |
| Frank Lee | 45d9384346 | [shardformer] removed inplace tensor sharding (#4018) | 1 year ago |
| Frank Lee | 3893fa1a8d | [shardformer] refactored embedding and dropout to parallel module (#4013) | 1 year ago |
| Frank Lee | 015af592f8 | [shardformer] integrated linear 1D with dtensor (#3996) | 1 year ago |
| wukong1992 | c1c672d0f0 | [shardformer] shardformer support t5 model (#3994) | 1 year ago |
| wukong1992 | 6b30dfb7ce | [shardformer] support llama model using shardformer (#3969) | 1 year ago |
| FoolPlayer | a73130482d | [shardformer] Unit test (#3928) | 1 year ago |
| FoolPlayer | f1cb5ac6bf | [shardformer] Align bert value (#3907) | 1 year ago |
| FoolPlayer | ab8a47f830 | [shardformer] add Dropout layer support different dropout pattern (#3856) | 1 year ago |
| Frank Lee | ddcf58cacf | Revert "[sync] sync feature/shardformer with develop" | 1 year ago |
| FoolPlayer | 21a3915c98 | [shardformer] add Dropout layer support different dropout pattern (#3856) | 1 year ago |