985 Commits (refactor/inference)

Author SHA1 Message Date
Cuiqing Li bce0f16702 [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) 1 year ago
flybird11111 eedaa3e1ef [shardformer]fix gpt2 double head (#4663) 1 year ago
Hongxin Liu 554aa9592e [legacy] move communication and nn to legacy and refactor logger (#4671) 1 year ago
flybird11111 7486ed7d3a [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645) 1 year ago
Baizhou Zhang 660eed9124 [pipeline] set optimizer to optional in execute_pipeline (#4630) 1 year ago
Hongxin Liu 8accecd55b [legacy] move engine to legacy (#4560) 1 year ago
Hongxin Liu 89fe027787 [legacy] move trainer to legacy (#4545) 1 year ago
Hongxin Liu bd18678478 [test] fix gemini checkpoint and gpt test (#4620) 1 year ago
Hongxin Liu 807e01a4ba [zero] hotfix master param sync (#4618) 1 year ago
Hongxin Liu e71d245293 [test] ignore gpt2 shardformer test (#4619) 1 year ago
Baizhou Zhang e79b1e80e2 [checkpointio] support huggingface from_pretrained for all plugins (#4606) 1 year ago
Jianghai 24c0768795 [shardformer] Pytree fix (#4533) 1 year ago
Hongxin Liu 508ca36fe3 [pipeline] 1f1b schedule receive microbatch size (#4589) 1 year ago
LuGY cbac782254 [zero]fix zero ckptIO with offload (#4529) 1 year ago
Baizhou Zhang 38ccb8b1a3 [shardformer] support from_pretrained when loading model with HybridParallelPlugin (#4575) 1 year ago
Baizhou Zhang c9625dbb63 [shardformer] support sharded optimizer checkpointIO of HybridParallelPlugin (#4540) 1 year ago
Baizhou Zhang 2c787d7f47 [shardformer] fix submodule replacement bug when enabling pp (#4544) 1 year ago
flybird11111 ec18fc7340 [shardformer] support pp+tp+zero1 tests (#4531) 1 year ago
flybird11111 d367b88785 [shardformer] fix opt test hanging (#4521) 1 year ago
Bin Jia e241b74f24 [shardformer] Add overlap support for gpt2 (#4535) 1 year ago
Baizhou Zhang 0387a47e63 [shardformer] fix emerged bugs after updating transformers (#4526) 1 year ago
Bin Jia c554b7f559 [shardformer/fix overlap bug] fix overlap bug, add overlap as an option in shardco… (#4516) 1 year ago
Jianghai 376533a564 [shardformer] zero1+pp and the corresponding tests (#4517) 1 year ago
Baizhou Zhang 44eab2b27f [shardformer] support sharded checkpoint IO for models of HybridParallelPlugin (#4506) 1 year ago
flybird11111 de8a65babc [shardformer] opt fix. (#4514) 1 year ago
LuGY 839847b7d7 [zero]support zero2 with gradient accumulation (#4511) 1 year ago
flybird11111 3353e55c80 [shardformer] vit/llama/t5 ignore the sequence parallelism flag and some fix. (#4498) 1 year ago
Hongxin Liu 27061426f7 [gemini] improve compatibility and add static placement policy (#4479) 1 year ago
Jianghai e04436a82a [shardformer] tests for 3d parallel (#4493) 1 year ago
flybird11111 59e252ecdb [shardformer] chatglm support sequence parallel (#4482) 1 year ago
Jianghai 5545114fd8 rename chatglm to chatglm2 (#4484) 1 year ago
Baizhou Zhang 1c7df566e2 [shardformer] support tp+zero for shardformer (#4472) 1 year ago
Jianghai 8739aa7fa0 [shardformer] Pipeline/whisper (#4456) 1 year ago
Bin Jia 7c8be77081 [shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460) 1 year ago
LuGY a78daf6180 [shardformer] support interleaved pipeline (#4448) 1 year ago
Hongxin Liu 26e29d58f0 [devops] add large-scale distributed test marker (#4452) 1 year ago
Baizhou Zhang 6ef33f75aa [shardformer] support DDP in HybridPlugin/add tp+dp tests (#4446) 1 year ago
Bin Jia 424629fea0 [shardformer/sequence parallel] Cherry pick commit to new branch (#4450) 1 year ago
github-actions[bot] d20dceb9a3 [format] applied code formatting on changed files in pull request 4441 (#4445) 1 year ago
Hongxin Liu 172f7fa3cf [misc] resolve code factor issues (#4433) 1 year ago
flybird11111 328a791d10 [shardformer] update bloom/llama/vit/chatglm tests (#4420) 1 year ago
flybird11111 108e54a0b4 [shardformer]update t5 tests for using all optimizations. (#4407) 1 year ago
flybird11111 1edc9b5fb3 [shardformer] update tests for all optimization (#4413) 1 year ago
Baizhou Zhang 7711bd524a [shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395) 1 year ago
flybird11111 21e0a42fd1 [shardformer]fix, test gpt2 for AMP+TP (#4403) 1 year ago
Jianghai 7596e9ae08 [pipeline] rewrite bert tests and fix some bugs (#4409) 1 year ago
flybird1111 d2cd48e0be [shardformer] test all optimizations (#4399) 1 year ago
flybird1111 7a3dfd0c64 [shardformer] update shardformer to use flash attention 2 (#4392) 1 year ago
Baizhou Zhang ed4c448488 [pipeline] rewrite t5 tests & support multi-tensor transmitting in pipeline (#4388) 1 year ago
flybird1111 906426cb44 [Shardformer] Merge flash attention branch to pipeline branch (#4362) 1 year ago