Commit Graph

2161 Commits (162251ab7844e4116a36d6e0fec2ac7ccd03f74d)

Author              SHA1        Message (Date)

Tong Li             ceb1e262e7  fix sync condition (#6000)  (3 months ago)
Hongxin Liu         0978080a69  [fp8] refactor fp8 linear with compile (#5993)  (4 months ago)
Wang Binluo         b2483c8e31  [fp8] support hybrid parallel plugin (#5982)  (4 months ago)
flybird11111        f1a3a326c4  [fp8]Moe support fp8 communication (#5977)  (4 months ago)
Edenzzzz            b4d2377d4c  [Hotfix] Avoid fused RMSnorm import error without apex (#5985)  (4 months ago)
botbw               e4aadeee20  [fp8] use torch compile (torch >= 2.3.0) (#5979)  (4 months ago)
Hongxin Liu         8241c0c054  [fp8] support gemini plugin (#5978)  (4 months ago)
Hanks               b480eec738  [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)  (4 months ago)
flybird11111        7739629b9d  fix (#5976)  (4 months ago)
Hongxin Liu         ccabcf6485  [fp8] support fp8 amp for hybrid parallel plugin (#5975)  (4 months ago)
Hongxin Liu         76ea16466f  [fp8] add fp8 linear (#5967)  (4 months ago)
flybird11111        afb26de873  [fp8]support all2all fp8 (#5953)  (4 months ago)
flybird11111        0c10afd372  [FP8] rebase main (#5963)  (4 months ago)
Guangyao Zhang      53cb9606bd  [Feature] llama shardformer fp8 support (#5938)  (4 months ago)
ver217              ae486ce005  [fp8] add fp8 comm for low level zero  (4 months ago)
Wang Binluo         75c963686f  [lora] lora support hybrid parallel plugin (#5956)  (4 months ago)
botbw               62cdac6b7b  [chore] remove redundant test case, print string & reduce test tokens  (4 months ago)
botbw               d1d1ab871e  [moe] solve dp axis issue  (4 months ago)
botbw               65daa87627  [doc] add MoeHybridParallelPlugin docstring  (4 months ago)
hxwang              7bedd03739  [moe] remove force_overlap_comm flag and add warning instead  (4 months ago)
hxwang              f7c5485ed6  [chore] docstring  (4 months ago)
haze188             7e737df5ad  [misc] remove useless condition  (4 months ago)
haze188             70793ce9ed  [misc] fix ci failure: change default value to false in moe plugin  (4 months ago)
hxwang              606b0891ed  [chore] change moe_pg_mesh to private  (4 months ago)
hxwang              5b4c12381b  Revert "[moe] implement submesh initialization"  (4 months ago)
hxwang              cb01c0d5ce  [moe] refactor mesh assignment  (4 months ago)
haze188             034020bd04  [misc] remove debug/print code  (4 months ago)
hxwang              c3dc9b4dba  [deepseek] replace attn (a workaround for bug in transformers)  (4 months ago)
hxwang              6c39f0b144  [test] add check  (4 months ago)
haze188             b2952a5982  [moe] deepseek moe sp support  (4 months ago)
botbw               96d0fbc531  [bug] fix: somehow logger hangs the program  (4 months ago)
hxwang              067e18f7e9  [test] fix test: test_zero1_2  (4 months ago)
hxwang              74b03de3f9  [moe] remove ops  (4 months ago)
hxwang              70c9924d0d  [chore] solve moe ckpt test failure and some other arg pass failure  (4 months ago)
hxwang              46037c2ccd  [chore] minor fix after rebase  (4 months ago)
hxwang              803878b2fd  [moe] full test for deepseek and mixtral (pp + sp to fix)  (4 months ago)
hxwang              7077d38d5a  [moe] finalize test (no pp)  (4 months ago)
haze188             2cddeac717  moe sp + ep bug fix  (4 months ago)
hxwang              877d94bb8c  [moe] init moe plugin comm setting with sp  (4 months ago)
hxwang              09d6280d3e  [chore] minor fix  (4 months ago)
Haze188             404b16faf3  [Feature] MoE Ulysses Support (#5918)  (4 months ago)
hxwang              3e2b6132b7  [moe] clean legacy code  (4 months ago)
hxwang              74eccac0db  [moe] test deepseek  (4 months ago)
botbw               dc583aa576  [moe] implement tp  (4 months ago)
hxwang              102b784a10  [chore] arg pass & remove drop token  (4 months ago)
botbw               8dbb86899d  [chore] trivial fix  (4 months ago)
botbw               014faf6c5a  [chore] manually revert unintended commit  (4 months ago)
botbw               9b9b76bdcd  [moe] add mixtral dp grad scaling when not all experts are activated  (4 months ago)
botbw               e28e05345b  [moe] implement submesh initialization  (4 months ago)
haze188             5ed5e8cfba  solve hang when parallel mode = pp + dp  (4 months ago)
botbw               13b48ac0aa  [zero] solve hang  (4 months ago)
botbw               b5bfeb2efd  [moe] implement transit between non moe tp and ep  (4 months ago)
botbw               37443cc7e4  [test] pass mixtral shardformer test  (4 months ago)
hxwang              46c069b0db  [zero] solve hang  (4 months ago)
hxwang              0fad23c691  [chore] handle non member group  (4 months ago)
hxwang              a249e71946  [test] mixtra pp shard test  (4 months ago)
hxwang              8ae8525bdf  [moe] fix plugin  (4 months ago)
hxwang              0b76b57cd6  [test] add mixtral transformer test  (4 months ago)
hxwang              f9b6fcf81f  [test] add mixtral for sequence classification  (4 months ago)
Hongxin Liu         060892162a  [zero] hotfix update master params (#5951)  (4 months ago)
Runyu Lu            bcf0181ecd  [Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895)  (4 months ago)
Hongxin Liu         7b38964e3a  [shardformer] hotfix attn mask (#5947)  (4 months ago)
Hongxin Liu         9664b1bc19  [shardformer] hotfix attn mask (#5945)  (4 months ago)
Edenzzzz            2069472e96  [Hotfix] Fix ZeRO typo #5936  (4 months ago)
Hongxin Liu         5fd0592767  [fp8] support all-gather flat tensor (#5932)  (4 months ago)
Gao, Ruiyuan        5fb958cc83  [FIX BUG] convert env param to int in (#5934)  (4 months ago)
Insu Jang           a521ffc9f8  Add n_fused as an input from native_module (#5894)  (4 months ago)
Hongxin Liu         e86127925a  [plugin] support all-gather overlap for hybrid parallel (#5919)  (4 months ago)
GuangyaoZhang       5b969fd831  fix shardformer fp8 communication training degradation  (4 months ago)
GuangyaoZhang       6a20f07b80  remove all to all  (4 months ago)
GuangyaoZhang       5a310b9ee1  fix rebase  (4 months ago)
GuangyaoZhang       457a0de79f  shardformer fp8  (4 months ago)
アマデウス               530283dba0  fix object_to_tensor usage when torch>=2.3.0 (#5820)  (4 months ago)
Guangyao Zhang      2e28c793ce  [compatibility] support torch 2.2 (#5875)  (4 months ago)
Guangyao Zhang      1c961b20f3  [ShardFormer] fix qwen2 sp (#5903)  (4 months ago)
Stephan Kö          45c49dde96  [Auto Parallel]: Speed up intra-op plan generation by 44% (#5446)  (4 months ago)
pre-commit-ci[bot]  51f916b11d  [pre-commit.ci] auto fixes from pre-commit.com hooks  (5 months ago)
BurkeHulk           1f1b856354  Merge remote-tracking branch 'origin/feature/fp8_comm' into feature/fp8_comm  (5 months ago)
BurkeHulk           e88190184a  support fp8 communication in pipeline parallelism  (5 months ago)
BurkeHulk           1e1959467e  fix scaling algorithm in FP8 casting  (5 months ago)
Hongxin Liu         c068ef0fa0  [zero] support all-gather overlap (#5898)  (5 months ago)
GuangyaoZhang       dbfa7d39fc  fix typo  (5 months ago)
Guangyao Zhang      669849d74b  [ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897)  (5 months ago)
Edenzzzz            fbf33ecd01  [Feature] Enable PP + SP for llama (#5868)  (5 months ago)
Runyu Lu            66abf1c6e8  [HotFix] CI,import,requirements-test for #5838 (#5892)  (5 months ago)
Runyu Lu            cba20525a8  [Feat] Diffusion Model(PixArtAlpha/StableDiffusion3) Support (#5838)  (5 months ago)
Edenzzzz            8ec24b6a4d  [Hoxfix] Fix CUDA_DEVICE_MAX_CONNECTIONS for comm overlap  (5 months ago)
Haze188             3420921101  [shardformer] DeepseekMoE support (#5871)  (5 months ago)
pre-commit-ci[bot]  e17f835df7  [pre-commit.ci] auto fixes from pre-commit.com hooks  (5 months ago)
Hanks               6991819a97  Merge branch 'hpcaitech:main' into feature/fp8_comm  (5 months ago)
Hongxin Liu         7afbc81d62  [quant] fix bitsandbytes version check (#5882)  (5 months ago)
Wang Binluo         6cd4c32be4  [shardformer] fix the moe (#5883)  (5 months ago)
Edenzzzz            eb24fcd914  [Hotfix] Fix OPT gradient checkpointing forward  (5 months ago)
Haze188             ea94c07b95  [hotfix] fix the bug that large tensor exceed the maximum capacity of TensorBucket (#5879)  (5 months ago)
pre-commit-ci[bot]  7c2f79fa98  [pre-commit.ci] pre-commit autoupdate (#5572)  (5 months ago)
Jianghai            8ab46b4000  [Shardformer] change qwen2 modeling into gradient checkpointing style (#5874)  (5 months ago)
HangXu              f5a52e1600  fp8 operators for compressed communication  (5 months ago)
Haze188             416580b314  [MoE/ZeRO] Moe refactor with zero refactor (#5821)  (5 months ago)
flybird11111        773d9f964a  [shardformer]delete xformers (#5859)  (5 months ago)
Runyu Lu            3c7cda0c9a  [Inference]Lazy Init Support (#5785)  (5 months ago)
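
Many of the commits above concern FP8-compressed communication (for example "fp8 operators for compressed communication" and "fix scaling algorithm in FP8 casting"). The sketch below is only a minimal illustration of per-tensor scaled FP8 casting of that general kind, not the repository's actual implementation; the function names and the FP8_E4M3_MAX constant are hypothetical, and it assumes torch >= 2.1 for the float8_e4m3fn dtype.

```python
# Illustrative sketch only: scale a tensor into the FP8 range, cast it for
# transfer, and cast it back afterwards. Not ColossalAI's implementation.
import torch

FP8_E4M3_MAX = 448.0  # max representable magnitude of torch.float8_e4m3fn


def cast_to_fp8(x: torch.Tensor):
    """Return an FP8 payload plus the inverse scale needed to undo it."""
    amax = x.abs().max().clamp(min=1e-12)
    scale = FP8_E4M3_MAX / amax
    payload = (x.float() * scale).to(torch.float8_e4m3fn)
    return payload, scale.reciprocal()


def cast_from_fp8(payload: torch.Tensor, inv_scale: torch.Tensor, dtype):
    """Undo the scaling after the transfer (e.g. all-gather / all-to-all)."""
    return (payload.float() * inv_scale).to(dtype)


x = torch.randn(4, 8, dtype=torch.bfloat16)
p, s = cast_to_fp8(x)       # p is what would actually go over the wire
y = cast_from_fp8(p, s, x.dtype)
print((x - y).abs().max())  # small quantization error, not an exact round trip
```

In a communication setting, only the FP8 payload and the scalar scale would be exchanged between ranks, which roughly halves the bytes on the wire compared with bf16/fp16 at the cost of the quantization error shown above.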