Commit Graph

32 Commits (feature/async-io)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Hongxin Liu | 5ddad486ca | [fp8] add fallback and make compile option configurable (#6092) | 1 month ago |
| Guangyao Zhang | f20b066c59 | [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) | 3 months ago |
| Hanks | 5ce6dd75bf | [fp8] disable all_to_all_fp8 in intranode (#6045) | 3 months ago |
| Hongxin Liu | c3b5caff0e | [fp8] optimize all-gather (#6043) | 3 months ago |
| Guangyao Zhang | e96a0761ea | [FP8] unsqueeze scale to make it compatible with torch.compile (#6040) | 3 months ago |
| botbw | 1a2e90dcc1 | [fp8] linear perf enhancement | 4 months ago |
| botbw | 88fa096d78 | [fp8] update torch.compile for linear_fp8 to >= 2.4.0 (#6004) | 4 months ago |
| flybird11111 | 597b206001 | [fp8] support asynchronous FP8 communication (#5997) | 4 months ago |
| Hongxin Liu | 0978080a69 | [fp8] refactor fp8 linear with compile (#5993) | 4 months ago |
| flybird11111 | f1a3a326c4 | [fp8]Moe support fp8 communication (#5977) | 4 months ago |
| botbw | e4aadeee20 | [fp8] use torch compile (torch >= 2.3.0) (#5979) | 4 months ago |
| Hongxin Liu | 8241c0c054 | [fp8] support gemini plugin (#5978) | 4 months ago |
| Hanks | b480eec738 | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 4 months ago |
| flybird11111 | 7739629b9d | fix (#5976) | 4 months ago |
| Hongxin Liu | ccabcf6485 | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago |
| Hongxin Liu | 76ea16466f | [fp8] add fp8 linear (#5967) | 4 months ago |
| flybird11111 | afb26de873 | [fp8]support all2all fp8 (#5953) | 4 months ago |
| Guangyao Zhang | 53cb9606bd | [Feature] llama shardformer fp8 support (#5938) | 4 months ago |
| Hongxin Liu | 5fd0592767 | [fp8] support all-gather flat tensor (#5932) | 4 months ago |
| GuangyaoZhang | 6a20f07b80 | remove all to all | 4 months ago |
| GuangyaoZhang | 5a310b9ee1 | fix rebase | 4 months ago |
| GuangyaoZhang | 457a0de79f | shardformer fp8 | 5 months ago |
| pre-commit-ci[bot] | 51f916b11d | [pre-commit.ci] auto fixes from pre-commit.com hooks | 5 months ago |
| BurkeHulk | 1f1b856354 | Merge remote-tracking branch 'origin/feature/fp8_comm' into feature/fp8_comm | 5 months ago |
| BurkeHulk | e88190184a | support fp8 communication in pipeline parallelism | 5 months ago |
| BurkeHulk | 1e1959467e | fix scaling algorithm in FP8 casting | 5 months ago |
| GuangyaoZhang | dbfa7d39fc | fix typo | 5 months ago |
| pre-commit-ci[bot] | e17f835df7 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 5 months ago |
| Hanks | 6991819a97 | Merge branch 'hpcaitech:main' into feature/fp8_comm | 5 months ago |
| Hongxin Liu | 7afbc81d62 | [quant] fix bitsandbytes version check (#5882) | 5 months ago |
| HangXu | f5a52e1600 | fp8 operators for compressed communication | 5 months ago |
| linsj20 | 91fa553775 | [Feature] qlora support (#5586) | 7 months ago |