3751 Commits (30a94431323d71c5ef06bd4b7f047aced3312fdf)
 

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Wang Binluo | d77e66a577 | Merge pull request #6023 from wangbluo/fp8_merge | 3 months ago |
| Wang Binluo | eea37da6fa | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| wangbluo | 8b8e282441 | fix | 3 months ago |
| wangbluo | 698c8b9804 | fix | 3 months ago |
| wangbluo | 6aface9316 | fix | 3 months ago |
| wangbluo | 193030f696 | fix | 3 months ago |
| wangbluo | eb5ba40def | fix the merge | 3 months ago |
| Tong Li | 39e2597426 | [ColossalChat] Add PP support (#6001) | 3 months ago |
| Hongxin Liu | 0d3b0bd864 | [plugin] add cast inputs option for zero (#6003) (#6022) | 3 months ago |
| wangbluo | 2d362ac090 | fix merge | 3 months ago |
| wangbluo | 2e4cbe3a2d | fix | 3 months ago |
| wangbluo | 2ee6235cfa | fix | 3 months ago |
| wangbluo | f7acfa1bd5 | fix | 3 months ago |
| wangbluo | 53823118f2 | fix | 3 months ago |
| Edenzzzz | dcc44aab8d | [misc] Use dist logger in plugins (#6011) | 3 months ago |
| wangbluo | 1f703e0ef4 | fix | 3 months ago |
| wangbluo | 88b3f0698c | fix the merge | 3 months ago |
| wangbluo | 2eb36839c6 | fix | 3 months ago |
| wangbluo | 12b44012d9 | fix | 3 months ago |
| wangbluo | 0d8e82a024 | Merge branch 'fp8_merge' of https://github.com/wangbluo/ColossalAI into fp8_merge | 3 months ago |
| wangbluo | 4c82bfcc54 | fix the merge | 3 months ago |
| pre-commit-ci[bot] | 64aad96723 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago |
| wangbluo | 3353042525 | fix the merge | 3 months ago |
| Edenzzzz | f1c3266a94 | overlap kv comm with output rescale (#6017) | 3 months ago |
| wangbluo | 1a5847e6d1 | fix the merge | 3 months ago |
| wangbluo | 52289e4c63 | Merge branch 'fp8_merge' of https://github.com/wangbluo/ColossalAI into fp8_merge | 3 months ago |
| wangbluo | 02636c5bef | fix the merge | 3 months ago |
| pre-commit-ci[bot] | 81272e9d00 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago |
| wangbluo | 4cf79fa275 | merge | 3 months ago |
| Hongxin Liu | 26493b97d3 | [misc] update compatibility (#6008) | 3 months ago |
| Edenzzzz | f5c84af0b0 | [Feature] Zigzag Ring attention (#5905) | 3 months ago |
| flybird11111 | 0a51319113 | [fp8] zero support fp8 linear. (#6006) | 3 months ago |
| Wang Binluo | 3f09a6145f | [fp8] add use_fp8 option for MoeHybridParallelPlugin (#6009) | 3 months ago |
| flybird11111 | 20722a8c93 | [fp8]update reduce-scatter test (#6002) | 3 months ago |
| Haze188 | 887d2d579b | [misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) | 3 months ago |
| pre-commit-ci[bot] | 4dd03999ec | [pre-commit.ci] pre-commit autoupdate (#5995) | 3 months ago |
| botbw | 1a2e90dcc1 | [fp8] linear perf enhancement | 3 months ago |
| Hongxin Liu | 406f984063 | [plugin] add cast inputs option for zero (#6003) | 3 months ago |
| botbw | 88fa096d78 | [fp8] update torch.compile for linear_fp8 to >= 2.4.0 (#6004) | 3 months ago |
| flybird11111 | 597b206001 | [fp8] support asynchronous FP8 communication (#5997) | 3 months ago |
| Tong Li | ceb1e262e7 | fix sync condition (#6000) | 3 months ago |
| Hongxin Liu | 0978080a69 | [fp8] refactor fp8 linear with compile (#5993) | 3 months ago |
| Wang Binluo | b2483c8e31 | [fp8] support hybrid parallel plugin (#5982) | 3 months ago |
| YeAnbang | ed97d3a5d3 | [Chat] fix readme (#5989) | 3 months ago |
| flybird11111 | f1a3a326c4 | [fp8]Moe support fp8 communication (#5977) | 3 months ago |
| Edenzzzz | b4d2377d4c | [Hotfix] Avoid fused RMSnorm import error without apex (#5985) | 3 months ago |
| botbw | e4aadeee20 | [fp8] use torch compile (torch >= 2.3.0) (#5979) | 3 months ago |
| Hongxin Liu | 8241c0c054 | [fp8] support gemini plugin (#5978) | 3 months ago |
| Tong Li | ad3fa4f49c | [Hotfix] README link (#5966) | 4 months ago |
| flybird11111 | 4b9bec8176 | [test ci]Feature/fp8 comm (#5981) | 4 months ago |