Commit Graph

3764 Commits (cf519dac6a5799b8f314aac6f510e2a98d3af9c6)

Author | SHA1 | Message | Date
Hongxin Liu | 406f984063 | [plugin] add cast inputs option for zero (#6003) | 3 months ago
botbw | 88fa096d78 | [fp8] update torch.compile for linear_fp8 to >= 2.4.0 (#6004) | 3 months ago
flybird11111 | 597b206001 | [fp8] support asynchronous FP8 communication (#5997) | 3 months ago
Tong Li | ceb1e262e7 | fix sync condition (#6000) | 3 months ago
Hongxin Liu | 0978080a69 | [fp8] refactor fp8 linear with compile (#5993) | 3 months ago
Wang Binluo | b2483c8e31 | [fp8] support hybrid parallel plugin (#5982) | 3 months ago
YeAnbang | ed97d3a5d3 | [Chat] fix readme (#5989) | 3 months ago
flybird11111 | f1a3a326c4 | [fp8]Moe support fp8 communication (#5977) | 4 months ago
Edenzzzz | b4d2377d4c | [Hotfix] Avoid fused RMSnorm import error without apex (#5985) | 4 months ago
botbw | e4aadeee20 | [fp8] use torch compile (torch >= 2.3.0) (#5979) | 4 months ago
Hongxin Liu | 8241c0c054 | [fp8] support gemini plugin (#5978) | 4 months ago
Tong Li | ad3fa4f49c | [Hotfix] README link (#5966) | 4 months ago
flybird11111 | 4b9bec8176 | [test ci]Feature/fp8 comm (#5981) | 4 months ago
Hanks | b480eec738 | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 4 months ago
flybird11111 | 7739629b9d | fix (#5976) | 4 months ago
Hongxin Liu | ccabcf6485 | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago
Hongxin Liu | 76ea16466f | [fp8] add fp8 linear (#5967) | 4 months ago
Edenzzzz | 9179d4088e | [Docs] clarify launch port | 4 months ago
flybird11111 | afb26de873 | [fp8]support all2all fp8 (#5953) | 4 months ago
flybird11111 | 0c10afd372 | [FP8] rebase main (#5963) | 4 months ago
Guangyao Zhang | 53cb9606bd | [Feature] llama shardformer fp8 support (#5938) | 4 months ago
Hanks | c297e21bea | Merge pull request #5961 from ver217/feature/zeor-fp8 | 4 months ago
YeAnbang | fe71917851 | Merge pull request #5962 from hpcaitech/colossalchat | 4 months ago
YeAnbang | 0b2d55c4ab | Support overall loss, update KTO logging | 4 months ago
ver217 | 91e596d017 | [test] add zero fp8 test case | 4 months ago
ver217 | ae486ce005 | [fp8] add fp8 comm for low level zero | 4 months ago
Wang Binluo | 75c963686f | [lora] lora support hybrid parallel plugin (#5956) | 4 months ago
Tong Li | 19d1510ea2 | [feat] Dist Loader for Eval (#5950) | 4 months ago
botbw | 62cdac6b7b | [chore] remove redundant test case, print string & reduce test tokens | 4 months ago
botbw | d1d1ab871e | [moe] solve dp axis issue | 4 months ago
botbw | 65daa87627 | [doc] add MoeHybridParallelPlugin docstring | 4 months ago
hxwang | 7bedd03739 | [moe] remove force_overlap_comm flag and add warning instead | 4 months ago
hxwang | f7c5485ed6 | [chore] docstring | 4 months ago
haze188 | 7e737df5ad | [misc] remove useless condition | 4 months ago
haze188 | 70793ce9ed | [misc] fix ci failure: change default value to false in moe plugin | 4 months ago
haze188 | 12d043ca00 | [misc] remove incompatible test config | 4 months ago
hxwang | 606b0891ed | [chore] change moe_pg_mesh to private | 4 months ago
hxwang | 5b4c12381b | Revert "[moe] implement submesh initialization" | 4 months ago
hxwang | cb01c0d5ce | [moe] refactor mesh assignment | 4 months ago
haze188 | 034020bd04 | [misc] remove debug/print code | 4 months ago
haze188 | 59bcf56c60 | [misc] skip redunant test | 4 months ago
hxwang | c3dc9b4dba | [deepseek] replace attn (a workaround for bug in transformers) | 4 months ago
hxwang | 6c39f0b144 | [test] add check | 4 months ago
haze188 | b2952a5982 | [moe] deepseek moe sp support | 4 months ago
botbw | 96d0fbc531 | [bug] fix: somehow logger hangs the program | 4 months ago
hxwang | 067e18f7e9 | [test] fix test: test_zero1_2 | 4 months ago
hxwang | 74b03de3f9 | [moe] remove ops | 4 months ago
hxwang | 70c9924d0d | [chore] solve moe ckpt test failure and some other arg pass failure | 4 months ago
pre-commit-ci[bot] | 52d346f2a5 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 4 months ago
hxwang | 46037c2ccd | [chore] minor fix after rebase | 4 months ago