Commit Graph

1191 Commits (bc7eeade33e33e3a7c2df26fedab707f3a62d6fe)

Author | SHA1 | Message | Date
wangbluo | fd92789af2 | fix | 1 month ago
wangbluo | 6be9862aaf | fix | 1 month ago
wangbluo | 3dc08c8a5a | fix | 1 month ago
wangbluo | 8ff7d0c780 | fix | 2 months ago
wangbluo | 3201377e94 | fix | 2 months ago
wangbluo | 23199e34cc | fix | 2 months ago
wangbluo | 703bb5c18d | fix the test | 2 months ago
wangbluo | 4e0e99bb6a | fix the test | 2 months ago
Guangyao Zhang | f20b066c59 | [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) | 3 months ago
botbw | c54c4fcd15 | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 3 months ago
Wenxuan Tan | 8fd25d6e09 | [Feature] Split cross-entropy computation in SP (#5959) | 3 months ago
Hongxin Liu | b3db1058ec | [release] update version (#6041) | 3 months ago
Wang Binluo | eea37da6fa | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago
flybird11111 | 20722a8c93 | [fp8]update reduce-scatter test (#6002) | 4 months ago
flybird11111 | 597b206001 | [fp8] support asynchronous FP8 communication (#5997) | 4 months ago
Hongxin Liu | 8241c0c054 | [fp8] support gemini plugin (#5978) | 4 months ago
Hanks | b480eec738 | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 4 months ago
Hongxin Liu | ccabcf6485 | [fp8] support fp8 amp for hybrid parallel plugin (#5975) | 4 months ago
Hongxin Liu | 76ea16466f | [fp8] add fp8 linear (#5967) | 4 months ago
flybird11111 | afb26de873 | [fp8]support all2all fp8 (#5953) | 4 months ago
flybird11111 | 0c10afd372 | [FP8] rebase main (#5963) | 4 months ago
Guangyao Zhang | 53cb9606bd | [Feature] llama shardformer fp8 support (#5938) | 4 months ago
ver217 | 91e596d017 | [test] add zero fp8 test case | 4 months ago
Hongxin Liu | 5fd0592767 | [fp8] support all-gather flat tensor (#5932) | 4 months ago
pre-commit-ci[bot] | 7c2f79fa98 | [pre-commit.ci] pre-commit autoupdate (#5572) | 5 months ago
Haze188 | 416580b314 | [MoE/ZeRO] Moe refactor with zero refactor (#5821) | 5 months ago
Guangyao Zhang | d9d5e7ea1f | [shardformer] Support the T5ForTokenClassification model (#5816) | 5 months ago
Edenzzzz | 2a25a2aff7 | [Feature] optimize PP overlap (#5735) | 5 months ago
Guangyao Zhang | fd1dc417d8 | [shardformer] Change atol in test command-r weight-check to pass pytest (#5835) | 5 months ago
GuangyaoZhang | fe2e74c03a | fix precommit | 5 months ago
GuangyaoZhang | 98da648a4a | Fix Code Factor check | 5 months ago
GuangyaoZhang | f656d61778 | change command | 5 months ago
Edenzzzz | 8795bb2e80 | Support 4d parallel + flash attention (#5789) | 5 months ago
flybird11111 | 2ddf624a86 | [shardformer] upgrade transformers to 4.39.3 (#5815) | 6 months ago
Li Xingjian | 8554585a5f | [Inference] Fix flash-attn import and add model test (#5794) | 6 months ago
Guangyao Zhang | aac941ef78 | [test] fix qwen2 pytest distLarge (#5797) | 6 months ago
Hongxin Liu | 587bbf4c6d | [test] fix chatglm test kit (#5793) | 6 months ago
char-1ee | b303976a27 | Fix test import | 6 months ago
char-1ee | 5f398fc000 | Pass inference model shard configs for module init | 6 months ago
duanjunwen | 10a19e22c6 | [hotfix] fix testcase in test_fx/test_tracer (#5779) | 6 months ago
botbw | 80c3c8789b | [Test/CI] remove test cases to reduce CI duration (#5753) | 6 months ago
Edenzzzz | 79f7a7b211 | [misc] Accelerate CI for zero and dist optim (#5758) | 6 months ago
yuehuayingxueluo | b45000f839 | [Inference]Add Streaming LLM (#5745) | 6 months ago
Haze188 | e22b82755d | [CI/tests] simplify some test case to reduce testing time (#5755) | 6 months ago
duanjunwen | 1b76564e16 | [test] Fix/fix testcase (#5770) | 6 months ago
Hongxin Liu | 68359ed1e1 | [release] update version (#5752) | 6 months ago
botbw | 023ea13cb5 | Merge pull request #5749 from hpcaitech/prefetch | 6 months ago
Yuanheng Zhao | b96c6390f4 | [inference] Fix running time of test_continuous_batching (#5750) | 6 months ago
Edenzzzz | 5f8c0a0ac3 | [Feature] auto-cast optimizers to distributed version (#5746) | 6 months ago
hxwang | ca674549e0 | [chore] remove unnecessary test & changes | 6 months ago