86 Commits (feature/zerobubble)

Author SHA1 Message Date
duanjunwen 8e40087633 [fix] fix model zoo init 3 weeks ago
duanjunwen 0218e673db [fix] fix use_fp8 flag 3 weeks ago
flybird11111 295dd2d9fe [zerobubble] rebase main (#6075) 2 months ago
Wenxuan Tan 8fd25d6e09 [Feature] Split cross-entropy computation in SP (#5959) 2 months ago
duanjunwen fed8b1587d [fix] fix model zoo import; 2 months ago
duanjunwen a5ec3d4285 [fix] fix mem; use a new model shape; only assert mem less and equal than theo; 2 months ago
duanjunwen 35a7b636b3 [fix] fix mem assertation 2 months ago
duanjunwen 4a358348c7 [fix] fix mem check; 3 months ago
duanjunwen ab643c9af7 [fix] rm output.data after send fwd; 3 months ago
duanjunwen 6d18d38d5c [feat] update test; rm comments; 3 months ago
Wang Binluo eea37da6fa [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) 3 months ago
Edenzzzz f5c84af0b0 [Feature] Zigzag Ring attention (#5905) 3 months ago
flybird11111 0c10afd372 [FP8] rebase main (#5963) 4 months ago
haze188 034020bd04 [misc] remove debug/print code 4 months ago
haze188 b2952a5982 [moe] deepseek moe sp support 4 months ago
Haze188 404b16faf3 [Feature] MoE Ulysses Support (#5918) 4 months ago
hxwang 46c069b0db [zero] solve hang 4 months ago
hxwang a249e71946 [test] mixtra pp shard test 4 months ago
hxwang 0b76b57cd6 [test] add mixtral transformer test 4 months ago
pre-commit-ci[bot] 7c2f79fa98 [pre-commit.ci] pre-commit autoupdate (#5572) 5 months ago
Guangyao Zhang d9d5e7ea1f [shardformer] Support the T5ForTokenClassification model (#5816) 5 months ago
GuangyaoZhang fe2e74c03a fix precommit 5 months ago
GuangyaoZhang f656d61778 change command 5 months ago
GuangyaoZhang 9a290ab013 fix precommit 5 months ago
pre-commit-ci[bot] 2a7fa2e7d0 [pre-commit.ci] auto fixes from pre-commit.com hooks 5 months ago
GuangyaoZhang 94fbde6055 change command 5 months ago
Hongxin Liu 587bbf4c6d [test] fix chatglm test kit (#5793) 5 months ago
botbw 80c3c8789b [Test/CI] remove test cases to reduce CI duration (#5753) 6 months ago
Edenzzzz 43995ee436 [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) 6 months ago
Wang Binluo a3cc68ca93 [Shardformer] Support the Qwen2 model (#5699) 7 months ago
Hongxin Liu bbb2c21f16 [shardformer] fix chatglm implementation (#5644) 7 months ago
Wang Binluo 0d0a582033 [shardformer] update transformers (#5583) 7 months ago
Hongxin Liu 641b1ee71a [devops] remove post commit ci (#5566) 8 months ago
Zhongkai Zhao 8e412a548e [shardformer] Sequence Parallelism Optimization (#5533) 8 months ago
Wenhao Chen e614aa34f3 [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) 8 months ago
flybird11111 29695cf70c [example]add gpt2 benchmark example script. (#5295) 9 months ago
Frank Lee d69cd2eb89 [workflow] fixed oom tests (#5275) 10 months ago
Frank Lee 2b83418719 [ci] fixed ddp test (#5254) 11 months ago
Frank Lee d5eeeb1416 [ci] fixed booster test (#5251) 11 months ago
Frank Lee edf94a35c3 [workflow] fixed build CI (#5240) 11 months ago
flybird11111 21aa5de00b [gemini] hotfix NaN loss while using Gemini + tensor_parallel (#5150) 12 months ago
Wenhao Chen 7172459e74 [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) 1 year ago
Jianghai cf579ff46d [Inference] Dynamic Batching Inference, online and offline (#4953) 1 year ago
Hongxin Liu b8e770c832 [test] merge old components to test to model zoo (#4945) 1 year ago
Zhongkai Zhao db40e086c8 [test] modify model supporting part of low_level_zero plugin (including correspoding docs) 1 year ago
Jianghai ce7ade3882 [inference] chatglm2 infer demo (#4724) 1 year ago
Hongxin Liu 079bf3cb26 [misc] update pre-commit and run all files (#4752) 1 year ago
digger yu 9c2feb2f0b fix some typo with colossalai/device colossalai/tensor/ etc. (#4171) 1 year ago
flybird11111 eedaa3e1ef [shardformer]fix gpt2 double head (#4663) 1 year ago
flybird11111 7486ed7d3a [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645) 1 year ago