40 Commits (ckpt)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Tong Li | 4c8e85ee0d | [Coati] Train DPO using PP (#6054) | 1 month ago |
| Wenxuan Tan | 8fd25d6e09 | [Feature] Split cross-entropy computation in SP (#5959) | 2 months ago |
| Wang Binluo | eea37da6fa | [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) | 3 months ago |
| wangbluo | 698c8b9804 | fix | 3 months ago |
| wangbluo | 193030f696 | fix | 3 months ago |
| wangbluo | 53823118f2 | fix | 3 months ago |
| pre-commit-ci[bot] | 81272e9d00 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 3 months ago |
| Edenzzzz | f5c84af0b0 | [Feature] Zigzag Ring attention (#5905) | 3 months ago |
| Wang Binluo | b2483c8e31 | [fp8] support hybrid parallel plugin (#5982) | 3 months ago |
| flybird11111 | 0c10afd372 | [FP8] rebase main (#5963) | 4 months ago |
| Guangyao Zhang | 53cb9606bd | [Feature] llama shardformer fp8 support (#5938) | 4 months ago |
| Hongxin Liu | 9664b1bc19 | [shardformer] hotfix attn mask (#5945) | 4 months ago |
| GuangyaoZhang | 6a20f07b80 | remove all to all | 4 months ago |
| GuangyaoZhang | 5a310b9ee1 | fix rebase | 4 months ago |
| GuangyaoZhang | 457a0de79f | shardformer fp8 | 4 months ago |
| Edenzzzz | fbf33ecd01 | [Feature] Enable PP + SP for llama (#5868) | 5 months ago |
| Edenzzzz | 8795bb2e80 | Support 4d parallel + flash attention (#5789) | 5 months ago |
| flybird11111 | 2ddf624a86 | [shardformer] upgrade transformers to 4.39.3 (#5815) | 5 months ago |
| Hongxin Liu | 73e88a5553 | [shardformer] fix import (#5788) | 6 months ago |
| flybird11111 | 50b4c8e8cf | [hotfix] fix llama flash attention forward (#5777) | 6 months ago |
| Haze188 | 22ce873c3f | [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702) | 6 months ago |
| Hongxin Liu | 1b387ca9fe | [shardformer] refactor pipeline grad ckpt config (#5646) | 7 months ago |
| Wang Binluo | 0d0a582033 | [shardformer] update transformers (#5583) | 7 months ago |
| flybird11111 | a0ad587c24 | [shardformer] refactor embedding resize (#5603) | 7 months ago |
| Zhongkai Zhao | 8e412a548e | [shardformer] Sequence Parallelism Optimization (#5533) | 8 months ago |
| Wenhao Chen | e614aa34f3 | [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) | 8 months ago |
| github-actions[bot] | e6707a6e8d | [format] applied code formatting on changed files in pull request 5510 (#5517) | 8 months ago |
| Hongxin Liu | 19e1a5cf16 | [shardformer] update colo attention to support custom mask (#5510) | 8 months ago |
| flybird11111 | 0688d92e2d | [shardformer]Fix lm parallel. (#5480) | 8 months ago |
| flybird11111 | 5e16bf7980 | [shardformer] fix gathering output when using tensor parallelism (#5431) | 8 months ago |
| digger yu | 049121d19d | [hotfix] fix typo change enabel to enable under colossalai/shardformer/ (#5317) | 9 months ago |
| flybird11111 | 0a25e16e46 | [shardformer]gather llama logits (#5398) | 9 months ago |
| Frank Lee | 7cfed5f076 | [feat] refactored extension module (#5298) | 10 months ago |
| Xuanlei Zhao | dd2c28a323 | [npu] use extension for op builder (#5172) | 11 months ago |
| flybird11111 | 451e9142b8 | fix flash attn (#5209) | 11 months ago |
| flybird11111 | 79718fae04 | [shardformer] llama support DistCrossEntropy (#5176) | 12 months ago |
| Xuanlei Zhao | d6df19bae7 | [npu] support triangle attention for llama (#5130) | 12 months ago |
| Xuanlei Zhao | 68fcaa2225 | remove duplicate import (#5100) | 1 year ago |
| flybird11111 | aae496631c | [shardformer]fix flash attention, when mask is casual, just don't unpad it (#5084) | 1 year ago |
| flybird11111 | 97cd0cd559 | [shardformer] fix llama error when transformers upgraded. (#5055) | 1 year ago |
| Elsa Granger | b2ad0d9e8f | [pipeline,shardformer] Fix p2p efficiency in pipeline, allow skipping loading weight not in weight_map when `strict=False`, fix llama flash attention forward, add flop estimation by megatron in llama benchmark (#5017) | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| Cuiqing Li | bce0f16702 | [Feature] The first PR to Add TP inference engine, kv-cache manager and related kernels for our inference system (#4577) | 1 year ago |
| flybird11111 | 7486ed7d3a | [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645) | 1 year ago |
| Hongxin Liu | 172f7fa3cf | [misc] resolve code factor issues (#4433) | 1 year ago |
| flybird1111 | 7a3dfd0c64 | [shardformer] update shardformer to use flash attention 2 (#4392) | 1 year ago |
| flybird1111 | 906426cb44 | [Shardformer] Merge flash attention branch to pipeline branch (#4362) | 1 year ago |
| Hongxin Liu | 261eab02fb | [plugin] add 3d parallel plugin (#4295) | 1 year ago |
| Jianghai | 18ebcf406a | [pipeline] reformat for unified design (#4283) | 1 year ago |