212 Commits (5f8c0a0ac3b52a71b664c3e36dd1a8cef40f428d)

Author SHA1 Message Date
Haze188 22ce873c3f [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702) 6 months ago
Wang Binluo 537f6a3855 [Shardformer]fix the num_heads assert for llama model and qwen model (#5704) 7 months ago
Wang Binluo a3cc68ca93 [Shardformer] Support the Qwen2 model (#5699) 7 months ago
Jianghai 61a1b2e798 [Inference] Fix bugs and docs for feat/online-server (#5598) 7 months ago
CjhHa1 7bbb28e48b [Inference] resolve rebase conflicts 7 months ago
Jianghai 69cd7e069d [Inference] ADD async and sync Api server using FastAPI (#5396) 7 months ago
Yuanheng Zhao f9afe0addd [hotfix] Fix KV Heads Number Assignment in KVCacheManager (#5695) 7 months ago
wangbluo 4e50cce26b fix the mistral model 7 months ago
wangbluo a8408b4d31 remove comment code 7 months ago
pre-commit-ci[bot] ca56b93d83 [pre-commit.ci] auto fixes from pre-commit.com hooks 7 months ago
wangbluo 108ddfb795 add parallel_output for the opt model 7 months ago
pre-commit-ci[bot] 88f057ce7c [pre-commit.ci] auto fixes from pre-commit.com hooks 7 months ago
wangbluo 2632916329 remove useless code 7 months ago
wangbluo 9efc79ef24 add parallel output for mistral model 7 months ago
Wang Binluo d3f34ee8cc [Shardformer] add assert for num of attention heads divisible by tp_size (#5670) 7 months ago
flybird11111 6af6d6fc9f [shardformer] support bias_gelu_jit_fused for models (#5647) 7 months ago
Hongxin Liu 7f8b16635b [misc] refactor launch API and tensor constructor (#5666) 7 months ago
flybird11111 8b7d535977 fix gptj (#5652) 7 months ago
Hongxin Liu 1b387ca9fe [shardformer] refactor pipeline grad ckpt config (#5646) 7 months ago
Hongxin Liu bbb2c21f16 [shardformer] fix chatglm implementation (#5644) 7 months ago
flybird11111 5d88ef1aaf [shardformer] remove useless code (#5645) 7 months ago
flybird11111 148506c828 [coloattention]modify coloattention (#5627) 7 months ago
Edenzzzz 7ee569b05f [hotfix] Fixed fused layernorm bug without apex (#5609) 7 months ago
Wang Binluo 0d0a582033 [shardformer] update transformers (#5583) 7 months ago
Hongxin Liu e094933da1 [shardformer] fix pipeline grad ckpt (#5620) 7 months ago
flybird11111 a0ad587c24 [shardformer] refactor embedding resize (#5603) 7 months ago
Hongxin Liu 641b1ee71a [devops] remove post commit ci (#5566) 8 months ago
Zhongkai Zhao 8e412a548e [shardformer] Sequence Parallelism Optimization (#5533) 8 months ago
Edenzzzz 7e0ec5a85c fix incorrect sharding without zero (#5545) 8 months ago
Yuanheng Zhao 4bb5d8923a [Fix/Inference] Remove unused and non-functional functions (#5543) 8 months ago
Wenhao Chen e614aa34f3 [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) 8 months ago
Insu Jang 00525f7772 [shardformer] fix pipeline forward error if custom layer distribution is used (#5189) 8 months ago
github-actions[bot] e6707a6e8d [format] applied code formatting on changed files in pull request 5510 (#5517) 8 months ago
Hongxin Liu 19e1a5cf16 [shardformer] update colo attention to support custom mask (#5510) 8 months ago
flybird11111 0688d92e2d [shardformer]Fix lm parallel. (#5480) 8 months ago
flybird11111 5e16bf7980 [shardformer] fix gathering output when using tensor parallelism (#5431) 8 months ago
digger yu 049121d19d [hotfix] fix typo change enabel to enable under colossalai/shardformer/ (#5317) 9 months ago
flybird11111 29695cf70c [example]add gpt2 benchmark example script. (#5295) 9 months ago
flybird11111 0a25e16e46 [shardformer]gather llama logits (#5398) 9 months ago
digger yu 71321a07cf fix typo change dosen't to doesn't (#5308) 10 months ago
flybird11111 388179f966 [tests] fix t5 test. (#5322) 10 months ago
Frank Lee 7cfed5f076 [feat] refactored extension module (#5298) 10 months ago
binmakeswell c174c4fc5f [doc] fix doc typo (#5256) 11 months ago
Hongxin Liu d202cc28c0 [npu] change device to accelerator api (#5239) 11 months ago
Xuanlei Zhao dd2c28a323 [npu] use extension for op builder (#5172) 11 months ago
digger yu b0b53a171c [nfc] fix typo colossalai/shardformer/ (#5133) 11 months ago
flybird11111 451e9142b8 fix flash attn (#5209) 11 months ago
flybird11111 02d2328a04 support linear accumulation fusion (#5199) 11 months ago
Wenhao Chen 4fa689fca1 [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) 11 months ago
flybird11111 79718fae04 [shardformer] llama support DistCrossEntropy (#5176) 12 months ago