YeAnbang
384c64057d
fix colossalai, transformers version
5 months ago
YeAnbang
8aad064fe7
fix style
5 months ago
YeAnbang
c8d1b4a968
add orpo
5 months ago
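For reference, the ORPO objective added here pairs the usual SFT loss with an odds-ratio preference term. A minimal, hedged sketch of that preference term only, assuming length-normalized per-sequence log-probabilities (`logp_chosen`, `logp_rejected` are illustrative names, not ColossalChat's API):

```python
import torch
import torch.nn.functional as F

def orpo_preference_loss(logp_chosen: torch.Tensor,
                         logp_rejected: torch.Tensor,
                         lam: float = 0.1) -> torch.Tensor:
    """Odds-ratio preference term of ORPO (sketch, not the repo's code).

    logp_* are average (length-normalized) log-probs per sequence,
    so exp(logp) is a likelihood in (0, 1).
    """
    # log-odds: log(p / (1 - p)), computed stably from log p
    log_odds_chosen = logp_chosen - torch.log1p(-torch.exp(logp_chosen))
    log_odds_rejected = logp_rejected - torch.log1p(-torch.exp(logp_rejected))
    # push the chosen response's odds above the rejected one's
    return -lam * F.logsigmoid(log_odds_chosen - log_odds_rejected).mean()
```

The full ORPO loss adds this term to the standard NLL on the chosen responses.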
botbw
8e718a1421
[gemini] fixes for benchmarking ( #5847 )
...
* [gemini] fix missing return
* [gemini] fix missing arg pass
* [gemini] use gather tensor instead of list
* [test] enable flash attention for benchmark by default
---------
Co-authored-by: genghaozhe <939857490@qq.com>
5 months ago
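The "use gather tensor instead of list" bullet above points at a common pattern: gathering into one preallocated tensor avoids the per-rank buffer list and the extra concatenation. A hedged sketch of the two styles with plain torch.distributed (not Gemini's internal API):

```python
import torch
import torch.distributed as dist

def gather_as_list(local: torch.Tensor) -> torch.Tensor:
    # list-based gather: one buffer per rank, then a concat
    bufs = [torch.empty_like(local) for _ in range(dist.get_world_size())]
    dist.all_gather(bufs, local)
    return torch.cat(bufs)

def gather_as_tensor(local: torch.Tensor) -> torch.Tensor:
    # tensor-based gather: one contiguous output buffer, no concat needed
    flat = local.contiguous().view(-1)
    out = torch.empty(flat.numel() * dist.get_world_size(),
                      dtype=flat.dtype, device=flat.device)
    dist.all_gather_into_tensor(out, flat)
    return out
```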
Edenzzzz
2a25a2aff7
[Feature] optimize PP overlap ( #5735 )
...
* update to fully overlap, still debugging
* improve interface
* fixed deadlock bug
* debug NaN loss
* (experimental) use one comm group for send_fw_recv_fw to fix NaN
* cleaned up interfaces; use one batch p2p for all
* clean up; removed the double p2p batch case
* p2p test passed
* improve overlap: send fwd before backward
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* tentatively use 2 p2p batches
* remove two p2p batches
* fix typos
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* remove pp.sh
---------
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: root <root@notebook-c55824c0-7742-45e8-9591-c855bb77ad29-0.notebook-c55824c0-7742-45e8-9591-c855bb77ad29.colossal-ai.svc.cluster.local>
5 months ago
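The bullets above describe merging the forward send and forward receive into a single batched p2p call so the problematic ordering disappears and communication can overlap with compute. A minimal sketch of that batching idea in plain torch.distributed (illustrative only; the scheduler in this PR is more involved):

```python
import torch
import torch.distributed as dist

def send_fw_recv_fw(send_tensor: torch.Tensor, recv_shape, next_rank: int,
                    prev_rank: int, device: torch.device) -> torch.Tensor:
    """Issue the forward send and forward receive as one p2p batch."""
    recv_tensor = torch.empty(recv_shape, dtype=send_tensor.dtype, device=device)
    ops = [
        dist.P2POp(dist.isend, send_tensor, next_rank),
        dist.P2POp(dist.irecv, recv_tensor, prev_rank),
    ]
    # a single batch serves both directions, and the outstanding
    # requests can overlap with compute until wait() is called
    reqs = dist.batch_isend_irecv(ops)
    for req in reqs:
        req.wait()
    return recv_tensor
```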
binmakeswell
4ccaaaab63
[doc] add GPU cloud playground ( #5851 )
...
* [doc] add GPU cloud playground
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
5 months ago
YeAnbang
f3de5a025c
remove debug code
5 months ago
YeAnbang
0b2d6275c4
fix dataloader
5 months ago
YeAnbang
4b59d874df
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into main
5 months ago
YeAnbang
82aecd6374
add SimPO
5 months ago
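SimPO, added in this commit, is a reference-model-free preference objective that uses the length-normalized log-likelihood as the implicit reward and adds a target margin. A hedged sketch of the loss (variable names are illustrative, not ColossalChat's API):

```python
import torch
import torch.nn.functional as F

def simpo_loss(logp_chosen: torch.Tensor, logp_rejected: torch.Tensor,
               len_chosen: torch.Tensor, len_rejected: torch.Tensor,
               beta: float = 2.0, gamma: float = 1.0) -> torch.Tensor:
    """SimPO (sketch): length-normalized reward with a target margin gamma.

    logp_* are summed token log-probs per sequence; len_* are token counts.
    """
    reward_chosen = beta * logp_chosen / len_chosen
    reward_rejected = beta * logp_rejected / len_rejected
    return -F.logsigmoid(reward_chosen - reward_rejected - gamma).mean()
```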
binmakeswell
7266f82d03
[doc] fix open sora model weight link ( #5848 )
...
* [doc] fix open sora model weight link
* [doc] fix open sora model weight link
5 months ago
binmakeswell
8f445729a4
[doc] opensora v1.2 news ( #5846 )
...
* [doc] opensora v1.2 news
* [doc] opensora v1.2 news
5 months ago
botbw
8a5c86439a
[gemini] fix missing return ( #5845 )
5 months ago
Hongxin Liu
bd3e34fef6
[release] update version ( #5833 )
5 months ago
Yuanheng Zhao
7b249c76e5
[Fix] Fix spec-dec Glide LlamaModel for compatibility with transformers ( #5837 )
...
* fix glide llama model
* revise
5 months ago
Guangyao Zhang
fd1dc417d8
[shardformer] Change atol in test command-r weight-check to pass pytest ( #5835 )
5 months ago
Guangyao Zhang
2014cce870
[devops] Remove building on PR when edited to avoid skip issue ( #5836 )
5 months ago
Kai Lv
0adca5b688
[launch] Support IPv4 host initialization in launch ( #5822 )
5 months ago
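The launch change above concerns initializing the process group from an IPv4 host address. A minimal, hedged sketch of what such an initialization looks like; the argument names follow colossalai.launch as documented around that period and may differ between versions:

```python
import colossalai

# each process supplies its global rank plus the master node's
# IPv4 address and port used for the rendezvous
colossalai.launch(
    rank=0,                 # this process's global rank
    world_size=4,           # total number of processes
    host="192.168.1.10",    # IPv4 address of the master node (example value)
    port=29500,
    backend="nccl",
)
```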
Guangyao Zhang
639394b0d4
Merge pull request #5818 from GuangyaoZhang/command-r
...
[shardformer] Support the Command-R model
5 months ago
Edenzzzz
7f9ec599be
[misc] Add dist optim to doc sidebar ( #5806 )
...
* add to sidebar
* fix chinese
5 months ago
GuangyaoZhang
4adbc36913
Merge branch 'command-r' of github.com:GuangyaoZhang/ColossalAI into command-r
5 months ago
GuangyaoZhang
d84d68601a
change 'xxx if xxx else None' to 'xxx or None'
5 months ago
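The refactor named above swaps the conditional-expression form for the shorter `or` idiom; both return None for any falsy value, so behavior is unchanged. A tiny illustration (the variable name is hypothetical):

```python
# before: explicit conditional expression
mask = mask if mask else None

# after: equivalent and shorter -- `or` yields the right operand
# whenever the left one is falsy (None, empty, 0, ...)
mask = mask or None
```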
pre-commit-ci[bot]
996c65077e
[pre-commit.ci] auto fixes from pre-commit.com hooks
...
for more information, see https://pre-commit.ci
5 months ago
GuangyaoZhang
a83a2336e8
rebase master llama change
5 months ago
GuangyaoZhang
20c0b06ff5
Merge branch 'command-r' of github.com:GuangyaoZhang/ColossalAI into command-r
5 months ago
GuangyaoZhang
363cde6957
merge model and attention forward
5 months ago
GuangyaoZhang
7a2b08646f
Remove CohereLayerNorm and use existing layernorm
5 months ago
GuangyaoZhang
fe2e74c03a
fix precommit
5 months ago
GuangyaoZhang
98da648a4a
Fix Code Factor check
5 months ago
GuangyaoZhang
f656d61778
change command
5 months ago
GuangyaoZhang
0b81163bc0
Copy llama to command
5 months ago
Edenzzzz
8795bb2e80
Support 4d parallel + flash attention ( #5789 )
...
* support tp + sp + pp
* remove comments
---------
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
5 months ago
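"4d parallel" here means combining data, tensor, sequence, and pipeline parallelism, with flash attention enabled on top. A hedged configuration sketch using the HybridParallelPlugin; the parameter names follow the plugin's documented options around that release and are not guaranteed to match this PR exactly:

```python
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# data-parallel size is implied by world_size / (tp_size * pp_size * sp_size)
plugin = HybridParallelPlugin(
    tp_size=2,                              # tensor parallelism
    pp_size=2,                              # pipeline parallelism
    sp_size=2,                              # sequence parallelism
    enable_sequence_parallelism=True,
    sequence_parallelism_mode="all_to_all",
    enable_flash_attention=True,
)
booster = Booster(plugin=plugin)
```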
GuangyaoZhang
3c7302ad0e
merge model and attention forward
5 months ago
GuangyaoZhang
8c3f524660
Remove CohereLayerNorm and use existing layernorm
6 months ago
GuangyaoZhang
c9025ebd7c
Merge branch 'command-r' of github.com:GuangyaoZhang/ColossalAI into command-r
6 months ago
GuangyaoZhang
9a290ab013
fix precommit
6 months ago
pre-commit-ci[bot]
2a7fa2e7d0
[pre-commit.ci] auto fixes from pre-commit.com hooks
...
for more information, see https://pre-commit.ci
6 months ago
GuangyaoZhang
1016bb3257
Fix Code Factor check
6 months ago
GuangyaoZhang
94fbde6055
change command
6 months ago
GuangyaoZhang
431b7bcf8f
Copy llama to command
6 months ago
flybird11111
2ddf624a86
[shardformer] upgrade transformers to 4.39.3 ( #5815 )
...
* [shardformer]upgrade transformers for gpt2/gptj/whisper (#5807 )
* [shardformer] fix modeling of gpt2 and gptj
* [shardformer] fix whisper modeling
* [misc] update requirements
---------
Co-authored-by: ver217 <lhx0217@gmail.com>
* [shardformer]upgrade transformers for mistral (#5808 )
* upgrade transformers for mistral
* fix
* fix
* [shardformer]upgrade transformers for llama (#5809 )
* update transformers
fix
* fix
* fix
* [inference] upgrade transformers (#5810 )
* update transformers
fix
* fix
* fix
* fix
* fix
* [gemini] update transformers for gemini (#5814 )
---------
Co-authored-by: ver217 <lhx0217@gmail.com>
6 months ago
botbw
3bcbba9262
[gemini] quick fix on possible async operation ( #5803 )
...
* [gemini] quick fix on possible async operation
* [gemini] quick fix on possible async operation
6 months ago
Haze188
d9dddf574f
[Gemini] Use async stream to prefetch and h2d data moving ( #5781 )
...
* use async stream to prefetch and h2d data moving
* Remove redundant code
6 months ago
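The prefetch change above moves host-to-device (h2d) copies onto a separate CUDA stream so they overlap with compute on the default stream. A minimal sketch of that pattern in plain PyTorch (not Gemini's chunk manager):

```python
import torch

prefetch_stream = torch.cuda.Stream()

def prefetch_h2d(cpu_tensor: torch.Tensor) -> torch.Tensor:
    """Copy a pinned CPU tensor to the GPU on a side stream."""
    assert cpu_tensor.is_pinned(), "async h2d copies need pinned host memory"
    with torch.cuda.stream(prefetch_stream):
        return cpu_tensor.to("cuda", non_blocking=True)

# before the compute that consumes the tensor, make the default stream
# wait for the prefetch stream instead of synchronizing the whole device
torch.cuda.current_stream().wait_stream(prefetch_stream)
```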
Li Xingjian
8554585a5f
[Inference] Fix flash-attn import and add model test ( #5794 )
...
* Fix torch int32 dtype
Signed-off-by: char-1ee <xingjianli59@gmail.com>
* Fix flash-attn import
Signed-off-by: char-1ee <xingjianli59@gmail.com>
* Add generalized model test
Signed-off-by: char-1ee <xingjianli59@gmail.com>
* Remove exposed path to model
Signed-off-by: char-1ee <xingjianli59@gmail.com>
* Add default value for use_flash_attn
Signed-off-by: char-1ee <xingjianli59@gmail.com>
* Rename model test
Signed-off-by: char-1ee <xingjianli59@gmail.com>
---------
Signed-off-by: char-1ee <xingjianli59@gmail.com>
6 months ago
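The import fix above follows the usual guarded-import pattern: try to import flash-attn, fall back cleanly if it is missing, and give `use_flash_attn` a default so callers need not pass it. A hedged sketch, not the exact code in this PR:

```python
import torch

try:
    from flash_attn import flash_attn_func  # provided by the flash-attn package
    HAS_FLASH_ATTN = True
except ImportError:
    flash_attn_func = None
    HAS_FLASH_ATTN = False

def attention_forward(q, k, v, use_flash_attn: bool = True):
    """Use flash attention when it is installed and requested."""
    if use_flash_attn and HAS_FLASH_ATTN:
        # flash_attn_func expects (batch, seqlen, nheads, headdim) tensors
        return flash_attn_func(q, k, v, causal=True)
    # fallback: torch's built-in SDPA, which expects (batch, nheads, seqlen, headdim)
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v, is_causal=True)
    return out.transpose(1, 2)
```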
Guangyao Zhang
aac941ef78
[test] fix qwen2 pytest distLarge ( #5797 )
6 months ago
Hongxin Liu
aa125bcc91
[shardformer] fix modeling of bloom and falcon ( #5796 )
6 months ago
Hongxin Liu
587bbf4c6d
[test] fix chatglm test kit ( #5793 )
6 months ago
YeAnbang
74f4a29734
Merge pull request #5759 from hpcaitech/colossalchat_upgrade
...
[ColossalChat] Colossalchat upgrade
6 months ago
Runyu Lu
c0948aff97
[Inference]refactor baichuan ( #5791 )
...
* refactor baichuan
* remove unused code and add TODO for lazyinit
6 months ago
YeAnbang
84eab13078
update sft training script
6 months ago