GuangyaoZhang | 20c0b06ff5 | Merge branch 'command-r' of github.com:GuangyaoZhang/ColossalAI into command-r | 2024-06-18 02:37:14 +00:00
GuangyaoZhang | 363cde6957 | merge model and attention forward | 2024-06-18 02:32:41 +00:00
GuangyaoZhang | 7a2b08646f | Remove CohereLayerNorm and use existing layernorm | 2024-06-18 02:32:41 +00:00
GuangyaoZhang | fe2e74c03a | fix precommit | 2024-06-18 02:31:33 +00:00
GuangyaoZhang | 98da648a4a | Fix Code Factor check | 2024-06-18 02:31:33 +00:00
GuangyaoZhang | f656d61778 | change command | 2024-06-18 02:31:33 +00:00
GuangyaoZhang | 0b81163bc0 | Copy llama to command | 2024-06-18 02:31:33 +00:00
Edenzzzz | 8795bb2e80 | Support 4d parallel + flash attention (#5789) | 2024-06-17 17:40:47 +08:00
    * support tp + sp + pp
    * remove comments
    ---------
    Co-authored-by: Edenzzzz <wtan45@wisc.edu>
GuangyaoZhang | 3c7302ad0e | merge model and attention forward | 2024-06-17 08:50:05 +00:00
GuangyaoZhang | 8c3f524660 | Remove CohereLayerNorm and use existing layernorm | 2024-06-14 09:14:01 +00:00
GuangyaoZhang | c9025ebd7c | Merge branch 'command-r' of github.com:GuangyaoZhang/ColossalAI into command-r | 2024-06-14 08:10:31 +00:00
GuangyaoZhang | 9a290ab013 | fix precommit | 2024-06-14 08:09:24 +00:00
pre-commit-ci[bot] | 2a7fa2e7d0 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 2024-06-14 08:05:07 +00:00
    for more information, see https://pre-commit.ci
GuangyaoZhang | 1016bb3257 | Fix Code Factor check | 2024-06-14 08:04:29 +00:00
GuangyaoZhang | 94fbde6055 | change command | 2024-06-14 07:55:13 +00:00
GuangyaoZhang | 431b7bcf8f | Copy llama to command | 2024-06-14 03:07:01 +00:00
flybird11111 | 2ddf624a86 | [shardformer] upgrade transformers to 4.39.3 (#5815) | 2024-06-14 10:59:33 +08:00
    * [shardformer]upgrade transformers for gpt2/gptj/whisper (#5807)
    * [shardformer] fix modeling of gpt2 and gptj
    * [shardformer] fix whisper modeling
    * [misc] update requirements
    ---------
    Co-authored-by: ver217 <lhx0217@gmail.com>
    * [shardformer]upgrade transformers for mistral (#5808)
    * upgrade transformers for mistral
    * fix
    * fix
    * [shardformer]upgrade transformers for llama (#5809)
    * update transformers
    fix
    * fix
    * fix
    * [inference] upgrade transformers (#5810)
    * update transformers
    fix
    * fix
    * fix
    * fix
    * fix
    * [gemini] update transformers for gemini (#5814)
    ---------
    Co-authored-by: ver217 <lhx0217@gmail.com>
botbw | 3bcbba9262 | [gemini] quick fix on possible async operation (#5803) | 2024-06-13 10:35:17 +08:00
    * [gemini] quick fix on possible async operation
    * [gemini] quick fix on possible async operation
Haze188 | d9dddf574f | [Gemini] Use async stream to prefetch and h2d data moving (#5781) | 2024-06-12 15:48:52 +08:00
    * use async stream to prefetch and h2d data moving
    * Remove redundant code
Li Xingjian | 8554585a5f | [Inference] Fix flash-attn import and add model test (#5794) | 2024-06-12 14:13:50 +08:00
    * Fix torch int32 dtype
    * Fix flash-attn import
    * Add generalized model test
    * Remove exposed path to model
    * Add default value for use_flash_attn
    * Rename model test
    ---------
    Signed-off-by: char-1ee <xingjianli59@gmail.com>
Guangyao Zhang | aac941ef78 | [test] fix qwen2 pytest distLarge (#5797) | 2024-06-12 12:13:51 +08:00
Hongxin Liu | aa125bcc91 | [shardformer] fix modeling of bloom and falcon (#5796) | 2024-06-11 17:43:50 +08:00
Hongxin Liu | 587bbf4c6d | [test] fix chatglm test kit (#5793) | 2024-06-11 16:54:31 +08:00
YeAnbang | 74f4a29734 | Merge pull request #5759 from hpcaitech/colossalchat_upgrade | 2024-06-11 12:49:53 +08:00
    [ColossalChat] Colossalchat upgrade
Runyu Lu | c0948aff97 | [Inference]refactor baichuan (#5791) | 2024-06-11 10:52:01 +08:00
    * refactor baichuan
    * remove unused code and add TODO for lazyinit
YeAnbang | 84eab13078 | update sft trainning script | 2024-06-11 02:44:20 +00:00
Li Xingjian | 77a219a082 | Merge pull request #5771 from char-1ee/refactor/modeling | 2024-06-10 11:52:22 +08:00
    [Inference] Refactor modeling attention layer by abstracting attention backends
char-1ee | b303976a27 | Fix test import | 2024-06-10 02:03:30 +00:00
    Signed-off-by: char-1ee <xingjianli59@gmail.com>
YeAnbang | 2abdede1d7 | fix readme | 2024-06-10 01:08:42 +00:00
char-1ee | f5981e808e | Remove flash attention backend | 2024-06-07 10:02:19 +00:00
    Signed-off-by: char-1ee <xingjianli59@gmail.com>
YeAnbang | 77db21610a | replace the customized dataloader setup with the build-in one | 2024-06-07 09:44:25 +00:00
YeAnbang | 0d7ff10ea5 | replace the customized dataloader setup with the build-in one | 2024-06-07 09:43:42 +00:00
char-1ee | ceba662d22 | Clean up | 2024-06-07 09:09:29 +00:00
    Signed-off-by: char-1ee <xingjianli59@gmail.com>
char-1ee | 5f398fc000 | Pass inference model shard configs for module init | 2024-06-07 08:33:52 +00:00
    Signed-off-by: char-1ee <xingjianli59@gmail.com>
char-1ee | eec77e5702 | Fix tests and naming | 2024-06-07 08:33:47 +00:00
    Signed-off-by: char-1ee <xingjianli59@gmail.com>
char-1ee | 04386d9eff | Refactor modeling by adding attention backend | 2024-06-07 08:33:47 +00:00
    Signed-off-by: char-1ee <xingjianli59@gmail.com>
YeAnbang | 790e1362a6 | merge | 2024-06-07 07:01:32 +00:00
YeAnbang | ac1520cb8f | remove baichuan from template test due to transformer version conflict | 2024-06-07 07:01:32 +00:00
YeAnbang | e16ccc272a | update ci | 2024-06-07 07:01:32 +00:00
YeAnbang | 45195ac53d | remove local data path | 2024-06-07 07:01:31 +00:00
YeAnbang | bf57b13dda | remove models that require huggingface auth from ci | 2024-06-07 07:01:31 +00:00
YeAnbang | 0bbac158ed | fix datasets version | 2024-06-07 07:01:31 +00:00
YeAnbang | 62eb28b929 | remove duplicated test | 2024-06-07 07:01:31 +00:00
YeAnbang | b8b5cacf38 | fix transformers version | 2024-06-07 07:01:31 +00:00
pre-commit-ci[bot] | 1b880ce095 | [pre-commit.ci] auto fixes from pre-commit.com hooks | 2024-06-07 07:01:31 +00:00
    for more information, see https://pre-commit.ci
YeAnbang | b1031f7244 | fix ci | 2024-06-07 07:01:31 +00:00
YeAnbang | 7ae87b3159 | fix training script | 2024-06-07 07:01:31 +00:00
YeAnbang | 0b4a33548c | moupdate ci tests, st ci test cases passed, tp failed in generation for ppo, sp is buggy | 2024-06-07 07:01:31 +00:00
YeAnbang | 7e65b71815 | run pre-commit | 2024-06-07 07:01:30 +00:00
YeAnbang | 929e1e3da4 | upgrade ppo dpo rm script | 2024-06-07 07:01:30 +00:00