botbw
dc583aa576
[moe] implement tp
4 months ago
botbw
0b5bbe9ce4
[test] add mixtral modelling test
4 months ago
hxwang
102b784a10
[chore] arg pass & remove drop token
4 months ago
botbw
8dbb86899d
[chore] trivial fix
4 months ago
botbw
014faf6c5a
[chore] manually revert unintended commit
4 months ago
botbw
9b9b76bdcd
[moe] add mixtral dp grad scaling when not all experts are activated
4 months ago
botbw
e28e05345b
[moe] implement submesh initialization
4 months ago
haze188
5ed5e8cfba
solve hang when parallel mode = pp + dp
4 months ago
haze188
fe24789eb1
[misc] solve booster hang by renaming the variable
4 months ago
botbw
13b48ac0aa
[zero] solve hang
4 months ago
botbw
b5bfeb2efd
[moe] implement transition between non-moe tp and ep
4 months ago
botbw
37443cc7e4
[test] pass mixtral shardformer test
4 months ago
hxwang
46c069b0db
[zero] solve hang
4 months ago
hxwang
0fad23c691
[chore] handle non-member group
4 months ago
hxwang
a249e71946
[test] mixtral pp shard test
4 months ago
hxwang
8ae8525bdf
[moe] fix plugin
4 months ago
hxwang
0b76b57cd6
[test] add mixtral transformer test
4 months ago
hxwang
f9b6fcf81f
[test] add mixtral for sequence classification
4 months ago
Tong Li
1aeb5e8847
[hotfix] Remove unused plan section (#5957)
* remove readme
* fix readme
* update
4 months ago
YeAnbang
66fbf2ecb7
Update README.md (#5958)
4 months ago
YeAnbang
30f4e31a33
[Chat] Fix lora (#5946)
* fix merging
* remove filepath
* fix style
4 months ago
Hongxin Liu
09c5f72595
[release] update version (#5952)
4 months ago
Hongxin Liu
060892162a
[zero] hotfix update master params (#5951)
4 months ago
Runyu Lu
bcf0181ecd
[Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895)
* Distrifusion Support source
* comp comm overlap optimization
* sd3 benchmark
* pixart distrifusion bug fix
* sd3 bug fix and benchmark
* generation bug fix
* naming fix
* add docstring, fix counter and shape error
* add reference
* readme and requirement
4 months ago
Hongxin Liu
7b38964e3a
[shardformer] hotfix attn mask (#5947)
4 months ago
Hongxin Liu
9664b1bc19
[shardformer] hotfix attn mask (#5945)
4 months ago
YeAnbang
c8332b9cb5
Merge pull request #5922 from hpcaitech/kto
[Chat] Add KTO
4 months ago
YeAnbang
6fd9e86864
fix style
4 months ago
YeAnbang
de1bf08ed0
fix style
4 months ago
YeAnbang
8a3ff4f315
fix style
4 months ago
zhurunhua
ad35a987d3
[Feature] Add a switch to control whether the model checkpoint needs to be saved after each epoch ends (#5941)
* Add a switch to control whether the model checkpoint needs to be saved after each epoch ends
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
4 months ago
Edenzzzz
2069472e96
[Hotfix] Fix ZeRO typo #5936
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
4 months ago
Hongxin Liu
5fd0592767
[fp8] support all-gather flat tensor (#5932)
4 months ago
Gao, Ruiyuan
5fb958cc83
[FIX BUG] convert env param to int in (#5934)
4 months ago
Insu Jang
a521ffc9f8
Add n_fused as an input from native_module (#5894)
4 months ago
YeAnbang
9688e19b32
remove real data path
4 months ago
YeAnbang
b0e15d563e
remove real data path
4 months ago
YeAnbang
12fe8b5858
refactor evaluation
4 months ago
YeAnbang
c5f582f666
fix test data
4 months ago
zhurunhua
4ec17a7cdf
[FIX BUG] UnboundLocalError: cannot access local variable 'default_conversation' where it is not associated with a value (#5931)
* cannot access local variable 'default_conversation' where it is not associated with a value
set default value for 'default_conversation'
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
4 months ago
YeAnbang
150505cbb8
Merge branch 'kto' of https://github.com/hpcaitech/ColossalAI into kto
4 months ago
YeAnbang
d49550fb49
refactor tokenization
4 months ago
Tong Li
d08c99be0d
Merge branch 'main' into kto
4 months ago
Tong Li
f585d4e38e
[ColossalChat] Hotfix for ColossalChat (#5910)
* add ignore and tiny llama
* fix path issue
* run style
* fix issue
* update bash
* fix ddp issue
* add Qwen 1.5 32B
4 months ago
Edenzzzz
8cc8f645cd
[Examples] Add lazy init to OPT and GPT examples (#5924)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
4 months ago
YeAnbang
544b7a38a1
fix style, add kto data sample
4 months ago
Guangyao Zhang
62661cde22
Merge pull request #5921 from BurkeHulk/fp8_fix
[Shardformer] Fix Shardformer FP8 communication training accuracy degradation
4 months ago
YeAnbang
845ea7214e
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into kto
4 months ago
YeAnbang
09d5ffca1a
add kto
4 months ago
Hongxin Liu
e86127925a
[plugin] support all-gather overlap for hybrid parallel (#5919)
* [plugin] fixed all-gather overlap support for hybrid parallel
4 months ago