hxwang
8ae8525bdf
[moe] fix plugin
4 months ago
hxwang
0b76b57cd6
[test] add mixtral transformer test
4 months ago
hxwang
f9b6fcf81f
[test] add mixtral for sequence classification
4 months ago
Tong Li
1aeb5e8847
[hotfix] Remove unused plan section (#5957)
* remove readme
* fix readme
* update
4 months ago
YeAnbang
66fbf2ecb7
Update README.md (#5958)
4 months ago
YeAnbang
30f4e31a33
[Chat] Fix lora (#5946)
* fix merging
* remove filepath
* fix style
4 months ago
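For context on the "fix merging" item in #5946 above: merging a LoRA adapter folds the low-rank update back into the base weight, W ← W + (α/r)·B·A. The sketch below is a generic illustration of that formula, assuming plain torch.nn.Linear layers and hypothetical lora_A/lora_B tensors; it is not the ColossalChat patch itself.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def merge_lora_into_linear(linear: nn.Linear, lora_A: torch.Tensor,
                           lora_B: torch.Tensor, alpha: float, r: int) -> None:
    """Fold a LoRA update into the base weight: W <- W + (alpha / r) * B @ A.

    Shapes follow the usual LoRA convention (hypothetical, not ColossalChat's
    code): lora_A is [r, in_features], lora_B is [out_features, r].
    """
    linear.weight += (alpha / r) * (lora_B @ lora_A)

# Usage sketch
layer = nn.Linear(16, 32)
A = 0.01 * torch.randn(4, 16)   # rank-4 adapter
B = torch.zeros(32, 4)          # B is zero-initialized in standard LoRA
merge_lora_into_linear(layer, A, B, alpha=8.0, r=4)
```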
Hongxin Liu
09c5f72595
[release] update version (#5952)
4 months ago
Hongxin Liu
060892162a
[zero] hotfix update master params (#5951)
4 months ago
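Background for the "update master params" hotfix above: mixed-precision ZeRO keeps fp32 master copies of the low-precision working parameters, and the two must stay in sync (for example after a checkpoint load). A generic sketch of that sync, not the actual #5951 change:

```python
import torch

@torch.no_grad()
def sync_master_params(working_params, master_params):
    """Copy fp16/bf16 working params into their fp32 master copies.

    Parallel lists are an assumption for illustration; real ZeRO
    implementations flatten and shard these tensors per rank.
    """
    for working, master in zip(working_params, master_params):
        master.copy_(working.detach().float())
```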
Runyu Lu
bcf0181ecd
[Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895)
* Distrifusion Support source
* comp comm overlap optimization
* sd3 benchmark
* pixart distrifusion bug fix
* sd3 bug fix and benchmark
* generation bug fix
* naming fix
* add docstring, fix counter and shape error
* add reference
* readme and requirement
4 months ago
Hongxin Liu
7b38964e3a
[shardformer] hotfix attn mask (#5947)
4 months ago
Hongxin Liu
9664b1bc19
[shardformer] hotfix attn mask (#5945)
4 months ago
YeAnbang
c8332b9cb5
Merge pull request #5922 from hpcaitech/kto
[Chat] Add KTO
4 months ago
YeAnbang
6fd9e86864
fix style
4 months ago
YeAnbang
de1bf08ed0
fix style
4 months ago
YeAnbang
8a3ff4f315
fix style
4 months ago
zhurunhua
ad35a987d3
[Feature] Add a switch to control whether the model checkpoint needs to be saved after each epoch ends (#5941)
* Add a switch to control whether the model checkpoint needs to be saved after each epoch ends
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
4 months ago
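The switch in #5941 gates per-epoch checkpointing behind a command-line flag. A minimal sketch of the idea with a hypothetical flag name and stubbed training helpers (the real argument in the PR may be named differently):

```python
import argparse

def train_one_epoch() -> None:
    """Stub for one epoch of training."""

def save_checkpoint(epoch: int) -> None:
    """Stub for writing a checkpoint to disk."""
    print(f"saved ckpt_epoch_{epoch}.pt")

parser = argparse.ArgumentParser()
parser.add_argument("--save_per_epoch", action="store_true",
                    help="save a model checkpoint at the end of every epoch")
parser.add_argument("--num_epochs", type=int, default=3)
args = parser.parse_args()

for epoch in range(args.num_epochs):
    train_one_epoch()
    if args.save_per_epoch:   # the new switch; checkpointing stays off by default
        save_checkpoint(epoch)
```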
Edenzzzz
2069472e96
[Hotfix] Fix ZeRO typo #5936
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
4 months ago
Gao, Ruiyuan
5fb958cc83
[FIX BUG] convert env param to int in (#5934)
4 months ago
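On "convert env param to int": environment variables always arrive as strings, so values read from os.environ must be cast before numeric comparison or arithmetic. A generic sketch (the specific variable fixed in #5934 is not shown here):

```python
import os

# os.environ values are str; "8" > 4 raises TypeError in Python 3,
# so cast once at the read site.
world_size = int(os.environ.get("WORLD_SIZE", "1"))
local_rank = int(os.environ.get("LOCAL_RANK", "0"))

if world_size > 1:
    print(f"distributed run: rank {local_rank} of {world_size}")
```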
Insu Jang
a521ffc9f8
Add n_fused as an input from native_module (#5894)
4 months ago
YeAnbang
9688e19b32
remove real data path
4 months ago
YeAnbang
b0e15d563e
remove real data path
4 months ago
YeAnbang
12fe8b5858
refactor evaluation
4 months ago
YeAnbang
c5f582f666
fix test data
4 months ago
zhurunhua
4ec17a7cdf
[FIX BUG] UnboundLocalError: cannot access local variable 'default_conversation' where it is not associated with a value (#5931)
* fix "cannot access local variable 'default_conversation' where it is not associated with a value" by setting a default value for 'default_conversation'
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
4 months ago
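The UnboundLocalError in #5931 is the classic pattern of a variable assigned only inside a conditional branch and then read unconditionally; the fix is to bind a default before the branch. An illustrative reconstruction, not the actual ColossalChat code:

```python
def pick_conversation_template(model_name: str) -> str:
    # Buggy shape of the code:
    #     if "llama" in model_name:
    #         default_conversation = "llama-2 chat template"
    #     return default_conversation   # UnboundLocalError for other models
    #
    # Fixed: bind a default first, then override when a match is found.
    default_conversation = "generic chat template"
    if "llama" in model_name:
        default_conversation = "llama-2 chat template"
    return default_conversation

print(pick_conversation_template("qwen-1.5"))  # no UnboundLocalError
```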
YeAnbang
150505cbb8
Merge branch 'kto' of https://github.com/hpcaitech/ColossalAI into kto
4 months ago
YeAnbang
d49550fb49
refactor tokenization
4 months ago
Tong Li
d08c99be0d
Merge branch 'main' into kto
4 months ago
Tong Li
f585d4e38e
[ColossalChat] Hotfix for ColossalChat (#5910)
* add ignore and tiny llama
* fix path issue
* run style
* fix issue
* update bash
* fix ddp issue
* add Qwen 1.5 32B
4 months ago
Edenzzzz
8cc8f645cd
[Examples] Add lazy init to OPT and GPT examples (#5924)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
4 months ago
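Lazy init defers weight allocation until the plugin shards the model, so large OPT/GPT models are not fully materialized on every rank. ColossalAI exposes this as a context manager; the sketch below is a best-effort illustration and the exact import path and arguments may differ between versions, so check the example scripts touched in #5924:

```python
from colossalai.lazy import LazyInitContext
from transformers import GPT2Config, GPT2LMHeadModel

# Construct the model under the lazy context so parameters are created
# lazily instead of being allocated up front; they are materialized and
# sharded later (e.g. when the booster/plugin takes over the model).
with LazyInitContext():
    model = GPT2LMHeadModel(GPT2Config())
```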
YeAnbang
544b7a38a1
fix style, add kto data sample
4 months ago
YeAnbang
845ea7214e
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into kto
4 months ago
YeAnbang
09d5ffca1a
add kto
4 months ago
Hongxin Liu
e86127925a
[plugin] support all-gather overlap for hybrid parallel (#5919)
* [plugin] fixed all-gather overlap support for hybrid parallel
4 months ago
Hongxin Liu
73494de577
[release] update version (#5912)
4 months ago
Hongxin Liu
27a72f0de1
[misc] support torch2.3 (#5893)
* [misc] support torch2.3
* [devops] update compatibility ci
* [devops] update compatibility ci
* [devops] add debug
* [devops] add debug
* [devops] add debug
* [devops] add debug
* [devops] remove debug
* [devops] remove debug
4 months ago
アマデウス
530283dba0
fix object_to_tensor usage when torch>=2.3.0 (#5820)
4 months ago
Guangyao Zhang
2e28c793ce
[compatibility] support torch 2.2 (#5875)
* Support Pytorch 2.2.2
* keep build_on_pr file and update .compatibility
4 months ago
YeAnbang
d8bf7e09a2
Merge pull request #5901 from hpcaitech/colossalchat
[Chat] fix eval: add in-training evaluation, fix ORPO SFT loss bug
4 months ago
Guangyao Zhang
1c961b20f3
[ShardFormer] fix qwen2 sp (#5903)
4 months ago
Stephan Kö
45c49dde96
[Auto Parallel]: Speed up intra-op plan generation by 44% (#5446)
* Remove unnecessary calls to deepcopy
* Build DimSpec's difference dict only once
This change considerably speeds up the construction of DimSpec objects. The difference_dict is the same for every DimSpec object, so a single shared copy is enough.
* Fix documentation of DimSpec's difference method
4 months ago
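The 44% speedup above comes from building an identical helper table once and sharing it, instead of rebuilding (or deep-copying) it for every DimSpec instance. A generic illustration of that caching pattern with hypothetical names and contents:

```python
from functools import lru_cache

@lru_cache(maxsize=1)
def build_difference_dict() -> dict:
    """Expensive-to-build table that is identical for every spec object,
    so it is constructed exactly once and shared (contents are made up)."""
    return {("S0", "R"): 1, ("R", "S0"): 1, ("R", "R"): 0}

class ToyDimSpec:
    def __init__(self, dim: str):
        self.dim = dim
        # Before the fix, each instance rebuilt/deep-copied this table.
        self.difference_dict = build_difference_dict()

a, b = ToyDimSpec("R"), ToyDimSpec("S0")
assert a.difference_dict is b.difference_dict   # one shared copy
print(a.difference_dict[(b.dim, a.dim)])        # -> 1
```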
YeAnbang
b3594d4d68
fix orpo cross entropy loss
5 months ago
Hongxin Liu
c068ef0fa0
[zero] support all-gather overlap (#5898)
* [zero] support all-gather overlap
* [zero] add overlap all-gather flag
* [misc] fix typo
* [zero] update api
5 months ago
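The all-gather overlap feature above hides parameter-gather latency behind computation. A simplified PyTorch-level sketch of the pattern (not the ColossalAI implementation; the real version prefetches the next layer's parameters while the current layer computes, behind the new flag):

```python
import torch
import torch.distributed as dist

def gather_with_overlap(local_shard: torch.Tensor, other_work: torch.Tensor,
                        group=None):
    """Launch an async all-gather, run unrelated compute, then wait."""
    world_size = dist.get_world_size(group)
    full = torch.empty(world_size * local_shard.numel(),
                       dtype=local_shard.dtype, device=local_shard.device)
    handle = dist.all_gather_into_tensor(full, local_shard, group=group,
                                         async_op=True)
    result = other_work @ other_work.T   # overlapped with the communication
    handle.wait()                        # gathered parameters are ready here
    return full, result
```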
YeAnbang
115c4cc5a4
hotfix citation
5 months ago
YeAnbang
e7a8634636
fix eval
5 months ago
YeAnbang
dd9e1cdafe
Merge pull request #5850 from hpcaitech/rlhf_SimPO
[Chat] Rlhf support SimPO
5 months ago
pre-commit-ci[bot]
8a9721bafe
[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
5 months ago
YeAnbang
33f15203d3
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into rlhf_SimPO
5 months ago
YeAnbang
f6ef5c3609
fix style
5 months ago
YeAnbang
d888c3787c
add benchmarks for SFT, DPO, SimPO, and ORPO; add benchmarking results; support LoRA with gradient checkpointing
5 months ago
Guangyao Zhang
669849d74b
[ShardFormer] Add Ulysses Sequence Parallelism support for Command-R, Qwen2 and ChatGLM (#5897)
5 months ago
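Ulysses-style sequence parallelism (the SP variant added above) reshards activations with an all-to-all: each rank starts with a slice of the sequence and all attention heads, and ends with the full sequence for a subset of heads, so attention is computed locally. A minimal sketch of that reshard using the common [batch, seq, heads, head_dim] layout; it is not ColossalAI's ShardFormer code:

```python
import torch
import torch.distributed as dist

def seq_to_head_shard(x: torch.Tensor, group=None) -> torch.Tensor:
    """All-to-all from sequence-sharded [B, S/P, H, D] to head-sharded [B, S, H/P, D]."""
    p = dist.get_world_size(group)
    # Split the head dimension into P chunks and exchange them across ranks.
    inputs = [t.contiguous() for t in x.chunk(p, dim=2)]
    outputs = [torch.empty_like(t) for t in inputs]
    dist.all_to_all(outputs, inputs, group=group)
    # Each received chunk is a consecutive sequence slice: concat along seq.
    return torch.cat(outputs, dim=1)
```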