Tong Li
1aeb5e8847
[hotfix] Remove unused plan section (#5957)
...
* remove readme
* fix readme
* update
2024-07-31 17:47:46 +08:00
YeAnbang
66fbf2ecb7
Update README.md (#5958)
2024-07-31 17:44:09 +08:00
YeAnbang
30f4e31a33
[Chat] Fix lora (#5946)
...
* fix merging
* remove filepath
* fix style
2024-07-31 14:10:17 +08:00
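The log above doesn't show what "fix merging" changed; for context, here is a minimal sketch of the standard LoRA merge step such fixes concern: folding the low-rank update into the base weight as W' = W + (alpha/r)·BA. The helper name is hypothetical, not the PR's code.

```python
import torch

def merge_lora_weights(base_weight: torch.Tensor,
                       lora_A: torch.Tensor,
                       lora_B: torch.Tensor,
                       alpha: float,
                       rank: int) -> torch.Tensor:
    """Fold a LoRA update into the base weight: W' = W + (alpha / r) * B @ A.

    Shapes: base_weight (out, in), lora_A (r, in), lora_B (out, r).
    Hypothetical helper illustrating the standard merge math, not the PR diff.
    """
    return base_weight + (alpha / rank) * (lora_B @ lora_A)
```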
Hongxin Liu
09c5f72595
[release] update version (#5952)
2024-07-31 10:04:50 +08:00
Hongxin Liu
060892162a
[zero] hotfix update master params (#5951)
2024-07-30 13:36:00 +08:00
Runyu Lu
bcf0181ecd
[Feat] Distrifusion Acceleration Support for Diffusion Inference (#5895)
...
* Distrifusion Support source
* comp comm overlap optimization
* sd3 benchmark
* pixart distrifusion bug fix
* sd3 bug fix and benchmark
* generation bug fix
* naming fix
* add docstring, fix counter and shape error
* add reference
* readme and requirement
2024-07-30 10:43:26 +08:00
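DistriFusion-style inference splits the latent into per-GPU patches and hides the activation exchange behind computation by reusing slightly stale activations from the previous denoising step. A rough conceptual sketch of that loop under those assumptions follows; the actual #5895 integration lives inside the inference engine and is far more involved.

```python
import torch
import torch.distributed as dist

def distrifusion_step(latent_patch, denoise_patch, stale_neighbor_acts, group=None):
    """One denoising step under patch parallelism (conceptual only).

    Each rank denoises its own latent patch, using *stale* activations from
    the previous step as boundary context, while the fresh activations are
    exchanged asynchronously and consumed on the next step.
    """
    new_patch, new_acts = denoise_patch(latent_patch, stale_neighbor_acts)
    gathered = [torch.empty_like(new_acts) for _ in range(dist.get_world_size(group))]
    work = dist.all_gather(gathered, new_acts, group=group, async_op=True)
    # ... other per-rank work proceeds here, overlapping the communication ...
    work.wait()
    return new_patch, gathered   # gathered acts become next step's stale context
```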
Hongxin Liu
7b38964e3a
[shardformer] hotfix attn mask (#5947)
2024-07-29 19:10:06 +08:00
Hongxin Liu
9664b1bc19
[shardformer] hotfix attn mask (#5945)
2024-07-29 13:58:27 +08:00
YeAnbang
c8332b9cb5
Merge pull request #5922 from hpcaitech/kto
...
[Chat] Add KTO
2024-07-29 13:27:00 +08:00
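For reference, this is the KTO objective as defined in the KTO paper (Ethayarajh et al., 2024), which this PR presumably implements; the code may organize the terms differently.

```latex
% KTO (Ethayarajh et al., 2024): reward relative to a reference policy
r_\theta(x, y) = \log \frac{\pi_\theta(y \mid x)}{\pi_{\mathrm{ref}}(y \mid x)}
\qquad
z_0 = \mathrm{KL}\!\left(\pi_\theta(y' \mid x)\,\|\,\pi_{\mathrm{ref}}(y' \mid x)\right)
% Value is a sigmoid of the reward margin, asymmetric across label types
v(x, y) =
\begin{cases}
  \lambda_D \,\sigma\!\big(\beta\,(r_\theta(x, y) - z_0)\big) & y \text{ desirable} \\
  \lambda_U \,\sigma\!\big(\beta\,(z_0 - r_\theta(x, y))\big) & y \text{ undesirable}
\end{cases}
\qquad
\mathcal{L}_{\mathrm{KTO}} = \mathbb{E}_{x, y}\big[\lambda_y - v(x, y)\big]
```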
YeAnbang
6fd9e86864
fix style
2024-07-29 01:29:18 +00:00
YeAnbang
de1bf08ed0
fix style
2024-07-26 10:07:15 +00:00
YeAnbang
8a3ff4f315
fix style
2024-07-26 09:55:15 +00:00
zhurunhua
ad35a987d3
[Feature] Add a switch to control whether the model checkpoint needs to be saved after each epoch ends (#5941)
...
* Add a switch to control whether the model checkpoint needs to be saved after each epoch ends
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-07-26 11:15:20 +08:00
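A minimal sketch of the kind of switch #5941 adds, assuming an argparse-style flag; the option name and training helpers below are hypothetical.

```python
import argparse

def train_one_epoch(model, dataloader):    # stub standing in for the real loop
    pass

def save_checkpoint(model, path):          # stub standing in for the real saver
    print(f"checkpoint written to {path}")

parser = argparse.ArgumentParser()
# Hypothetical flag name; the option added by #5941 may be spelled differently.
parser.add_argument("--save_per_epoch", action="store_true",
                    help="save a model checkpoint after each epoch ends")
args = parser.parse_args()

model, dataloader, num_epochs = None, None, 3
for epoch in range(num_epochs):
    train_one_epoch(model, dataloader)
    if args.save_per_epoch:                # the new switch gates per-epoch saving
        save_checkpoint(model, f"epoch_{epoch}.pt")
```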
Edenzzzz
2069472e96
[Hotfix] Fix ZeRO typo #5936
...
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-07-25 09:59:58 +08:00
Hongxin Liu
5fd0592767
[fp8] support all-gather flat tensor (#5932)
2024-07-24 16:55:20 +08:00
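A hedged sketch of the general fp8 all-gather pattern behind #5932: quantize the flat shard into float8_e4m3fn with a per-tensor scale, run the collective on a byte view (collectives may not accept fp8 dtypes directly), then dequantize. Not the PR's actual code.

```python
import torch
import torch.distributed as dist

def all_gather_fp8(flat: torch.Tensor, group=None) -> torch.Tensor:
    """All-gather a flat fp32 tensor via fp8 to cut communication volume."""
    world_size = dist.get_world_size(group)
    scale = flat.abs().max().clamp(min=1e-12) / 448.0   # 448 ~ e4m3 max finite value
    fp8 = (flat / scale).to(torch.float8_e4m3fn)
    out = torch.empty(world_size * flat.numel(), dtype=torch.uint8,
                      device=flat.device)
    dist.all_gather_into_tensor(out, fp8.view(torch.uint8), group=group)
    # Each rank quantized with its own scale; share them so every shard
    # dequantizes correctly.
    scales = torch.empty(world_size, dtype=torch.float32, device=flat.device)
    dist.all_gather_into_tensor(scales, scale.reshape(1).float(), group=group)
    shards = out.view(torch.float8_e4m3fn).view(world_size, flat.numel())
    return (shards.float() * scales.unsqueeze(1)).reshape(-1)
```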
Gao, Ruiyuan
5fb958cc83
[FIX BUG] convert env param to int in (#5934)
2024-07-24 10:30:40 +08:00
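The pattern behind #5934 is worth spelling out, since os.environ only ever yields strings; the variable name below is illustrative.

```python
import os

# os.environ values are always strings: "0" > 1 raises a TypeError,
# and "8" * 2 silently gives "88". Cast before any arithmetic or comparison.
# The variable name is illustrative, not necessarily the one fixed by #5934.
local_world_size = int(os.environ.get("LOCAL_WORLD_SIZE", "1"))
assert isinstance(local_world_size, int)
```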
Insu Jang
a521ffc9f8
Add n_fused as an input from native_module ( #5894 )
2024-07-23 23:15:39 +08:00
YeAnbang
9688e19b32
remove real data path
2024-07-22 06:13:02 +00:00
YeAnbang
b0e15d563e
remove real data path
2024-07-22 06:11:38 +00:00
YeAnbang
12fe8b5858
refactor evaluation
2024-07-22 05:57:39 +00:00
YeAnbang
c5f582f666
fix test data
2024-07-22 01:31:32 +00:00
zhurunhua
4ec17a7cdf
[FIX BUG] UnboundLocalError: cannot access local variable 'default_conversation' where it is not associated with a value (#5931)
...
* cannot access local variable 'default_conversation' where it is not associated with a value
set default value for 'default_conversation'
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-07-21 19:46:01 +08:00
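The failure mode in #5931 is the classic one: a variable bound only inside a conditional branch is read afterward. A hypothetical reconstruction of the fix, binding a default before the branch:

```python
def load_template(name):   # hypothetical stand-in for the real template loader
    return {"name": name}

def build_prompt(conversation_template=None):
    # The fix: bind 'default_conversation' before the branch. Previously it
    # was assigned only inside the if-block, so reading it after a skipped
    # branch raised UnboundLocalError.
    default_conversation = None
    if conversation_template is not None:
        default_conversation = load_template(conversation_template)
    if default_conversation is None:
        raise ValueError("no conversation template provided")
    return default_conversation
```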
YeAnbang
150505cbb8
Merge branch 'kto' of https://github.com/hpcaitech/ColossalAI into kto
2024-07-19 10:11:05 +00:00
YeAnbang
d49550fb49
refactor tokenization
2024-07-19 10:10:48 +00:00
Tong Li
d08c99be0d
Merge branch 'main' into kto
2024-07-19 15:23:31 +08:00
Tong Li
f585d4e38e
[ColossalChat] Hotfix for ColossalChat (#5910)
...
* add ignore and tiny llama
* fix path issue
* run style
* fix issue
* update bash
* add ignore and tiny llama
* fix path issue
* run style
* fix issue
* update bash
* fix ddp issue
* add Qwen 1.5 32B
2024-07-19 13:40:07 +08:00
Edenzzzz
8cc8f645cd
[Examples] Add lazy init to OPT and GPT examples (#5924)
...
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-07-19 10:10:08 +08:00
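Lazy init defers parameter materialization so large models can be built without allocating their full weights. A sketch of typical ColossalAI usage, with the caveat that the exact API surface here is quoted from memory and may differ across versions:

```python
from colossalai.lazy import LazyInitContext
from transformers import OPTConfig, OPTForCausalLM

with LazyInitContext():
    # Parameters are created as lazy tensors instead of real buffers, so
    # constructing a large OPT model does not allocate its full weights.
    model = OPTForCausalLM(OPTConfig())
# Materialization then happens when a booster/plugin shards the model.
```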
YeAnbang
544b7a38a1
fix style, add kto data sample
2024-07-18 08:38:56 +00:00
Guangyao Zhang
62661cde22
Merge pull request #5921 from BurkeHulk/fp8_fix
...
[Shardformer] Fix Shardformer FP8 communication training accuracy degradation
2024-07-18 16:34:38 +08:00
YeAnbang
845ea7214e
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into kto
2024-07-18 07:55:43 +00:00
YeAnbang
09d5ffca1a
add kto
2024-07-18 07:54:11 +00:00
Hongxin Liu
e86127925a
[plugin] support all-gather overlap for hybrid parallel (#5919)
...
* [plugin] fixed all-gather overlap support for hybrid parallel
2024-07-18 15:33:03 +08:00
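The generic mechanism behind all-gather overlap: launch the collective with async_op=True, run computation that does not need its result, and wait only at the point of use. The #5919 plugin schedules this inside its hooks rather than as a free function.

```python
import torch
import torch.distributed as dist

def gather_with_overlap(shard: torch.Tensor, independent_compute):
    """Overlap an all-gather with computation that does not need its result."""
    world_size = dist.get_world_size()
    out = torch.empty(world_size * shard.numel(),
                      dtype=shard.dtype, device=shard.device)
    work = dist.all_gather_into_tensor(out, shard, async_op=True)
    result = independent_compute()   # runs while NCCL moves the data
    work.wait()                      # block only when the gathered tensor is needed
    return out, result
```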
GuangyaoZhang
5b969fd831
fix shardformer fp8 communication training degradation
2024-07-18 07:16:36 +00:00
Guangyao Zhang
d0bdb51f48
Merge pull request #5899 from BurkeHulk/SP_fp8
...
[Feature] FP8 communication in ShardFormer
2024-07-18 10:46:59 +08:00
Hongxin Liu
73494de577
[release] update version (#5912)
2024-07-17 17:29:59 +08:00
GuangyaoZhang
6a20f07b80
remove all-to-all
2024-07-17 07:14:55 +00:00
GuangyaoZhang
5a310b9ee1
fix rebase
2024-07-17 03:43:23 +00:00
GuangyaoZhang
457a0de79f
shardformer fp8
2024-07-16 06:56:51 +00:00
Hongxin Liu
27a72f0de1
[misc] support torch2.3 (#5893)
...
* [misc] support torch2.3
* [devops] update compatibility ci
* [devops] update compatibility ci
* [devops] add debug
* [devops] add debug
* [devops] add debug
* [devops] add debug
* [devops] remove debug
* [devops] remove debug
2024-07-16 13:59:25 +08:00
アマデウス
530283dba0
fix object_to_tensor usage when torch>=2.3.0 (#5820)
2024-07-16 13:59:25 +08:00
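Private helpers like _object_to_tensor change signature between torch releases, which is what #5820 guards against. A version-gated wrapper in the usual style; the parameter layouts below are assumptions, not verified torch signatures.

```python
import torch
from packaging.version import Version
from torch.distributed.distributed_c10d import _object_to_tensor

def object_to_tensor(obj, device, group=None):
    # Gate on the torch version rather than trusting one signature everywhere.
    if Version(torch.__version__) >= Version("2.3.0"):
        return _object_to_tensor(obj, device, group)   # assumed newer signature
    return _object_to_tensor(obj, device)              # assumed older signature
```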
Guangyao Zhang
2e28c793ce
[compatibility] support torch 2.2 (#5875)
...
* Support Pytorch 2.2.2
* keep build_on_pr file and update .compatibility
2024-07-16 13:59:25 +08:00
Hanks
9470701110
Merge pull request #5885 from BurkeHulk/feature/fp8_comm
...
Feature/fp8 comm
2024-07-16 11:37:05 +08:00
YeAnbang
d8bf7e09a2
Merge pull request #5901 from hpcaitech/colossalchat
...
[Chat] fix eval: add in-training evaluation, fix ORPO SFT loss bug
2024-07-16 11:07:32 +08:00
Guangyao Zhang
1c961b20f3
[ShardFormer] fix qwen2 sp (#5903)
2024-07-15 13:58:06 +08:00
Stephan Kö
45c49dde96
[Auto Parallel]: Speed up intra-op plan generation by 44% (#5446)
...
* Remove unnecessary calls to deepcopy
* Build DimSpec's difference dict only once
This change considerably speeds up the construction of DimSpec objects. The difference_dict is the same for every DimSpec object, so a single copy of it is enough.
* Fix documentation of DimSpec's difference method
2024-07-15 12:05:06 +08:00
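The generic form of that optimization, for illustration: an expensive dict that is identical for every instance is built once at the class level and shared, instead of being rebuilt in every __init__. Names below are stand-ins, not the real DimSpec code.

```python
class DimSpecLike:
    _difference_dict = None   # class-level cache shared by every instance

    def __init__(self, dims):
        self.dims = dims
        # Build the (expensive, instance-independent) dict exactly once.
        if DimSpecLike._difference_dict is None:
            DimSpecLike._difference_dict = self._build_difference_dict()

    @staticmethod
    def _build_difference_dict():
        # Stand-in for the real construction; identical for all instances,
        # which is why a single shared copy suffices.
        return {("S", "R"): "all-gather", ("R", "S"): "shard", ("R", "R"): None}

    def difference(self, src, dst):
        return self._difference_dict[(src, dst)]
```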
YeAnbang
b3594d4d68
fix orpo cross entropy loss
2024-07-15 02:12:05 +00:00
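For context, the ORPO objective as defined in the ORPO paper (Hong et al., 2024); the SFT cross-entropy term is the part this fix concerns, though the repo's exact formulation may differ.

```latex
% ORPO (Hong et al., 2024): odds-ratio preference term added to the SFT loss
\mathrm{odds}_\theta(y \mid x) = \frac{P_\theta(y \mid x)}{1 - P_\theta(y \mid x)}
\qquad
\mathcal{L}_{OR} = -\log \sigma\!\left(\log \frac{\mathrm{odds}_\theta(y_w \mid x)}{\mathrm{odds}_\theta(y_l \mid x)}\right)
% Total objective: cross-entropy (SFT) term plus the weighted odds-ratio term
\mathcal{L}_{\mathrm{ORPO}} = \mathbb{E}_{(x,\,y_w,\,y_l)}\big[\mathcal{L}_{SFT} + \lambda \cdot \mathcal{L}_{OR}\big]
```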
pre-commit-ci[bot]
51f916b11d
[pre-commit.ci] auto fixes from pre-commit.com hooks
...
for more information, see https://pre-commit.ci
2024-07-12 07:33:45 +00:00
BurkeHulk
1f1b856354
Merge remote-tracking branch 'origin/feature/fp8_comm' into feature/fp8_comm
...
# Conflicts:
# colossalai/quantization/fp8.py
2024-07-12 15:29:41 +08:00
BurkeHulk
66018749f3
add fp8_communication flag in the script
2024-07-12 15:26:17 +08:00
BurkeHulk
e88190184a
support fp8 communication in pipeline parallelism
2024-07-12 15:25:25 +08:00
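The same quantize-communicate-dequantize idea applies to the pipeline's point-to-point sends: ship a byte view of the fp8 payload plus its fp32 scale and rebuild on the receiving stage. A hedged sketch, not the actual patch.

```python
import math
import torch
import torch.distributed as dist

def send_fp8(t: torch.Tensor, dst: int) -> None:
    # Per-tensor scale so values fit e4m3's ~448 max; ship scale + byte view.
    scale = t.abs().max().clamp(min=1e-12) / 448.0
    payload = (t / scale).to(torch.float8_e4m3fn).view(torch.uint8)
    dist.send(scale.reshape(1).float(), dst=dst)
    dist.send(payload, dst=dst)

def recv_fp8(shape, device, src: int) -> torch.Tensor:
    scale = torch.empty(1, dtype=torch.float32, device=device)
    dist.recv(scale, src=src)
    payload = torch.empty(math.prod(shape), dtype=torch.uint8, device=device)
    dist.recv(payload, src=src)
    # Reinterpret the bytes as fp8, upcast, and undo the scaling.
    return payload.view(torch.float8_e4m3fn).float().reshape(shape) * scale
```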