Commit Graph

40 Commits (37e35230ff4666231dd65435b5f7b2a2fcfaf9e6)

Author SHA1 Message Date
Tong Li 39e2597426
[ColossalChat] Add PP support (#6001)
* support pp training

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update rm

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* refactor

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update test case

* fix

* change to 4

* fix eval

* test

* add pp

* hotfix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update

* skip pp eval

* update all reduce

* update sft

* update ignore

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update no cache

* add eval

* remove fi

* remove debug

* remove parentheses to avoid warning

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Revert "add eval"

This reverts commit 3ab2f6fa32.

* add all reduce

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-21 10:47:39 +08:00
YeAnbang ed97d3a5d3
[Chat] fix readme (#5989)
* fix readme

* fix readme, tokenization fully tested

* fix readme, tokenization fully tested

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: root <root@notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9-0.notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9.colossal-ai.svc.cluster.local>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-12 14:55:17 +08:00
YeAnbang 0b2d55c4ab Support overall loss, update KTO logging 2024-08-02 06:51:38 +00:00
Tong Li 1aeb5e8847
[hotfix] Remove unused plan section (#5957)
* remove readme

* fix readme

* update
2024-07-31 17:47:46 +08:00
YeAnbang 66fbf2ecb7
Update README.md (#5958) 2024-07-31 17:44:09 +08:00
YeAnbang 30f4e31a33
[Chat] Fix lora (#5946)
* fix merging

* remove filepath

* fix style
2024-07-31 14:10:17 +08:00
YeAnbang de1bf08ed0 fix style 2024-07-26 10:07:15 +00:00
YeAnbang 9688e19b32 remove real data path 2024-07-22 06:13:02 +00:00
YeAnbang b0e15d563e remove real data path 2024-07-22 06:11:38 +00:00
YeAnbang 12fe8b5858 refactor evaluation 2024-07-22 05:57:39 +00:00
YeAnbang 150505cbb8 Merge branch 'kto' of https://github.com/hpcaitech/ColossalAI into kto 2024-07-19 10:11:05 +00:00
YeAnbang d49550fb49 refactor tokenization 2024-07-19 10:10:48 +00:00
Tong Li d08c99be0d
Merge branch 'main' into kto 2024-07-19 15:23:31 +08:00
Tong Li f585d4e38e
[ColossalChat] Hotfix for ColossalChat (#5910)
* add ignore and tiny llama

* fix path issue

* run style

* fix issue

* update bash

* fix ddp issue

* add Qwen 1.5 32B
2024-07-19 13:40:07 +08:00
YeAnbang 544b7a38a1 fix style, add kto data sample 2024-07-18 08:38:56 +00:00
YeAnbang 09d5ffca1a add kto 2024-07-18 07:54:11 +00:00
YeAnbang b3594d4d68 fix orpo cross entropy loss 2024-07-15 02:12:05 +00:00
YeAnbang e7a8634636 fix eval 2024-07-11 03:35:03 +00:00
pre-commit-ci[bot] 8a9721bafe [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-07-10 10:44:32 +00:00
YeAnbang d888c3787c add benchmark for sft, dpo, simpo, orpo. Add benchmarking result. Support lora with gradient checkpointing 2024-07-10 10:17:08 +00:00
YeAnbang a8af6ccb73 fix torch colossalai version 2024-06-28 03:58:29 +00:00
YeAnbang 8aad064fe7 fix style 2024-06-27 07:29:33 +00:00
YeAnbang c8d1b4a968 add orpo 2024-06-27 07:20:28 +00:00
YeAnbang f3de5a025c remove debug code 2024-06-24 05:16:29 +00:00
YeAnbang 0b2d6275c4 fix dataloader 2024-06-24 05:10:44 +00:00
YeAnbang 82aecd6374 add SimPO 2024-06-24 02:12:20 +00:00
YeAnbang 84eab13078 update sft training script 2024-06-11 02:44:20 +00:00
YeAnbang 2abdede1d7 fix readme 2024-06-10 01:08:42 +00:00
YeAnbang 0d7ff10ea5 replace the customized dataloader setup with the built-in one 2024-06-07 09:43:42 +00:00
YeAnbang 45195ac53d remove local data path 2024-06-07 07:01:31 +00:00
YeAnbang 62eb28b929 remove duplicated test 2024-06-07 07:01:31 +00:00
pre-commit-ci[bot] 1b880ce095 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-06-07 07:01:31 +00:00
YeAnbang 7ae87b3159 fix training script 2024-06-07 07:01:31 +00:00
YeAnbang 0b4a33548c update ci tests, most ci test cases passed, tp failed in generation for ppo, sp is buggy 2024-06-07 07:01:31 +00:00
YeAnbang 7e65b71815 run pre-commit 2024-06-07 07:01:30 +00:00
YeAnbang 929e1e3da4 upgrade ppo dpo rm script 2024-06-07 07:01:30 +00:00
YeAnbang 7a7e86987d upgrade colossal-chat support tp_group>1, add sp for sft 2024-06-07 07:01:30 +00:00
Hongxin Liu 7f8b16635b
[misc] refactor launch API and tensor constructor (#5666)
* [misc] remove config arg from initialize

* [misc] remove old tensor constructor

* [plugin] add npu support for ddp

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [devops] fix doc test ci

* [test] fix test launch

* [doc] update launch doc

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-04-29 10:40:11 +08:00
Hongxin Liu 641b1ee71a
[devops] remove post commit ci (#5566)
* [devops] remove post commit ci

* [misc] run pre-commit on all files

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-04-08 15:09:40 +08:00
YeAnbang df5e9c53cf
[ColossalChat] Update RLHF V2 (#5286)
* Add dpo. Fix sft, ppo, lora. Refactor all

* fix and tested ppo

* 2nd round refactor

* add ci tests

* fix ci

* fix ci

* fix readme, style

* fix readme style

* fix style, fix benchmark

* reproduce benchmark result, remove useless files

* rename to ColossalChat

* use new image

* fix ci workflow

* fix ci

* use local model/tokenizer for ci tests

* fix ci

* fix ci

* fix ci

* fix ci timeout

* fix rm progress bar. fix ci timeout

* fix ci

* fix ci typo

* remove 3d plugin from ci temporary

* test environment

* cannot save optimizer

* support chat template

* fix readme

* fix path

* test ci locally

* restore build_or_pr

* fix ci data path

* fix benchmark

* fix ci, move ci tests to 3080, disable fast tokenizer

* move ci to 85

* support flash attention 2

* add all-in-one data preparation script. Fix colossal-llama2-chat chat template

* add hardware requirements

* move ci test data

* fix save_model, add unwrap

* fix missing bos

* fix missing bos; support grad accumulation with gemini

* fix ci

* fix ci

* fix ci

* fix llama2 chat template config

* debug sft

* debug sft

* fix colossalai version requirement

* fix ci

* add sanity check to prevent NaN loss

* fix requirements

* add dummy data generation script

* add dummy data generation script

* add dummy data generation script

* add dummy data generation script

* update readme

* update readme

* update readme and ignore

* fix logger bug

* support parallel_output

* modify data preparation logic

* fix tokenization

* update lr

* fix inference

* run pre-commit

---------

Co-authored-by: Tong Li <tong.li352711588@gmail.com>
2024-03-29 14:12:29 +08:00