Commit Graph

354 Commits (b07a6f4e27e79e2aa7b12e1300f07eb925d22c30)

Author SHA1 Message Date
Hongxin Liu 8accecd55b [legacy] move engine to legacy (#4560)
* [legacy] move engine to legacy

* [example] fix seq parallel example

* [example] fix seq parallel example

* [test] test gemini plugin hang

* [test] test gemini plugin hang

* [test] test gemini plugin hang

* [test] test gemini plugin hang

* [test] test gemini plugin hang

* [example] update seq parallel requirements
2023-09-05 21:53:10 +08:00
Hongxin Liu 89fe027787 [legacy] move trainer to legacy (#4545)
* [legacy] move trainer to legacy

* [doc] update docs related to trainer

* [test] ignore legacy test
2023-09-05 21:53:10 +08:00
flybird11111 ec0866804c
[shardformer] update shardformer readme (#4617)
[shardformer] update shardformer readme

[shardformer] update shardformer readme
2023-09-05 13:14:41 +08:00
Hongxin Liu a39a5c66fe
Merge branch 'main' into feature/shardformer 2023-09-04 23:43:13 +08:00
flybird11111 0a94fcd351
[shardformer] update bert finetune example with HybridParallelPlugin (#4584)
* [shardformer] fix opt test hanging

* fix

* test

* test

* test

* fix test

* fix test

* remove print

* add fix

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] fix epoch change

* [shardformer] broadcast add pp group

* [shardformer] fix opt test hanging

* fix

* test

* test

* [shardformer] zero1+pp and the corresponding tests (#4517)

* pause

* finish pp+zero1

* Update test_shard_vit.py

* [shardformer/fix overlap bug] fix overlap bug, add overlap as an option in shardco… (#4516)

* fix overlap bug and support bert, add overlap as an option in shardconfig

* support overlap for chatglm and bloom

* [shardformer] fix emerged bugs after updating transformers (#4526)

* test

* fix test

* fix test

* remove print

* add fix

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] Add overlap support for gpt2 (#4535)

* add overlap support for gpt2

* remove unused code

* remove unused code

* [shardformer] support pp+tp+zero1 tests (#4531)

* [shardformer] fix opt test hanging

* fix

* test

* test

* test

* fix test

* fix test

* remove print

* add fix

* [shardformer] pp+tp+zero1

[shardformer] pp+tp+zero1

[shardformer] pp+tp+zero1

[shardformer] pp+tp+zero1

[shardformer] pp+tp+zero1

[shardformer] pp+tp+zero1

* [shardformer] pp+tp+zero1

* [shardformer] pp+tp+zero1

* [shardformer] pp+tp+zero1

* [shardformer] pp+tp+zero1

* [shardformer] fix submodule replacement bug when enabling pp (#4544)

* [shardformer] support sharded optimizer checkpointIO of HybridParallelPlugin (#4540)

* implement sharded optimizer saving

* add more param info

* finish implementation of sharded optimizer saving

* fix bugs in optimizer sharded saving

* add pp+zero test

* param group loading

* greedy loading of optimizer

* fix bug when loading

* implement optimizer sharded saving

* add optimizer test & arrange checkpointIO utils

* fix gemini sharding state_dict

* add verbose option

* add loading of master params

* fix typehint

* fix master/working mapping in fp16 amp

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] add bert finetune example

* [shardformer] fix epoch change

* [shardformer] broadcast add pp group

* rebase feature/shardformer

* update pipeline

* [shardformer] fix

* [shardformer] fix

* [shardformer] bert finetune fix

* [shardformer] add all_reduce operation to loss

add all_reduce operation to loss

* [shardformer] make compatible with pytree.

make compatible with pytree.

* [shardformer] disable tp

disable tp

* [shardformer] add 3d plugin to ci test

* [shardformer] update num_microbatches to None

* [shardformer] update microbatchsize

* [shardformer] update assert

* update scheduler

* update scheduler

---------

Co-authored-by: Jianghai <72591262+CjhHa1@users.noreply.github.com>
Co-authored-by: Bin Jia <45593998+FoolPlayer@users.noreply.github.com>
Co-authored-by: Baizhou Zhang <eddiezhang@pku.edu.cn>
2023-09-04 21:46:29 +08:00
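The squashed history above amounts to a how-to for the updated BERT finetune example: build a HybridParallelPlugin that combines pipeline parallelism, tensor parallelism and ZeRO-1 ("pp+tp+zero1"), hand it to a Booster, and boost the model before training. Below is a minimal sketch of that setup under the booster API of this period; the tp_size/pp_size/zero_stage/microbatch arguments, the launch call, and the criterion wrapper are taken on trust from the plugin's public interface rather than verified against these commits, and the GLUE dataloader and scheduler of the real example are omitted.

```python
import torch
from transformers import BertForSequenceClassification

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin
from colossalai.nn.optimizer import HybridAdam

# Run under torchrun with enough GPUs for tp_size * pp_size * dp ranks.
colossalai.launch_from_torch(config={})

# "pp+tp+zero1", as in the squashed commits above: pipeline + tensor parallel,
# with ZeRO-1 sharding optimizer states across the data-parallel group.
plugin = HybridParallelPlugin(
    tp_size=2,
    pp_size=2,
    zero_stage=1,
    microbatch_size=1,   # pipeline microbatch size (see the microbatch commits)
    precision="fp16",
)
booster = Booster(plugin=plugin)

model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
optimizer = HybridAdam(model.parameters(), lr=2e-5)


def criterion(outputs, inputs):
    # HuggingFace models return the loss on their output object (assumed here).
    return outputs.loss


# boost() wraps model/optimizer/criterion for the chosen parallel layout;
# the real example also boosts the GLUE dataloader and an lr scheduler.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)
```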
binmakeswell 8d7b02290f
[doc] add llama2 benchmark (#4604)
* [doc] add llama2 benchmark

* [doc] add llama2 benchmark
2023-09-04 13:49:33 +08:00
Tian Siyuan f1ae8c9104
[example] change accelerate version (#4431)
Co-authored-by: Siyuan Tian <siyuant@vmware.com>
Co-authored-by: Hongxin Liu <lhx0217@gmail.com>
2023-08-30 22:56:13 +08:00
ChengDaqi2023 8e2e1992b8
[example] update streamlit 0.73.1 to 1.11.1 (#4386) 2023-08-30 22:54:45 +08:00
Hongxin Liu 0b00def881
[example] add llama2 example (#4527)
* [example] transfer llama-1 example

* [example] fit llama-2

* [example] refactor scripts folder

* [example] fit new gemini plugin

* [cli] fix multinode runner

* [example] fit gemini optim checkpoint

* [example] refactor scripts

* [example] update requirements

* [example] update requirements

* [example] rename llama to llama2

* [example] update readme and pretrain script

* [example] refactor scripts
2023-08-28 17:59:11 +08:00
Hongxin Liu 27061426f7
[gemini] improve compatibility and add static placement policy (#4479)
* [gemini] remove distributed-related part from colotensor (#4379)

* [gemini] remove process group dependency

* [gemini] remove tp part from colo tensor

* [gemini] patch inplace op

* [gemini] fix param op hook and update tests

* [test] remove useless tests

* [test] remove useless tests

* [misc] fix requirements

* [test] fix model zoo

* [test] fix model zoo

* [test] fix model zoo

* [test] fix model zoo

* [test] fix model zoo

* [misc] update requirements

* [gemini] refactor gemini optimizer and gemini ddp (#4398)

* [gemini] update optimizer interface

* [gemini] renaming gemini optimizer

* [gemini] refactor gemini ddp class

* [example] update gemini related example

* [example] update gemini related example

* [plugin] fix gemini plugin args

* [test] update gemini ckpt tests

* [gemini] fix checkpoint io

* [example] fix opt example requirements

* [example] fix opt example

* [example] fix opt example

* [example] fix opt example

* [gemini] add static placement policy (#4443)

* [gemini] add static placement policy

* [gemini] fix param offload

* [test] update gemini tests

* [plugin] update gemini plugin

* [plugin] update gemini plugin docstr

* [misc] fix flash attn requirement

* [test] fix gemini checkpoint io test

* [example] update resnet example result (#4457)

* [example] update bert example result (#4458)

* [doc] update gemini doc (#4468)

* [example] update gemini related examples (#4473)

* [example] update gpt example

* [example] update dreambooth example

* [example] update vit

* [example] update opt

* [example] update palm

* [example] update vit and opt benchmark

* [hotfix] fix bert in model zoo (#4480)

* [hotfix] fix bert in model zoo

* [test] remove chatglm gemini test

* [test] remove sam gemini test

* [test] remove vit gemini test

* [hotfix] fix opt tutorial example (#4497)

* [hotfix] fix opt tutorial example

* [hotfix] fix opt tutorial example
2023-08-24 09:29:25 +08:00
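The Gemini refactor above moves process-group bookkeeping out of ColoTensor and adds a static placement policy (#4443) selectable through the plugin. A hedged sketch of how the plugin might be configured after this change follows; placement_policy="static", the precision kwarg and the HybridAdam choice are assumptions drawn from the commit titles, not from the plugin's full argument list.

```python
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin
from colossalai.nn.optimizer import HybridAdam

colossalai.launch_from_torch(config={})

# "static" keeps a fixed parameter placement instead of Gemini's runtime
# auto-placement; the kwarg name follows the #4443 commit title (assumption).
plugin = GeminiPlugin(placement_policy="static", precision="fp16")
booster = Booster(plugin=plugin)

model = nn.Sequential(nn.Linear(1024, 4096), nn.GELU(), nn.Linear(4096, 1024))
optimizer = HybridAdam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)
```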
Tian Siyuan ff836790ae
[doc] fix a typo in examples/tutorial/auto_parallel/README.md (#4430)
Co-authored-by: Siyuan Tian <siyuant@vmware.com>
2023-08-15 00:22:57 +08:00
binmakeswell 089c365fa0
[doc] add Series A Funding and NeurIPS news (#4377)
* [doc] add Series A Funding and NeurIPS news

* [kernel] fix mha kernel

* [CI] skip moe

* [CI] fix requirements
2023-08-04 17:42:07 +08:00
caption 16c0acc01b
[hotfix] update gradio 3.11 to 3.34.0 (#4329) 2023-08-01 16:25:25 +08:00
binmakeswell ef4b99ebcd add llama example CI 2023-07-26 14:12:57 +08:00
binmakeswell 7ff11b5537
[example] add llama pretraining (#4257) 2023-07-17 21:07:44 +08:00
github-actions[bot] 4e9b09c222
Automated submodule synchronization (#4217)
Co-authored-by: github-actions <github-actions@github.com>
2023-07-12 17:35:58 +08:00
digger yu 2d40759a53
fix #3852 path error (#4058) 2023-06-28 15:29:44 +08:00
Jianghai 31dc302017
[examples] copy resnet example to image (#4090)
* copy resnet example

* add pytest package

* skip test_ci

* skip test_ci

* skip test_ci
2023-06-27 16:40:46 +08:00
Baizhou Zhang 4da324cd60
[hotfix]fix argument naming in docs and examples (#4083) 2023-06-26 23:50:04 +08:00
LuGY 160c64c645
[example] fix bucket size in example of gpt gemini (#4028) 2023-06-19 11:22:42 +08:00
Baizhou Zhang b3ab7fbabf
[example] update ViT example using booster api (#3940) 2023-06-12 15:02:27 +08:00
Liu Ziming e277534a18
Merge pull request #3905 from MaruyamaAya/dreambooth
[example] Adding an example of training dreambooth with the new booster API
2023-06-09 08:44:18 +08:00
digger yu 33eef714db
fix typo examples and docs (#3932) 2023-06-08 16:09:32 +08:00
Maruyama_Aya 9b5e7ce21f modify shell for check 2023-06-08 14:56:56 +08:00
digger yu 407aa48461
fix typo examples/community/roberta (#3925) 2023-06-08 14:28:34 +08:00
Maruyama_Aya 730a092ba2 modify shell for check 2023-06-08 13:38:18 +08:00
Maruyama_Aya 49567d56d1 modify shell for check 2023-06-08 13:36:05 +08:00
Maruyama_Aya 039854b391 modify shell for check 2023-06-08 13:17:58 +08:00
Baizhou Zhang e417dd004e
[example] update opt example using booster api (#3918) 2023-06-08 11:27:05 +08:00
Maruyama_Aya cf4792c975 modify shell for check 2023-06-08 11:15:10 +08:00
Maruyama_Aya c94a33579b modify shell for check 2023-06-07 17:23:01 +08:00
Liu Ziming b306cecf28
[example] Modify palm example with the new booster API (#3913)
* Modify torch version requirement to adapt to torch 2.0

* modify palm example using new booster API

* roll back

* fix port

* polish

* polish
2023-06-07 16:05:00 +08:00
wukong1992 a55fb00c18
[booster] update bert example, using booster api (#3885) 2023-06-07 15:51:00 +08:00
Maruyama_Aya 4fc8bc68ac modify file path 2023-06-07 11:02:19 +08:00
Maruyama_Aya b4437e88c3 fixed port 2023-06-06 16:21:38 +08:00
Maruyama_Aya 79c9f776a9 fixed port 2023-06-06 16:20:45 +08:00
Maruyama_Aya d3379f0be7 fixed model saving bugs 2023-06-06 16:07:34 +08:00
Maruyama_Aya b29e1f0722 change directory 2023-06-06 15:50:03 +08:00
Maruyama_Aya 1c1f71cbd2 fixing insecure hash function 2023-06-06 14:51:11 +08:00
Maruyama_Aya b56c7f4283 update shell file 2023-06-06 14:09:27 +08:00
Maruyama_Aya 176010f289 update performance evaluation 2023-06-06 14:08:22 +08:00
Maruyama_Aya 25447d4407 modify path 2023-06-05 11:47:07 +08:00
Maruyama_Aya 60ec33bb18 Add a new example of Dreambooth training using the booster API 2023-06-02 16:50:51 +08:00
jiangmingyan 5f79008c4a
[example] update gemini examples (#3868)
* [example]update gemini examples

* [example]update gemini examples
2023-05-30 18:41:41 +08:00
digger yu 518b31c059
[docs] change placememt_policy to placement_policy (#3829)
* fix typo colossalai/autochunk auto_parallel amp

* fix typo colossalai/auto_parallel nn utils etc.

* fix typo colossalai/auto_parallel autochunk fx/passes  etc.

* fix typo docs/

* change placememt_policy to placement_policy in docs/ and examples/
2023-05-24 14:51:49 +08:00
github-actions[bot] 62c7e67f9f
[format] applied code formatting on changed files in pull request 3786 (#3787)
Co-authored-by: github-actions <github-actions@github.com>
2023-05-22 14:42:09 +08:00
binmakeswell ad2cf58f50
[chat] add performance and tutorial (#3786) 2023-05-19 18:03:56 +08:00
binmakeswell 15024e40d9
[auto] fix install cmd (#3772) 2023-05-18 13:33:01 +08:00
digger-yu b7141c36dd
[CI] fix some spelling errors (#3707)
* fix spelling error with examples/comminity/

* fix spelling error with tests/

* fix some spelling error with tests/ colossalai/ etc.
2023-05-10 17:12:03 +08:00
Hongxin Liu 3bf09efe74
[booster] update prepare dataloader method for plugin (#3706)
* [booster] add prepare dataloader method for plugin

* [booster] update examples and docstr
2023-05-08 15:44:03 +08:00
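Two adjacent entries concern the booster workflow: #3706 adds a prepare_dataloader method on the plugin, and #3694 below ports the ResNet/ViT examples onto the booster. A minimal sketch of that training pattern follows, with a toy model and dataset standing in for the real examples; TorchDDPPlugin, prepare_dataloader and booster.backward are used as the docs of this period describe them, so treat the exact signatures as assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch(config={})

plugin = TorchDDPPlugin()
booster = Booster(plugin=plugin)

model = nn.Linear(32, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 10, (256,)))

# prepare_dataloader attaches the right distributed sampler for the plugin.
dataloader = plugin.prepare_dataloader(dataset, batch_size=32, shuffle=True)

model, optimizer, criterion, dataloader, _ = booster.boost(
    model, optimizer, criterion, dataloader
)

for inputs, labels in dataloader:
    loss = criterion(model(inputs), labels)
    booster.backward(loss, optimizer)  # plugin-aware backward (AMP/ZeRO safe)
    optimizer.step()
    optimizer.zero_grad()
```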
Hongxin Liu f83ea813f5
[example] add train resnet/vit with booster example (#3694)
* [example] add train vit with booster example

* [example] update readme

* [example] add train resnet with booster example

* [example] enable ci

* [example] enable ci

* [example] add requirements

* [hotfix] fix analyzer init

* [example] update requirements
2023-05-08 10:42:30 +08:00
Hongxin Liu d556648885
[example] add finetune bert with booster example (#3693) 2023-05-06 11:53:13 +08:00
digger-yu b9a8dff7e5
[doc] Fix typo under colossalai and doc (#3618)
* Fixed several spelling errors under colossalai

* Fix the spelling error in colossalai and docs directory

* Cautiously changed the spelling errors under the example folder

* Update runtime_preparation_pass.py

revert autograft to autograd

* Update search_chunk.py

utile to until

* Update check_installation.py

change misteach to mismatch in line 91

* Update 1D_tensor_parallel.md

revert to perceptron

* Update 2D_tensor_parallel.md

revert to perceptron in line 73

* Update 2p5D_tensor_parallel.md

revert to perceptron in line 71

* Update 3D_tensor_parallel.md

revert to perceptron in line 80

* Update README.md

revert to resnet in line 42

* Update reorder_graph.py

revert to indice in line 7

* Update p2p.py

revert to megatron in line 94

* Update initialize.py

revert to torchrun in line 198

* Update routers.py

change to detailed in line 63

* Update routers.py

change to detailed in line 146

* Update README.md

revert random number in line 402
2023-04-26 11:38:43 +08:00
github-actions[bot] d544ed4345
[bot] Automated submodule synchronization (#3596)
Co-authored-by: github-actions <github-actions@github.com>
2023-04-19 10:38:12 +08:00
digger-yu d0fbd4b86f
[example] fix community doc (#3586)
Adjusted the style of Community Examples to be consistent with other titles
2023-04-18 10:37:34 +08:00
binmakeswell f1b3d60cae
[example] reorganize for community examples (#3557) 2023-04-14 16:27:48 +08:00
natalie_cao de84c0311a Polish Code 2023-04-12 18:19:46 +08:00
binmakeswell 0c0455700f
[doc] add requirement and highlight application (#3516)
* [doc] add requirement and highlight application

* [doc] link example and application
2023-04-10 17:37:16 +08:00
mandoxzhang 8f2c55f9c9
[example] remove redundant texts & update roberta (#3493)
* update roberta example

* update roberta example

* modify conflict & update roberta
2023-04-07 11:33:32 +08:00
mandoxzhang ab5fd127e3
[example] update roberta with newer ColossalAI (#3472)
* update roberta example

* update roberta example
2023-04-07 10:34:51 +08:00
NatalieC323 fb8fae6f29
Revert "[dreambooth] fixing the incompatibity in requirements.txt (#3190) (#3378)" (#3481) 2023-04-06 20:22:52 +08:00
NatalieC323 c701b77b11
[dreambooth] fixing the incompatibility in requirements.txt (#3190) (#3378)
* Update requirements.txt

* Update environment.yaml

* Update README.md

* Update environment.yaml

* Update README.md

* Update README.md

* Delete requirements_colossalai.txt

* Update requirements.txt

* Update README.md
2023-04-06 17:50:52 +08:00
Frank Lee 80eba05b0a
[test] refactor tests with spawn (#3452)
* [test] added spawn decorator

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code
2023-04-06 14:51:35 +08:00
Frank Lee 7d8d825681
[booster] fixed the torch ddp plugin with the new checkpoint api (#3442) 2023-04-06 09:43:51 +08:00
ver217 573af84184
[example] update examples related to zero/gemini (#3431)
* [zero] update legacy import

* [zero] update examples

* [example] fix opt tutorial

* [example] fix opt tutorial

* [example] fix opt tutorial

* [example] fix opt tutorial

* [example] fix import
2023-04-04 17:32:51 +08:00
ver217 26b7aac0be
[zero] reorganize zero/gemini folder structure (#3424)
* [zero] refactor low-level zero folder structure

* [zero] fix legacy zero import path

* [zero] fix legacy zero import path

* [zero] remove useless import

* [zero] refactor gemini folder structure

* [zero] refactor gemini folder structure

* [zero] refactor legacy zero import path

* [zero] refactor gemini folder structure

* [zero] refactor gemini folder structure

* [zero] refactor gemini folder structure

* [zero] refactor legacy zero import path

* [zero] fix test import path

* [zero] fix test

* [zero] fix circular import

* [zero] update import
2023-04-04 13:48:16 +08:00
Jan Roudaut dd367ce795
[doc] polish diffusion example (#3386)
* [examples/images/diffusion]: README.md: typo fixes

* Update README.md

* Grammar fixes

* Reformulated "Step 3" (xformers) introduction

to the cost => at the cost + reworded pip availability.
2023-04-01 23:09:40 +08:00
Jan Roudaut 51cd2fec57
Typofix: malformed `xformers` version (#3384)
s/0.12.0/0.0.12/
2023-03-31 23:32:44 +08:00
YuliangLiu0306 fd6add575d
[examples] polish AutoParallel readme (#3270) 2023-03-28 10:40:07 +08:00
Frank Lee 73d3e4d309
[booster] implemented the torch ddp + resnet example (#3232)
* [booster] implemented the torch ddp + resnet example

* polish code
2023-03-27 10:24:14 +08:00
NatalieC323 280fcdc485
polish code (#3194)
Co-authored-by: YuliangLiu0306 <72588413+YuliangLiu0306@users.noreply.github.com>
2023-03-24 18:44:43 +08:00
Yan Fang 189347963a
[auto] fix requirements typo for issue #3125 (#3209) 2023-03-23 10:22:08 +08:00
NatalieC323 e5f668f280
[dreambooth] fixing the incompatibility in requirements.txt (#3190)
* Update requirements.txt

* Update environment.yaml

* Update README.md

* Update environment.yaml

* Update README.md

* Update README.md

* Delete requirements_colossalai.txt

* Update requirements.txt

* Update README.md
2023-03-21 16:01:13 +08:00
Zihao 18dbe76cae
[auto-parallel] add auto-offload feature (#3154)
* add auto-offload feature

* polish code

* fix syn offload runtime pass bug

* add offload example

* fix offload testing bug

* fix example testing bug
2023-03-21 14:17:41 +08:00
NatalieC323 4e921cfbd6
[examples] Solving the diffusion incompatibility issue #3169 (#3170)
* Update requirements.txt

* Update environment.yaml

* Update README.md

* Update environment.yaml
2023-03-20 14:19:05 +08:00
binmakeswell 3c01280a56
[doc] add community contribution guide (#3153)
* [doc] update contribution guide

* [doc] update contribution guide

* [doc] add community contribution guide
2023-03-17 11:07:24 +08:00
github-actions[bot] 0aa92c0409
Automated submodule synchronization (#3105)
Co-authored-by: github-actions <github-actions@github.com>
2023-03-13 08:58:06 +08:00
binmakeswell 018936a3f3
[tutorial] update notes for TransformerEngine (#3098) 2023-03-10 16:30:52 +08:00
Kirthi Shankar Sivamani 65a4dbda6c
[NVIDIA] Add FP8 example using TE (#3080)
Signed-off-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>
2023-03-10 16:24:08 +08:00
Fazzie-Maqianli 5d5f475d75
[diffusers] fix ci and docker (#3085) 2023-03-10 10:35:15 +08:00
Camille Zhong e58a3c804c
Fix the version of lightning and colossalai in Stable Diffusion environment requirement (#3073)
1. Modify the README of stable diffusion
2. Pin the pytorch-lightning/lightning and colossalai versions so the code runs successfully.
2023-03-10 09:55:58 +08:00
binmakeswell 360674283d
[example] fix redundant note (#3065) 2023-03-09 10:59:28 +08:00
Tomek af3888481d
[example] fixed opt model downloading from huggingface 2023-03-09 10:47:41 +08:00
ramos 2ef855c798
support shardinit option to avoid OOM when initializing OPT (#3037)
Co-authored-by: poe <poe@nemoramo>
2023-03-08 13:45:15 +08:00
Ziyue Jiang 400f63012e
[pipeline] Add Simplified Alpa DP Partition (#2507)
* add alpa dp split

* add alpa dp split

* use fwd+bwd instead of fwd only

---------

Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2023-03-07 10:34:31 +08:00
binmakeswell 52a5078988
[doc] add ISC tutorial (#2997)
* [doc] add ISC tutorial

* [doc] add ISC tutorial

* [doc] add ISC tutorial

* [doc] add ISC tutorial
2023-03-06 10:36:38 +08:00
github-actions[bot] 827a0af8cc
Automated submodule synchronization (#2982)
Co-authored-by: github-actions <github-actions@github.com>
2023-03-03 10:55:45 +08:00
github-actions[bot] da056285f2
[format] applied code formatting on changed files in pull request 2922 (#2923)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-27 19:29:06 +08:00
binmakeswell 12bafe057f
[doc] update installation for GPT (#2922) 2023-02-27 18:28:34 +08:00
binmakeswell 0afb55fc5b
[doc] add os scope, update tutorial install and tips (#2914) 2023-02-27 14:59:27 +08:00
Alex_996 a4fc125c34
Fix typos (#2863)
Fix typos, `6.7 -> 6.7b`
2023-02-22 10:59:48 +08:00
dawei-wang 55424a16a5
[doc] fix GPT tutorial (#2860)
Fix hpcaitech/ColossalAI#2851
2023-02-22 10:58:52 +08:00
Zheng Zeng 597914317b
[doc] fix typo in opt inference tutorial (#2849) 2023-02-21 17:16:13 +08:00
github-actions[bot] a5721229d9
Automated submodule synchronization (#2740)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-20 17:35:46 +08:00
Haofan Wang 47ecb22387
[example] add LoRA support (#2821)
* add lora

* format
2023-02-20 16:23:12 +08:00
Jiarui Fang bf0204604f
[example] add bert and albert (#2824) 2023-02-20 10:35:55 +08:00
Fazzie-Maqianli ba84cd80b2
fix pip install colossal (#2764) 2023-02-17 09:54:21 +08:00
cloudhuang 43dffdaba5
[doc] fixed a typo in GPT readme (#2736) 2023-02-15 22:24:45 +08:00
Fazzie-Maqianli d03f4429c1
add ci (#2641) 2023-02-15 09:55:53 +08:00
github-actions[bot] d701ef81b1
Automated submodule synchronization (#2707)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-15 09:39:44 +08:00