Commit Graph

267 Commits (b29e1f07224298aea35aab7ee83284beac28e0d8)

Author SHA1 Message Date
github-actions[bot] da056285f2
[format] applied code formatting on changed files in pull request 2922 (#2923)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-27 19:29:06 +08:00
binmakeswell 12bafe057f
[doc] update installation for GPT (#2922) 2023-02-27 18:28:34 +08:00
binmakeswell 0afb55fc5b
[doc] add os scope, update tutorial install and tips (#2914) 2023-02-27 14:59:27 +08:00
Alex_996 a4fc125c34
Fix typos (#2863)
Fix typos, `6.7 -> 6.7b`
2023-02-22 10:59:48 +08:00
dawei-wang 55424a16a5
[doc] fix GPT tutorial (#2860)
Fix hpcaitech/ColossalAI#2851
2023-02-22 10:58:52 +08:00
Zheng Zeng 597914317b
[doc] fix typo in opt inference tutorial (#2849) 2023-02-21 17:16:13 +08:00
github-actions[bot] a5721229d9
Automated submodule synchronization (#2740)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-20 17:35:46 +08:00
Haofan Wang 47ecb22387
[example] add LoRA support (#2821)
* add lora

* format
2023-02-20 16:23:12 +08:00
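The entry above adds LoRA support to an example; a minimal NumPy sketch of the LoRA idea (a frozen weight augmented by a trainable low-rank update), with all names and shapes illustrative rather than ColossalAI's actual API:

```python
import numpy as np

# LoRA in one line: effective weight = W + (alpha / r) * (B @ A),
# where W is frozen and only the low-rank factors A, B are trained.
rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 8, 4, 2, 16

W = rng.standard_normal((d_in, d_out))   # frozen base weight
A = rng.standard_normal((r, d_in))       # trainable, rank r
B = np.zeros((d_out, r))                 # trainable, zero-init: adapter starts as a no-op

def lora_forward(x):
    # base path plus scaled low-rank adapter path
    return x @ W + (alpha / r) * (x @ A.T @ B.T)

x = rng.standard_normal((3, d_in))
y = lora_forward(x)
print(y.shape)  # (3, 4)
```

Because `B` is zero-initialized, the adapted model initially reproduces the frozen model exactly, which is the usual LoRA starting point.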
Jiarui Fang bf0204604f
[example] add bert and albert (#2824) 2023-02-20 10:35:55 +08:00
Fazzie-Maqianli ba84cd80b2
fix pip install colossal (#2764) 2023-02-17 09:54:21 +08:00
cloudhuang 43dffdaba5
[doc] fixed a typo in GPT readme (#2736) 2023-02-15 22:24:45 +08:00
Fazzie-Maqianli d03f4429c1
add ci (#2641) 2023-02-15 09:55:53 +08:00
github-actions[bot] d701ef81b1
Automated submodule synchronization (#2707)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-15 09:39:44 +08:00
github-actions[bot] 88416019e7
Automated submodule synchronization (#2648)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-13 18:10:54 +08:00
binmakeswell 9ab14b20b5
[doc] add CVPR tutorial (#2666) 2023-02-10 20:43:34 +08:00
Jiatong (Julius) Han a255a38f7f
[example] Polish README.md (#2658)
* [tutorial] polish readme.md

* [example] Update README.md
2023-02-09 20:43:55 +08:00
Fazzie-Maqianli 292c81ed7c
fix/transformer-version (#2581) 2023-02-08 13:50:27 +08:00
Frank Lee 4ae02c4b1c
[tutorial] added energonai to opt inference requirements (#2625) 2023-02-07 16:58:06 +08:00
binmakeswell 0556f5d468
[tutorial] add video link (#2619) 2023-02-07 15:14:51 +08:00
github-actions[bot] ae86be1fd2
Automated submodule synchronization (#2607)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-07 09:33:27 +08:00
binmakeswell 039b0c487b
[tutorial] polish README (#2568) 2023-02-04 17:49:52 +08:00
oahzxl 4f5ef73a43
[tutorial] update fastfold tutorial (#2565)
* update readme

* update

* update
2023-02-03 16:54:28 +08:00
Fazzie-Maqianli 79079a9d0c
Merge pull request #2561 from Fazziekey/v2
bug/fix diffusion ckpt problem
2023-02-03 15:42:49 +08:00
Fazzie cad1f50512 fix ckpt 2023-02-03 15:39:59 +08:00
YuliangLiu0306 f477a14f4a
[hotfix] fix autoparallel demo (#2533) 2023-01-31 17:42:45 +08:00
HELSON 6e0faa70e0
[gemini] add profiler in the demo (#2534) 2023-01-31 14:21:22 +08:00
Fazzie f35326881c fix README 2023-01-31 10:51:13 +08:00
HELSON 66dfcf5281
[gemini] update the gpt example (#2527) 2023-01-30 17:58:05 +08:00
LuGY ecbad93b65
[example] Add fastfold tutorial (#2528)
* add fastfold example

* pre-commit polish

* pre-commit polish readme and add empty test ci

* Add test_ci and reduce the default sequence length
2023-01-30 17:08:18 +08:00
Jiarui Fang fd8d19a6e7
[example] update lightning dependency for stable diffusion (#2522) 2023-01-29 13:52:15 +08:00
HELSON 707b11d4a0
[gemini] update ddp strict mode (#2518)
* [zero] add strict ddp mode for chunk init

* [gemini] update gpt example
2023-01-28 14:35:25 +08:00
HELSON 2d1a7dfe5f
[zero] add strict ddp mode (#2508)
* [zero] add strict ddp mode

* [polish] add comments for strict ddp mode

* [zero] fix test error
2023-01-20 14:04:38 +08:00
jiaruifang 32390cbe8f add test_ci.sh to dreambooth 2023-01-19 09:46:28 +08:00
jiaruifang 025b482dc1 [example] dreambooth example 2023-01-18 18:42:56 +08:00
jiaruifang e58cc441e2 polish code and fix dataloader bugs 2023-01-18 12:00:08 +08:00
jiaruifang a4b75b78a0 [hotfix] gpt example titans bug #2493 2023-01-18 11:37:16 +08:00
binmakeswell fcc6d61d92
[example] fix requirements (#2488) 2023-01-17 13:07:25 +08:00
Jiarui Fang 3a21485ead
[example] titans for gpt (#2484) 2023-01-16 15:55:41 +08:00
Jiarui Fang 7c31706227
[CI] add test_ci.sh for palm, opt and gpt (#2475) 2023-01-16 14:44:29 +08:00
Jiarui Fang e4c38ba367
[example] stable diffusion add roadmap (#2482) 2023-01-16 12:14:49 +08:00
ver217 f525d1f528
[example] update gpt gemini example ci test (#2477) 2023-01-13 22:37:31 +08:00
Ziyue Jiang fef5c949c3
polish pp middleware (#2476)
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2023-01-13 16:56:01 +08:00
Frank Lee 8b7495dd54
[example] integrate seq-parallel tutorial with CI (#2463) 2023-01-13 14:40:05 +08:00
ver217 8e85d2440a
[example] update vit ci script (#2469)
* [example] update vit ci script

* [example] update requirements

* [example] update requirements
2023-01-13 13:31:27 +08:00
Jiarui Fang 867c8c2d3a
[zero] low level optim supports ProcessGroup (#2464) 2023-01-13 10:05:58 +08:00
Frank Lee e6943e2d11
[example] integrate autoparallel demo with CI (#2466)
* [example] integrate autoparallel demo with CI

* polish code

* polish code

* polish code

* polish code
2023-01-12 16:26:42 +08:00
YuliangLiu0306 c20529fe78
[examples] update autoparallel tutorial demo (#2449)
* [examples] update autoparallel tutorial demo

* add test_ci.sh

* polish

* add conda yaml
2023-01-12 14:30:58 +08:00
Haofan Wang cfd1d5ee49
[example] fixed seed error in train_dreambooth_colossalai.py (#2445) 2023-01-11 16:56:15 +08:00
Frank Lee ac18a445fa
[example] updated large-batch optimizer tutorial (#2448)
* [example] updated large-batch optimizer tutorial

* polish code

* polish code
2023-01-11 16:27:31 +08:00
Frank Lee 39163417a1
[example] updated the hybrid parallel tutorial (#2444)
* [example] updated the hybrid parallel tutorial

* polish code
2023-01-11 15:17:17 +08:00
YuliangLiu0306 2731531bc2
[autoparallel] integrate device mesh initialization into autoparallelize (#2393)
* [autoparallel] integrate device mesh initialization into autoparallelize

* add megatron solution

* update gpt autoparallel examples with latest api

* adapt beta value to fit the current computation cost
2023-01-11 14:03:49 +08:00
Frank Lee a3e5496156
[example] improved the clarity of the example readme (#2427)
* [example] improved the clarity of the example readme

* polish workflow

* polish workflow

* polish workflow

* polish workflow

* polish workflow

* polish workflow
2023-01-11 10:46:32 +08:00
Frank Lee 63be79d505
[example] removed duplicated stable diffusion example (#2424) 2023-01-11 10:07:18 +08:00
ZijianYY fe0f7970a2
[examples] adding tflops to PaLM (#2365) 2023-01-10 16:18:56 +08:00
HELSON d84e747975
[hotfix] add DISTPAN argument for benchmark (#2412)
* change the benchmark config file

* change config

* revert config file

* rename distpan to distplan
2023-01-10 11:39:25 +08:00
Frank Lee 8327932d2c
[workflow] refactored the example check workflow (#2411)
* [workflow] refactored the example check workflow

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code
2023-01-10 11:26:19 +08:00
HELSON 498b5ca993
[hotfix] fix gpt gemini example (#2404)
* [hotfix] fix gpt gemini example

* [example] add new assertions
2023-01-09 15:52:17 +08:00
jiaruifang b2e0d502b8 [doc] hotfix #2377 2023-01-07 19:44:50 +08:00
Jiarui Fang 8f72b6f8fb
[hotfix] fix implement error in diffusers 2023-01-07 07:56:39 +08:00
1SAA 33f3023e19 [hotfix] fix implement error in diffusers 2023-01-06 18:37:18 +08:00
Jiarui Fang 12c8bf38d7
[Pipeline] Refine GPT PP Example 2023-01-06 18:03:45 +08:00
Ziyue Jiang ad00894f7f polish 2023-01-06 16:03:16 +08:00
Jiarui Fang 1aaeb596c6
[example] gpt, shard init on all processes (#2366) 2023-01-06 15:44:50 +08:00
Ziyue Jiang 3a15b20421 Move GPT PP Example 2023-01-06 14:48:58 +08:00
HELSON 48d33b1b17
[gemini] add get static torch model (#2356) 2023-01-06 13:41:19 +08:00
Fazzie-Maqianli 7a332b1734
Merge pull request #2338 from haofanwang/patch-1
Fix a typo in train_dreambooth_colossalai.py
2023-01-06 11:50:18 +08:00
YuliangLiu0306 8b1e0dfd80
[example] upload auto parallel gpt2 demo (#2354) 2023-01-06 11:38:38 +08:00
Jiarui Fang 00a9c781fd
[example] add google doc for benchmark results of GPT (#2355) 2023-01-06 11:38:15 +08:00
Jiarui Fang 509a87f3ff
[example] make gpt example directory more clear (#2353) 2023-01-06 11:11:26 +08:00
Ikko Eltociear Ashimine 5e4bced0a3
[NFC] Update roberta/README.md (#2350) 2023-01-06 10:09:14 +08:00
Jiarui Fang 35e22be2f6
[example] simplify opt example (#2344) 2023-01-06 10:08:41 +08:00
ziyuhuang123 7080a8edb0
[workflow]New version: Create workflow files for examples' auto check (#2298)
* [workflows]bug_repair

* [workflow]new_pr_fixing_bugs

Co-authored-by: binmakeswell <binmakeswell@gmail.com>
2023-01-06 09:26:49 +08:00
binmakeswell d7352bef2c
[example] add example requirement (#2345) 2023-01-06 09:03:29 +08:00
Haofan Wang 7ce965c7cc
Update requirement_colossalai.txt (#2348) 2023-01-05 21:16:42 +08:00
ZijianYY f7fd592bf4
[examples]adding tp to PaLM (#2319) 2023-01-05 17:57:50 +08:00
Haofan Wang 9edd0aa75e
Update train_dreambooth_colossalai.py
accelerator.num_processes -> gpc.get_world_size(ParallelMode.DATA)
2023-01-05 15:49:57 +08:00
Fazzie-Maqianli 89f26331e9
[example] diffusion update diffusion, Dreambooth (#2329) 2023-01-05 11:23:26 +08:00
binmakeswell e512ca9c24
[doc] update stable diffusion link (#2322)
* [doc] update link
2023-01-04 19:38:06 +08:00
Fazzie-Maqianli a9b27b9265
[example] fix dreambooth format (#2315) 2023-01-04 16:20:00 +08:00
Jiarui Fang 32253315b4
[example] update diffusion readme with official lightning (#2304) 2023-01-04 13:13:38 +08:00
HELSON e00cedd181
[example] update gemini benchmark bash (#2306) 2023-01-04 11:59:26 +08:00
binmakeswell c8144223b8
[doc] update diffusion doc (#2296) 2023-01-03 21:27:44 +08:00
ZijianYY df1d6dc553
[examples] using args and combining two versions for PaLM (#2284) 2023-01-03 17:49:00 +08:00
Ziyue Jiang ac863a01d6
[example] add benchmark (#2276)
* add benchmark

* merge common func

* add total and avg tflops

Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2023-01-03 17:20:59 +08:00
BlueRum 1405b4381e
[example] fix save_load bug for dreambooth (#2280) 2023-01-03 17:13:29 +08:00
Jiarui Fang 879df8b943
[example] GPT polish readme (#2274) 2023-01-03 15:46:52 +08:00
Ziyue Jiang 9654df0e9a
Add GPT PP Example (#2272)
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2023-01-03 15:17:26 +08:00
YuliangLiu0306 4b29112ab2
[autoparallel] gpt2 autoparallel examples (#2267)
* [autoparallel] gpt2 autoparallel examples

* polish code

* polish code
2023-01-03 14:23:33 +08:00
HELSON 09c0102fe6
[example] fix gpt example with 0.1.10 (#2265) 2023-01-03 13:38:14 +08:00
Fazzie-Maqianli 89f048a88a
[example] clear diffuser image (#2262) 2023-01-03 10:57:02 +08:00
Frank Lee 89542ceb44
[doc] updated the stable diffusion on docker usage (#2244)
* [doc] updated the stable diffusion on docker usage

* polish doc
2022-12-30 18:00:20 +08:00
Jiarui Fang 50cdf5430e
[example] diffusion install from docker (#2239)
* [builder] builder for scaled_upper_triang_masked_softmax

* add missing files

* fix a bug

* polish code

* [example] diffusion install from docker
2022-12-30 16:25:24 +08:00
Jiarui Fang db4cbdc7fb
[builder] builder for scaled_upper_triang_masked_softmax (#2234) 2022-12-30 09:58:00 +08:00
HELSON 31fe84237b
[example] fix benchmark.sh for gpt example (#2229) 2022-12-29 23:00:14 +08:00
Jiarui Fang 2cdecc9f38
[example] make palm + GeminiDDP work (#2227) 2022-12-29 14:28:31 +08:00
ZijianYY 63cc77173b
[example] Palm adding gemini, still has bugs (#2221) 2022-12-29 14:01:09 +08:00
HELSON 7010e18134
[example] update gpt example (#2225) 2022-12-29 12:01:45 +08:00
Jiarui Fang 49c601da21
[example] add benchmark.sh for gpt (#2226) 2022-12-29 12:00:00 +08:00
HELSON 3629e611cd
[example] update gpt benchmark (#2219) 2022-12-29 10:51:42 +08:00
ZijianYY 92de90dfb3
[examples] replace einsum with matmul (#2210) 2022-12-28 19:03:06 +08:00
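The last entry replaces einsum with matmul in the examples; a minimal sketch of why the two are interchangeable for a plain matrix product (the PaLM example's actual tensor shapes are not shown here, and `matmul` typically dispatches to faster BLAS paths):

```python
import numpy as np

# For a 2-D contraction over the shared index j, einsum and matmul agree.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
B = rng.standard_normal((5, 2))

via_einsum = np.einsum("ij,jk->ik", A, B)  # explicit index contraction
via_matmul = A @ B                          # same product via matmul

print(np.allclose(via_einsum, via_matmul))  # True
```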