Commit Graph

261 Commits (41fb7236aa32c307e83b0b9cc50ce2a6da279343)

Author SHA1 Message Date
jiaruifang 27211d6267 [example] polish diffusion readme 2022-11-09 09:38:05 +08:00
binmakeswell 4ac7d3ec3b [doc] polish diffusion README (#1840) 2022-11-08 22:36:55 +08:00
Jiarui Fang f86a703bcf [NFC] update gitignore remove DS_Store (#1830) 2022-11-08 17:18:15 +08:00
Jiarui Fang a25f755331 [example] add TP to GPT example (#1828) 2022-11-08 17:17:19 +08:00
Fazzie-Maqianli 6e9730d7ab [example] add stable diffuser (#1825) 2022-11-08 16:14:45 +08:00
Jiarui Fang b1263d32ba [example] simplify the GPT2 huggingface example (#1826) 2022-11-08 16:14:07 +08:00
Jiarui Fang cd5a0d56fa [Gemini] make gemini usage simple (#1821) 2022-11-08 15:53:13 +08:00
Maruyama_Aya a7e8159da6 add ColoDiffusion codes: /ldm/module/, /ldm/data/, /scripts/test/ 2022-11-08 14:39:35 +08:00
Jiarui Fang 350ccc0481 [example] opt does not depend on Titans (#1811) 2022-11-08 12:02:20 +08:00
Jiarui Fang 203ca57aed [example] add GPT 2022-11-08 10:58:17 +08:00
Jiarui Fang fd2c8d8156 [example] add opt model in language (#1809) 2022-11-08 10:39:13 +08:00
Jiarui Fang f5a92c288c [example] add diffusion to example (#1805) 2022-11-07 17:43:36 +08:00
Jiarui Fang a19eb80998 [embedding] updates some default parameters 2022-09-15 15:45:17 +08:00
github-actions[bot] 177d3f5718 Automated submodule synchronization (#1465) 2022-08-19 13:39:21 +08:00
github-actions[bot] 9b442ecdc3 Automated submodule synchronization (#1404) 2022-08-08 11:24:58 +08:00
github-actions[bot] 1e5eb0874c Automated submodule synchronization (#1396) 2022-08-03 09:18:45 +08:00
github-actions[bot] 50dec605e1 Automated submodule synchronization (#1380) 2022-07-28 11:12:52 +08:00
github-actions[bot] fb6f085907 Automated submodule synchronization (#1372) 2022-07-27 09:25:03 +08:00
github-actions[bot] 6160a1d6a7 Automated submodule synchronization (#1348) 2022-07-21 10:50:27 +08:00
github-actions[bot] 6f2f9eb214 Automated submodule synchronization (#1305) 2022-07-14 13:40:54 +08:00
github-actions[bot] 762905da68 Automated submodule synchronization (#1241) 2022-07-12 10:32:20 +08:00
github-actions[bot] 4951f7d80c Automated submodule synchronization (#1204) 2022-07-07 15:22:45 +08:00
github-actions[bot] 23442a5bc1 Automated submodule synchronization (#1194) 2022-07-04 10:12:17 +08:00
github-actions[bot] 6f0733a1ef Automated submodule synchronization (#1159) 2022-06-29 15:11:36 +08:00
github-actions[bot] e8c34eedfd Automated submodule synchronization (#1129) 2022-06-22 14:39:08 +08:00
github-actions[bot] 85b58093d2 Automated submodule synchronization (#1105) 2022-06-14 09:53:30 +08:00
github-actions[bot] e32470b6de Automated submodule synchronization (#1049) 2022-06-01 11:04:32 +08:00
github-actions[bot] 4d8a574cd3 Automated submodule synchronization (#1034) 2022-05-27 17:12:48 +08:00
github-actions[bot] 9e3d602dba Automated submodule synchronization (#1003) 2022-05-20 17:08:44 +08:00
github-actions[bot] 46bc95708f Automated submodule synchronization (#960) 2022-05-14 21:55:34 +08:00
github-actions[bot] 7edb38193a Automated submodule synchronization (#932) 2022-05-13 10:22:51 +08:00
github-actions[bot] b61d64685f Automated submodule synchronization (#929) 2022-05-11 09:13:06 +08:00
github-actions[bot] 1cf7fb3cd9 Automated submodule synchronization (#912) 2022-05-06 10:10:56 +08:00
github-actions[bot] 3b1f5f07ce Automated submodule synchronization (#907) 2022-05-03 13:14:48 +08:00
github-actions[bot] f271f34716 Automated submodule synchronization (#827) 2022-04-22 15:24:58 +08:00
github-actions[bot] 413ce30c45 Automated submodule synchronization (#819) 2022-04-21 11:26:58 +08:00
github-actions[bot] 9aae4197bb Automated submodule synchronization (#810) 2022-04-20 13:57:12 +08:00
github-actions[bot] 6978980f6d Automated submodule synchronization (#751) 2022-04-14 15:34:01 +08:00
github-actions[bot] d878d843ad Automated submodule synchronization (#695) 2022-04-08 10:03:53 +08:00
github-actions[bot] d50cdabbc9 Automated submodule synchronization (#556) 2022-04-07 22:11:00 +08:00
github-actions[bot] 92f4224867 Automated submodule synchronization (#501) 2022-03-30 14:06:23 +08:00
github-actions[bot] 353566c198 Automated submodule synchronization (#483) 2022-03-22 09:34:26 +08:00
github-actions[bot] cfcc8271f3 [Bot] Automated submodule synchronization (#451) 2022-03-18 09:51:43 +08:00
github-actions 6098bc4cce Automated submodule synchronization 2022-03-14 00:01:12 +00:00
github-actions b9f8521f8c Automated submodule synchronization 2022-02-15 11:35:37 +08:00
github-actions[bot] 5420809f43 Automated submodule synchronization (#203) 2022-02-04 10:19:38 +08:00
Frank Lee ca4ae52d6b Set examples as submodule (#162) 2022-01-19 16:35:36 +08:00
    * remove examples folder
    * added examples as submodule
    * update .gitmodules
LuGY_mac d143396cac Added RandAugment and updated the dataloader 2022-01-18 16:14:46 +08:00
HELSON 1ff5be36c2 Added moe parallel example (#140) 2022-01-17 15:34:04 +08:00
ver217 f03bcb359b update vit example for new API (#98) (#99) 2022-01-04 20:35:33 +08:00
アマデウス 0fedef4f3c Layer integration (#83) 2021-12-27 15:04:32 +08:00
    * integrated parallel layers for ease of building models
    * integrated 2.5d layers
    * cleaned codes and unit tests
    * added log metric by step hook; updated imagenet benchmark; fixed some bugs
    * reworked initialization; cleaned codes
    Co-authored-by: BoxiangW <45734921+BoxiangW@users.noreply.github.com>
Xin Zhang 648f806315 add example of self-supervised SimCLR training - V2 (#50) 2021-12-21 08:07:18 +08:00
    * add example of self-supervised SimCLR training
    * simclr v2, replace nvidia dali dataloader
    * updated
    * sync to latest code writing style
    * sync to latest code writing style and modify README
    * detail README & standardize dataset path
Frank Lee 35813ed3c4 update examples and sphinx docs for the new api (#63) 2021-12-13 22:07:01 +08:00
Frank Lee da01c234e1 Develop/experiments (#59) 2021-12-09 15:08:29 +08:00
    * Add gradient accumulation, fix lr scheduler
    * fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
    * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
    * fixed trainer
    * Revert "fixed trainer"
      This reverts commit 2e0b0b7699.
    * improved consistency between trainer, engine and schedule (#23)
      Co-authored-by: 1SAA <c2h214748@gmail.com>
    * Split conv2d, class token, positional embedding in 2d, Fix random number in ddp
      Fix convergence in cifar10, Imagenet1000
    * Integrate 1d tensor parallel in Colossal-AI (#39)
    * fixed 1D and 2D convergence (#38)
    * optimized 2D operations
    * fixed 1D ViT convergence problem
    * Feature/ddp (#49)
    * remove redundancy func in setup (#19) (#20)
    * use env to control the language of doc (#24) (#25)
    * Support TP-compatible Torch AMP and Update trainer API (#27)
    * Add gradient accumulation, fix lr scheduler
    * fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
    * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
    * fixed trainer
    * Revert "fixed trainer"
      This reverts commit 2e0b0b7699.
    * improved consistency between trainer, engine and schedule (#23)
      Co-authored-by: 1SAA <c2h214748@gmail.com>
      Co-authored-by: 1SAA <c2h214748@gmail.com>
      Co-authored-by: ver217 <lhx0217@gmail.com>
    * add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)
    * add explanation for ViT example (#35) (#36)
    * support torch ddp
    * fix loss accumulation
    * add log for ddp
    * change seed
    * modify timing hook
      Co-authored-by: Frank Lee <somerlee.9@gmail.com>
      Co-authored-by: 1SAA <c2h214748@gmail.com>
      Co-authored-by: binmakeswell <binmakeswell@gmail.com>
    * Feature/pipeline (#40)
    * remove redundancy func in setup (#19) (#20)
    * use env to control the language of doc (#24) (#25)
    * Support TP-compatible Torch AMP and Update trainer API (#27)
    * Add gradient accumulation, fix lr scheduler
    * fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
    * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
    * fixed trainer
    * Revert "fixed trainer"
      This reverts commit 2e0b0b7699.
    * improved consistency between trainer, engine and schedule (#23)
      Co-authored-by: 1SAA <c2h214748@gmail.com>
      Co-authored-by: 1SAA <c2h214748@gmail.com>
      Co-authored-by: ver217 <lhx0217@gmail.com>
    * add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)
    * add explanation for ViT example (#35) (#36)
    * optimize communication of pipeline parallel
    * fix grad clip for pipeline
      Co-authored-by: Frank Lee <somerlee.9@gmail.com>
      Co-authored-by: 1SAA <c2h214748@gmail.com>
      Co-authored-by: binmakeswell <binmakeswell@gmail.com>
    * optimized 3d layer to fix slow computation; tested imagenet performance with 3d; reworked lr_scheduler config definition; fixed launch args; fixed some printing issues; simplified apis of 3d layers (#51)
    * Update 2.5d layer code to get a similar accuracy on imagenet-1k dataset
    * update api for better usability (#58)
    Co-authored-by: 1SAA <c2h214748@gmail.com>
    Co-authored-by: ver217 <lhx0217@gmail.com>
    Co-authored-by: puck_WCR <46049915+WANG-CR@users.noreply.github.com>
    Co-authored-by: binmakeswell <binmakeswell@gmail.com>
    Co-authored-by: アマデウス <kurisusnowdeng@users.noreply.github.com>
    Co-authored-by: BoxiangW <45734921+BoxiangW@users.noreply.github.com>
ver217 eb2f8b1f6b add how to build tfrecord dataset (#48) 2021-12-02 16:31:23 +08:00
ver217 4da256a584 add some details in vit-b16 example (#46) 2021-12-02 09:29:27 +08:00
ver217 e67dab92a9 add some details in vit-b16 example (#43) (#44) 2021-12-02 08:55:11 +08:00
binmakeswell 2528adc62f add explanation for ViT example (#35) (#36) 2021-11-29 10:25:38 +08:00
ver217 dbe62c67b8 add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29) 2021-11-18 23:45:09 +08:00
Frank Lee 3defa32aee Support TP-compatible Torch AMP and Update trainer API (#27) 2021-11-18 19:45:06 +08:00
    * Add gradient accumulation, fix lr scheduler
    * fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
    * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
    * fixed trainer
    * Revert "fixed trainer"
      This reverts commit 2e0b0b7699.
    * improved consistency between trainer, engine and schedule (#23)
      Co-authored-by: 1SAA <c2h214748@gmail.com>
    Co-authored-by: 1SAA <c2h214748@gmail.com>
    Co-authored-by: ver217 <lhx0217@gmail.com>
zbian 404ecbdcc6 Migrated project 2021-10-28 18:21:23 +02:00