Commit Graph

1817 Commits (867c8c2d3a90bbf55a5bedba80a3aeabe0299d0f)

Author SHA1 Message Date
Shawn-Kong d42aecdda1 [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/embedding_handler.py code style (#2368) 2023-01-06 15:47:10 +08:00
Jiarui Fang 1aaeb596c6 [example] gpt, shard init on all processes (#2366) 2023-01-06 15:44:50 +08:00
oahzxl f4a1607e56 seperate input node dim search 2023-01-06 15:36:17 +08:00
binmakeswell 1f8ab6f1f5 [NFC] polish code format (#2367) 2023-01-06 15:34:48 +08:00
oahzxl ae27a8b26d seperate flow tracer 2023-01-06 14:57:33 +08:00
Ziyue Jiang 3a15b20421 Move GPT PP Example 2023-01-06 14:48:58 +08:00
oahzxl fd87d78a28 rename ambiguous variable 2023-01-06 14:28:04 +08:00
oahzxl 2bde9d2b7f code format 2023-01-06 14:21:49 +08:00
oahzxl 8a634af2f5 close mem and code print 2023-01-06 14:19:45 +08:00
oahzxl 1a6d2a740b take apart chunk code gen 2023-01-06 14:14:45 +08:00
ExtremeViscent ac0d30fe2e [NFC] polish batch_norm_handler.py code style (#2359) 2023-01-06 13:41:38 +08:00
HELSON 48d33b1b17 [gemini] add get static torch model (#2356) 2023-01-06 13:41:19 +08:00
Fazzie-Maqianli 7a332b1734 Merge pull request #2338 from haofanwang/patch-1 2023-01-06 11:50:18 +08:00
    Fix a typo in train_dreambooth_colossalai.py
oahzxl d1f0773182 rename 2023-01-06 11:48:33 +08:00
oahzxl 06a5355d98 update test 2023-01-06 11:44:01 +08:00
oahzxl efb1c64c30 restruct dir 2023-01-06 11:39:26 +08:00
YuliangLiu0306 8b1e0dfd80 [example] upload auto parallel gpt2 demo (#2354) 2023-01-06 11:38:38 +08:00
Jiarui Fang 00a9c781fd [example] add google doc for benchmark results of GPT (#2355) 2023-01-06 11:38:15 +08:00
Jiarui Fang 509a87f3ff [example] make gpt example directory more clear (#2353) 2023-01-06 11:11:26 +08:00
oahzxl 27ab524096 refactor structure 2023-01-06 11:07:57 +08:00
Arsmart1 7027540d3d [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/_utils.py code style (#2352) 2023-01-06 10:09:34 +08:00
Ikko Eltociear Ashimine 5e4bced0a3 [NFC] Update roberta/README.md (#2350) 2023-01-06 10:09:14 +08:00
Jiarui Fang 35e22be2f6 [example] simplify opt example (#2344) 2023-01-06 10:08:41 +08:00
ziyuhuang123 7080a8edb0 [workflow]New version: Create workflow files for examples' auto check (#2298) 2023-01-06 09:26:49 +08:00
    * [workflows]bug_repair
    * [workflow]new_pr_fixing_bugs
    Co-authored-by: binmakeswell <binmakeswell@gmail.com>
binmakeswell d7352bef2c [example] add example requirement (#2345) 2023-01-06 09:03:29 +08:00
LuGY e11a005c02 [NFC] polish colossalai/auto_parallel/tensor_shard/utils/factory.py code style (#2349) 2023-01-05 21:17:42 +08:00
Haofan Wang 7ce965c7cc Update requirement_colossalai.txt (#2348) 2023-01-05 21:16:42 +08:00
ZijianYY f7fd592bf4 [examples]adding tp to PaLM (#2319) 2023-01-05 17:57:50 +08:00
oahzxl 71e72c4890 last version of benchmark 2023-01-05 17:54:25 +08:00
YuliangLiu0306 b5a3a4a65f [device] find best logical mesh 2023-01-05 17:21:29 +08:00
yuxuan-lou 28e2d16794 [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/graph_analysis.py code style (#2340) 2023-01-05 16:53:24 +08:00
YuliangLiu0306 9c9246c0d9 [device] alpha beta profiler (#2311) 2023-01-05 16:39:55 +08:00
    * [device] alpha beta profiler
    * add usage
    * fix variable name
Maruyama_Aya bd12a49e2a [NFC] polish <colossalai/auto_parallel/tensor_shard/deprecated/constants.py> code style (#2339) 2023-01-05 16:20:54 +08:00
Haofan Wang 9edd0aa75e Update train_dreambooth_colossalai.py 2023-01-05 15:49:57 +08:00
    accelerator.num_processes -> gpc.get_world_size(ParallelMode.DATA)
Frank Lee f1bc2418c4 [setup] make cuda extension build optional (#2336) 2023-01-05 15:13:11 +08:00
    * [setup] make cuda extension build optional
    * polish code
    * polish code
    * polish code
Frank Lee 8711310cda [setup] remove torch dependency (#2333) 2023-01-05 13:53:28 +08:00
Zihao 35427bcab4 [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/unary_elementwise_handler.py code style (#2326) 2023-01-05 12:18:08 +08:00
oahzxl 55cb713f36 update min memory stratege, reduce mem usage by 30% 2023-01-05 11:29:22 +08:00
Fazzie-Maqianli 89f26331e9 [example] diffusion update diffusion,Dreamblooth (#2329) 2023-01-05 11:23:26 +08:00
Frank Lee 6e34cc0830 [workflow] fixed pypi release workflow error (#2328) 2023-01-05 10:52:43 +08:00
Frank Lee 2916eed34a [workflow] fixed pypi release workflow error (#2327) 2023-01-05 10:48:38 +08:00
Frank Lee 8d8dec09ba [workflow] added workflow to release to pypi upon version change (#2320) 2023-01-05 10:40:18 +08:00
    * [workflow] added workflow to release to pypi upon version change
    * polish code
    * polish code
    * polish code
Frank Lee 693ef121a1 [workflow] removed unused assign reviewer workflow (#2318) 2023-01-05 10:40:07 +08:00
binmakeswell e512ca9c24 [doc] update stable diffusion link (#2322) 2023-01-04 19:38:06 +08:00
    * [doc] update link
Frank Lee e8dfa2e2e0 [workflow] rebuild cuda kernels when kernel-related files change (#2317) 2023-01-04 17:23:59 +08:00
Jiarui Fang db6eea3583 [builder] reconfig op_builder for pypi install (#2314) 2023-01-04 16:32:32 +08:00
Fazzie-Maqianli a9b27b9265 [exmaple] fix dreamblooth format (#2315) 2023-01-04 16:20:00 +08:00
Sze-qq da1c47f060 update ColossalAI logo (#2316) 2023-01-04 15:41:53 +08:00
    Co-authored-by: siqi <siqi@siqis-MacBook-Pro.local>
Junming Wu 4a79c10750 [NFC] polish colossalai/cli/benchmark/__init__.py code style (#2308) 2023-01-04 15:09:57 +08:00
Ofey Chan 87d2defda6 [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/layer_norm_handler.py code style (#2305) 2023-01-04 15:09:57 +08:00