Commit Graph

2469 Commits (2c8ae37f61f123a305f7fe66af29140fe0f68a34)

Author SHA1 Message Date
binmakeswell d4d3387f45
[doc] add open-source contribution invitation (#2714)
* [doc] fix typo

* [doc] add invitation
2023-02-15 11:08:35 +08:00
ver217 f6b4ca4e6c
[devops] add chatgpt ci (#2713) 2023-02-15 10:53:54 +08:00
Ziyue Jiang 4603538ddd
[NFC] polish colossalai/context/process_group_initializer/initializer_sequence.py code style (#2712)
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2023-02-15 10:53:38 +08:00
YuliangLiu0306 cb2c6a2415
[autoparallel] refactor runtime pass (#2644)
* [autoparallel] refactor runtime pass

* add unit test

* polish
2023-02-15 10:36:19 +08:00
Frank Lee 89f8975fb8
[workflow] fixed tensor-nvme build caching (#2711) 2023-02-15 10:12:55 +08:00
Zihao b3d10db5f1
[NFC] polish colossalai/cli/launcher/__init__.py code style (#2709) 2023-02-15 09:57:22 +08:00
Fazzie-Maqianli d03f4429c1
add ci (#2641) 2023-02-15 09:55:53 +08:00
YuliangLiu0306 0b2a738393
[autoparallel] remove deprecated codes (#2664) 2023-02-15 09:54:32 +08:00
YuliangLiu0306 7fa6be49d2
[autoparallel] test compatibility for gemini and auto parallel (#2700) 2023-02-15 09:43:29 +08:00
CZYCW 4ac8bfb072
[NFC] polish colossalai/engine/gradient_handler/utils.py code style (#2708) 2023-02-15 09:40:08 +08:00
github-actions[bot] d701ef81b1
Automated submodule synchronization (#2707)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-15 09:39:44 +08:00
binmakeswell 94f000515b
[doc] add Quick Preview (#2706) 2023-02-14 23:07:30 +08:00
binmakeswell 71deddc87f
[doc] resize figure (#2705)
* [doc] resize figure

* [doc] resize figure
2023-02-14 22:56:15 +08:00
binmakeswell 6a8cd687e3
[doc] add ChatGPT (#2703) 2023-02-14 22:48:30 +08:00
binmakeswell 8408c852a6
[app] fix ChatGPT requirements (#2704) 2023-02-14 22:48:15 +08:00
ver217 1b34701027
[app] add chatgpt application (#2698) 2023-02-14 22:17:25 +08:00
Liu Ziming 6427c406cf
[NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/strategy_generator.py code style (#2695)
Co-authored-by: shenggan <csg19971016@gmail.com>
2023-02-14 21:30:25 +08:00
ver217 c3abdd085d
[release] update version (#2691) 2023-02-14 19:37:14 +08:00
アマデウス 534f68c83c
[NFC] polish pipeline process group code style (#2694) 2023-02-14 18:12:01 +08:00
LuGY 56ff1921e9
[NFC] polish colossalai/context/moe_context.py code style (#2693) 2023-02-14 18:02:45 +08:00
Shawn-Kong 1712da2800
[NFC] polish colossalai/gemini/gemini_context.py code style (#2690) 2023-02-14 11:55:23 +08:00
binmakeswell 46f20bac41
[doc] update auto parallel paper link (#2686)
* [doc] update auto parallel paper link

* [doc] update auto parallel paper link
2023-02-13 23:05:29 +08:00
github-actions[bot] 88416019e7
Automated submodule synchronization (#2648)
Co-authored-by: github-actions <github-actions@github.com>
2023-02-13 18:10:54 +08:00
HELSON df4f020ee3
[zero1&2] only append parameters with gradients (#2681) 2023-02-13 18:00:16 +08:00
ver217 f0aa191f51
[gemini] fix colo_init_context (#2683) 2023-02-13 17:53:15 +08:00
Frank Lee 5cd8cae0c9
[workflow] fixed community report ranking (#2680) 2023-02-13 17:04:49 +08:00
Frank Lee c44fd0c867
[workflow] added trigger to build doc upon release (#2678) 2023-02-13 16:53:26 +08:00
Boyuan Yao 40c916b192
[autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674)
* [autoparallel] softmax metainfo

* [autoparallel] softmax metainfo
2023-02-13 16:09:22 +08:00
Frank Lee 327bc06278
[workflow] added doc build test (#2675)
* [workflow] added doc build test

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code

* polish code
2023-02-13 15:55:57 +08:00
HELSON 8213f89fd2
[gemini] add fake_release_chunk for keep-gathered chunk in the inference mode (#2671) 2023-02-13 14:35:32 +08:00
Frank Lee 0966008839
[doc] fixed the sidebar item key (#2672) 2023-02-13 10:45:16 +08:00
Frank Lee 6d60634433
[doc] added documentation sidebar translation (#2670) 2023-02-13 10:10:12 +08:00
Frank Lee 81ea66d25d
[release] v0.2.3 (#2669)
* [release] v0.2.3

* polish code
2023-02-13 09:51:25 +08:00
binmakeswell 9ab14b20b5
[doc] add CVPR tutorial (#2666) 2023-02-10 20:43:34 +08:00
binmakeswell 85bd29817e
Update README-zh-Hans.md 2023-02-10 20:36:22 +08:00
YuliangLiu0306 8de85051b3
[Docs] layout converting management (#2665) 2023-02-10 18:38:32 +08:00
Boyuan Yao 0385b26ebf
[autoparallel] Patch meta information of `torch.nn.LayerNorm` (#2647)
* [autoparallel] layernorm metainfo patch

* [autoparallel] polish test
2023-02-10 14:29:24 +08:00
Frank Lee b673e5f78b
[release] v0.2.2 (#2661) 2023-02-10 11:01:24 +08:00
Frank Lee 94f87f9651
[workflow] fixed gpu memory check condition (#2659) 2023-02-10 09:59:07 +08:00
Jiatong (Julius) Han a255a38f7f
[example] Polish README.md (#2658)
* [tutorial] polish readme.md

* [example] Update README.md
2023-02-09 20:43:55 +08:00
Frank Lee cd4f02bed8
[doc] fixed compatibility with docusaurus (#2657) 2023-02-09 17:06:29 +08:00
Frank Lee a4ae43f071
[doc] added docusaurus-based version control (#2656) 2023-02-09 16:38:49 +08:00
Frank Lee 85b2303b55
[doc] migrate the markdown files (#2652) 2023-02-09 14:21:38 +08:00
binmakeswell a020eecc70
[doc] fix typo of BLOOM (#2643)
* [doc] fix typo of BLOOM
2023-02-08 17:28:29 +08:00
YuliangLiu0306 37df666f38
[autoparallel] refactor handlers which reshape input tensors (#2615)
* [autoparallel] refactor handlers which reshape input tensors

* polish
2023-02-08 15:02:49 +08:00
YuliangLiu0306 28398f1c70
add overlap option (#2613) 2023-02-08 15:02:31 +08:00
YuliangLiu0306 cb3d1bef62
[autoparallel] adapt autoparallel tests with latest api (#2626) 2023-02-08 15:02:12 +08:00
Frank Lee c375563653
[doc] removed pre-built wheel installation from readme (#2637) 2023-02-08 14:39:36 +08:00
Fazzie-Maqianli 292c81ed7c
fix/transformer-version (#2581) 2023-02-08 13:50:27 +08:00
Frank Lee d3480396f8
[doc] updated the sphinx theme (#2635) 2023-02-08 13:48:08 +08:00