YuliangLiu0306 | 7fa6be49d2 | [autoparallel] test compatibility for gemini and auto parallel (#2700) | 2 years ago
CZYCW | 4ac8bfb072 | [NFC] polish colossalai/engine/gradient_handler/utils.py code style (#2708) | 2 years ago
github-actions[bot] | d701ef81b1 | Automated submodule synchronization (#2707) | 2 years ago
    Co-authored-by: github-actions <github-actions@github.com>
binmakeswell | 94f000515b | [doc] add Quick Preview (#2706) | 2 years ago
binmakeswell | 71deddc87f | [doc] resize figure (#2705) | 2 years ago
binmakeswell | 6a8cd687e3 | [doc] add ChatGPT (#2703) | 2 years ago
binmakeswell | 8408c852a6 | [app] fix ChatGPT requirements (#2704) | 2 years ago
ver217 | 1b34701027 | [app] add chatgpt application (#2698) | 2 years ago
Liu Ziming | 6427c406cf | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/strategy_generator.py code style (#2695) | 2 years ago
    Co-authored-by: shenggan <csg19971016@gmail.com>
ver217 | c3abdd085d | [release] update version (#2691) | 2 years ago
アマデウス | 534f68c83c | [NFC] polish pipeline process group code style (#2694) | 2 years ago
LuGY | 56ff1921e9 | [NFC] polish colossalai/context/moe_context.py code style (#2693) | 2 years ago
Shawn-Kong | 1712da2800 | [NFC] polish colossalai/gemini/gemini_context.py code style (#2690) | 2 years ago
binmakeswell | 46f20bac41 | [doc] update auto parallel paper link (#2686) | 2 years ago
github-actions[bot] | 88416019e7 | Automated submodule synchronization (#2648) | 2 years ago
    Co-authored-by: github-actions <github-actions@github.com>
HELSON | df4f020ee3 | [zero1&2] only append parameters with gradients (#2681) | 2 years ago
ver217 | f0aa191f51 | [gemini] fix colo_init_context (#2683) | 2 years ago
Frank Lee | 5cd8cae0c9 | [workflow] fixed communtity report ranking (#2680) | 2 years ago
Frank Lee | c44fd0c867 | [workflow] added trigger to build doc upon release (#2678) | 2 years ago
Boyuan Yao | 40c916b192 | [autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674) | 2 years ago
Frank Lee | 327bc06278 | [workflow] added doc build test (#2675) | 2 years ago
    * polish code
HELSON | 8213f89fd2 | [gemini] add fake_release_chunk for keep-gathered chunk in the inference mode (#2671) | 2 years ago
Frank Lee | 0966008839 | [dooc] fixed the sidebar itemm key (#2672) | 2 years ago
Frank Lee | 6d60634433 | [doc] added documentation sidebar translation (#2670) | 2 years ago
Frank Lee | 81ea66d25d | [release] v0.2.3 (#2669) | 2 years ago
    * polish code
binmakeswell | 9ab14b20b5 | [doc] add CVPR tutorial (#2666) | 2 years ago
binmakeswell | 85bd29817e | Update README-zh-Hans.md | 2 years ago
YuliangLiu0306 | 8de85051b3 | [Docs] layout converting management (#2665) | 2 years ago
Boyuan Yao | 0385b26ebf | [autoparallel] Patch meta information of `torch.nn.LayerNorm` (#2647) | 2 years ago
    * [autoparallel] layernorm metainfo patch
    * [autoparallel] polish test
Frank Lee | b673e5f78b | [release] v0.2.2 (#2661) | 2 years ago
Frank Lee | 94f87f9651 | [workflow] fixed gpu memory check condition (#2659) | 2 years ago
Jiatong (Julius) Han | a255a38f7f | [example] Polish README.md (#2658) | 2 years ago
    * [tutorial] polish readme.md
    * [example] Update README.md
Frank Lee | cd4f02bed8 | [doc] fixed compatiblity with docusaurus (#2657) | 2 years ago
Frank Lee | a4ae43f071 | [doc] added docusaurus-based version control (#2656) | 2 years ago
Frank Lee | 85b2303b55 | [doc] migrate the markdown files (#2652) | 2 years ago
binmakeswell | a020eecc70 | [doc] fix typo of BLOOM (#2643) | 2 years ago
YuliangLiu0306 | 37df666f38 | [autoparallel] refactor handlers which reshape input tensors (#2615) | 2 years ago
    * polish
YuliangLiu0306 | 28398f1c70 | add overlap option (#2613) | 2 years ago
YuliangLiu0306 | cb3d1bef62 | [autoparallel] adapt autoparallel tests with latest api (#2626) | 2 years ago
Frank Lee | c375563653 | [doc] removed pre-built wheel installation from readme (#2637) | 2 years ago
Fazzie-Maqianli | 292c81ed7c | fix/transformer-verison (#2581) | 2 years ago
Frank Lee | d3480396f8 | [doc] updated the sphinx theme (#2635) | 2 years ago
Boyuan Yao | 90a9fdd91d | [autoparallel] Patch meta information of `torch.matmul` (#2584) | 2 years ago
    * [autoparallel] matmul metainfo
    * [auto_parallel] remove unused print
    * [tests] skip test_matmul_handler when torch version is lower than 1.12.0
Frank Lee | 4ae02c4b1c | [tutorial] added energonai to opt inference requirements (#2625) | 2 years ago
oahzxl | 6ba8364881 | [autochunk] support diffusion for autochunk (#2621) | 2 years ago
    * add alphafold benchmark
    * renae alphafold test
    * rename tests
    * rename diffuser
    * renme
    * rename
    * update transformer
    * update benchmark
    * update bench memory
    * update transformer benchmark
    * support diffuser
    * support unet metainfo prop
    * fix bug and simplify code
    * update linear and support some op
    * optimize max region search, support conv
    * update unet test
    * support some op
    * support groupnorm and interpolate
    * update flow search
    * add fix dim in node flow
    * fix utils
    * support diffusion
    * update diffuser
    * update chunk search
    * optimize imports
    * import
    * finish autochunk
Frank Lee | 291b051171 | [doc] fixed broken badge (#2623) | 2 years ago
binmakeswell | 0556f5d468 | [tutorial] add video link (#2619) | 2 years ago
Frank Lee | 93fdd35b5e | [build] fixed the doc build process (#2618) | 2 years ago
Frank Lee | 8518263b80 | [test] fixed the triton version for testing (#2608) | 2 years ago
Frank Lee | aa7e9e4794 | [workflow] fixed the test coverage report (#2614) | 2 years ago
    * polish code