01066152f1 | Nikita Shulga | 2023-02-17 09:22:45 +08:00 | Don't use `torch._six` (#2775)
  * Don't use `torch._six`: this is a private API which is gone after https://github.com/pytorch/pytorch/pull/94709
  * Update common.py
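Since `torch._six` was a private Python-2/3 compatibility shim, the usual fix is to swap its names for their plain standard-library equivalents. A minimal sketch of the substitution pattern (the exact names touched in `common.py` are assumptions, not taken from the diff):

```python
import math

# Before (breaks once pytorch/pytorch#94709 removes the private shim):
#   from torch._six import inf, string_classes

# After: use the standard-library equivalents directly.
inf = math.inf

def is_string(obj) -> bool:
    # On Python 3, `torch._six.string_classes` was effectively just (str,).
    return isinstance(obj, str)
```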
a88bc828d5 | ver217 | 2023-02-16 20:09:34 +08:00 | [chatgpt] disable shard init for colossalai (#2767)
d6d6dec190 | binmakeswell | 2023-02-16 20:07:25 +08:00 | [doc] update example and OPT serving link (#2769)
  * [doc] update OPT serving link
  * [doc] update example and OPT serving link
e376954305 | Frank Lee | 2023-02-16 15:45:26 +08:00 | [doc] add opt service doc (#2747)
613efebc5c | BlueRum | 2023-02-16 11:24:07 +08:00 | [chatgpt] support colossalai strategy to train rm (#2742)
  * [chatgpt] fix train_rm bug with lora
  * [chatgpt] support colossalai strategy to train rm
  * fix pre-commit
  * fix pre-commit 2
648183a960 | BlueRum | 2023-02-16 10:25:17 +08:00 | [chatgpt] fix train_rm bug with lora (#2741)
b6e3b955c3 | fastalgo | 2023-02-16 07:39:46 +08:00 | Update README.md
30aee9c45d | binmakeswell | 2023-02-15 23:21:36 +08:00 | [NFC] polish code format
1dc003c169 | YuliangLiu0306 | 2023-02-15 22:28:28 +08:00 | [autoparallel] distinguish different parallel strategies (#2699)
ae86a29e23 | YH | 2023-02-15 22:27:58 +08:00 | Refactor method of grad store (#2687)
43dffdaba5 | cloudhuang | 2023-02-15 22:24:45 +08:00 | [doc] fixed a typo in GPT readme (#2736)
93b788b95a | binmakeswell | 2023-02-15 20:23:51 +08:00 | Merge branch 'main' into fix/format
2fd528b9f4 | xyupeng | 2023-02-15 22:57:45 +08:00 | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/graph_analysis.py code style (#2737)
c9e3ee389e | Zirui Zhu | 2023-02-15 22:27:13 +08:00 | [NFC] polish colossalai/context/process_group_initializer/initializer_2d.py code style (#2726)
1819373e5c | Zangwei Zheng | 2023-02-15 22:26:13 +08:00 | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/batch_norm_handler.py code style (#2728)
8331420520 | Wangbo Zhao(黑色枷锁) | 2023-02-15 22:25:28 +08:00 | [NFC] polish colossalai/cli/cli.py code style (#2734)
5479fdd5b8 | Frank Lee | 2023-02-15 17:39:50 +08:00 | [doc] updated documentation version list (#2730)
c5be83afbf | binmakeswell | 2023-02-15 16:48:08 +08:00 | Update version.txt (#2727)
d344313533 | ziyuhuang123 | 2023-02-15 16:31:40 +08:00 | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/embedding_handler.py code style (#2725)
e81caeb4bc | Xue Fuzhao | 2023-02-15 16:12:45 +08:00 | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/cost_graph.py code style (#2720)
  Co-authored-by: Fuzhao Xue <fuzhao@login2.ls6.tacc.utexas.edu>
51c45c2460 | yuxuan-lou | 2023-02-15 16:12:24 +08:00 | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/where_handler.py code style (#2723)
7aacfad8af | CH.Li | 2023-02-15 14:54:53 +08:00 | fix typo (#2721)
9c0943ecdb | ver217 | 2023-02-15 13:59:58 +08:00 | [chatgpt] optimize generation kwargs (#2717)
  * [chatgpt] ppo trainer use default generate args
  * [chatgpt] example remove generation preparing fn
  * [chatgpt] benchmark remove generation preparing fn
  * [chatgpt] fix ci
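The idea behind #2717 is to let the trainer fall back to sensible generation defaults so that examples and benchmarks no longer need a separate "prepare generation kwargs" function. A hedged sketch of that pattern (the function name and default values here are illustrative, not the real ChatGPT trainer API):

```python
def generate_with_defaults(prompt: str, **gen_kwargs):
    # Fill defaults only for keys the caller did not pass, so call
    # sites no longer need a dedicated kwargs-preparation step.
    gen_kwargs.setdefault("max_length", 128)
    gen_kwargs.setdefault("do_sample", True)
    # A real trainer would now call model.generate(prompt, **gen_kwargs);
    # this sketch just returns the merged kwargs for inspection.
    return gen_kwargs

merged = generate_with_defaults("hello", max_length=64)
assert merged == {"max_length": 64, "do_sample": True}
```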
21d6a48f4d | YuliangLiu0306 | 2023-02-15 13:48:28 +08:00 | [autoparallel] add shard option (#2696)
  * [autoparallel] add shard option
  * polish
5b24987fa7 | YuliangLiu0306 | 2023-02-15 12:25:50 +08:00 | [autoparallel] fix parameters sharding bug (#2716)
2045d45ab7 | Frank Lee | 2023-02-15 11:24:18 +08:00 | [doc] updated documentation version list (#2715)
d4d3387f45 | binmakeswell | 2023-02-15 11:08:35 +08:00 | [doc] add open-source contribution invitation (#2714)
  * [doc] fix typo
  * [doc] add invitation
f6b4ca4e6c | ver217 | 2023-02-15 10:53:54 +08:00 | [devops] add chatgpt ci (#2713)
4603538ddd | Ziyue Jiang | 2023-02-15 10:53:38 +08:00 | [NFC] polish colossalai/context/process_group_initializer/initializer_sequence.py code style (#2712)
  Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
cb2c6a2415 | YuliangLiu0306 | 2023-02-15 10:36:19 +08:00 | [autoparallel] refactor runtime pass (#2644)
  * [autoparallel] refactor runtime pass
  * add unit test
  * polish
89f8975fb8 | Frank Lee | 2023-02-15 10:12:55 +08:00 | [workflow] fixed tensor-nvme build caching (#2711)
b3d10db5f1 | Zihao | 2023-02-15 09:57:22 +08:00 | [NFC] polish colossalai/cli/launcher/__init__.py code style (#2709)
d03f4429c1 | Fazzie-Maqianli | 2023-02-15 09:55:53 +08:00 | add ci (#2641)
0b2a738393 | YuliangLiu0306 | 2023-02-15 09:54:32 +08:00 | [autoparallel] remove deprecated codes (#2664)
7fa6be49d2 | YuliangLiu0306 | 2023-02-15 09:43:29 +08:00 | [autoparallel] test compatibility for gemini and auto parallel (#2700)
4ac8bfb072 | CZYCW | 2023-02-15 09:40:08 +08:00 | [NFC] polish colossalai/engine/gradient_handler/utils.py code style (#2708)
d701ef81b1 | github-actions[bot] | 2023-02-15 09:39:44 +08:00 | Automated submodule synchronization (#2707)
  Co-authored-by: github-actions <github-actions@github.com>
94f000515b | binmakeswell | 2023-02-14 23:07:30 +08:00 | [doc] add Quick Preview (#2706)
71deddc87f | binmakeswell | 2023-02-14 22:56:15 +08:00 | [doc] resize figure (#2705)
6a8cd687e3 | binmakeswell | 2023-02-14 22:48:30 +08:00 | [doc] add ChatGPT (#2703)
8408c852a6 | binmakeswell | 2023-02-14 22:48:15 +08:00 | [app] fix ChatGPT requirements (#2704)
1b34701027 | ver217 | 2023-02-14 22:17:25 +08:00 | [app] add chatgpt application (#2698)
6427c406cf | Liu Ziming | 2023-02-14 21:30:25 +08:00 | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/strategy_generator.py code style (#2695)
  Co-authored-by: shenggan <csg19971016@gmail.com>
c3abdd085d | ver217 | 2023-02-14 19:37:14 +08:00 | [release] update version (#2691)
534f68c83c | アマデウス | 2023-02-14 18:12:01 +08:00 | [NFC] polish pipeline process group code style (#2694)
56ff1921e9 | LuGY | 2023-02-14 18:02:45 +08:00 | [NFC] polish colossalai/context/moe_context.py code style (#2693)
1712da2800 | Shawn-Kong | 2023-02-14 11:55:23 +08:00 | [NFC] polish colossalai/gemini/gemini_context.py code style (#2690)
46f20bac41 | binmakeswell | 2023-02-13 23:05:29 +08:00 | [doc] update auto parallel paper link (#2686)
88416019e7 | github-actions[bot] | 2023-02-13 18:10:54 +08:00 | Automated submodule synchronization (#2648)
  Co-authored-by: github-actions <github-actions@github.com>
df4f020ee3 | HELSON | 2023-02-13 18:00:16 +08:00 | [zero1&2] only append parameters with gradients (#2681)
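The change in df4f020ee3 means the ZeRO-1/2 optimizer only tracks parameters that will actually receive gradients, so frozen weights never enter the sharded bookkeeping. A minimal sketch of that filtering idea (the surrounding optimizer code is an assumption, shown here with a stand-in parameter type instead of `torch.nn.Parameter`):

```python
from dataclasses import dataclass

@dataclass
class Param:
    # Stand-in for torch.nn.Parameter; only the flag matters here.
    name: str
    requires_grad: bool = True

def params_to_track(params):
    # Append only parameters that will receive gradients; frozen
    # (requires_grad=False) weights are skipped entirely.
    return [p for p in params if p.requires_grad]

params = [Param("weight"), Param("frozen_embed", requires_grad=False)]
assert [p.name for p in params_to_track(params)] == ["weight"]
```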