jiangmingyan
a64df3fa97
[doc] update gemini instruction document ( #3842 )
...
* [doc] update meet_gemini.md
* [doc] update meet_gemini.md
* [doc] fix parentheses
* [doc] fix parentheses
* [doc] fix doc test
* [doc] fix doc test
* [doc] fix doc
2023-05-25 14:58:01 +08:00
Frank Lee
54e97ed7ea
[workflow] supported testing on CUDA 10.2 ( #3841 )
2023-05-25 14:14:34 +08:00
wukong1992
3229f93e30
[booster] add warning for torch fsdp plugin doc ( #3833 )
2023-05-25 14:00:02 +08:00
Hongxin Liu
7c9f2ed6dd
[dtensor] polish sharding spec docstring ( #3838 )
...
* [dtensor] polish sharding spec docstring
* [dtensor] polish sharding spec example docstring
2023-05-25 13:09:42 +08:00
Frank Lee
84500b7799
[workflow] fixed testmon cache in build CI ( #3806 )
...
* [workflow] fixed testmon cache in build CI
* polish code
2023-05-24 14:59:40 +08:00
digger yu
518b31c059
[docs] change placememt_policy to placement_policy ( #3829 )
...
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
* fix typo colossalai/auto_parallel autochunk fx/passes etc.
* fix typo docs/
* change placememt_policy to placement_policy in docs/ and examples/
2023-05-24 14:51:49 +08:00
digger yu
e90fdb1000
fix typo docs/
2023-05-24 13:57:43 +08:00
Yuanchen
34966378e8
[evaluation] add automatic evaluation pipeline ( #3821 )
...
* add functions for gpt evaluation
* add automatic eval
Update eval.py
* using jload and modify the type of answers1 and answers2
* Update eval.py
Update eval.py
* Update evaluator.py
* support gpt evaluation
* update readme.md
update README.md
update README.md
modify readme.md
* add Chinese example for config, battle prompt and evaluation prompt file
* remove GPT-4 config
* remove sample folder
---------
Co-authored-by: Yuanchen Xu <yuanchen.xu00@gmail.com>
Co-authored-by: Camille Zhong <44392324+Camille7777@users.noreply.github.com>
2023-05-24 11:18:23 +08:00
Frank Lee
05b8a8de58
[workflow] changed doc build to be on schedule and release ( #3825 )
...
* [workflow] changed doc build to be on schedule and release
* polish code
2023-05-24 10:50:19 +08:00
Yanming W
269150b6f4
[Docker] Fix a couple of build issues ( #3691 )
2023-05-24 10:22:51 +08:00
digger yu
7f8203af69
fix typo colossalai/auto_parallel autochunk fx/passes etc. ( #3808 )
2023-05-24 09:01:50 +08:00
jiangmingyan
725365f297
Merge pull request #3810 from jiangmingyan/amp
...
[doc] update amp document
2023-05-23 18:58:16 +08:00
jiangmingyan
278fcbc444
[doc] fix
2023-05-23 17:53:11 +08:00
jiangmingyan
8aa1fb2c7f
[doc] fix
2023-05-23 17:50:30 +08:00
Frank Lee
1e3b64f26c
[workflow] enabled doc build from a forked repo ( #3815 )
2023-05-23 17:49:53 +08:00
Hongxin Liu
19d153057e
[doc] add warning about fsdp plugin ( #3813 )
2023-05-23 17:16:10 +08:00
wukong1992
6b305a99d6
[booster] torch fsdp fix ckpt ( #3788 )
2023-05-23 16:58:45 +08:00
jiangmingyan
c425a69d52
[doc] add removed change of config.py
2023-05-23 16:42:36 +08:00
jiangmingyan
75272ef37b
[doc] add removed warning
2023-05-23 16:34:30 +08:00
Mingyan Jiang
a520610bd9
[doc] update amp document
2023-05-23 16:20:29 +08:00
Mingyan Jiang
1167bf5b10
[doc] update amp document
2023-05-23 16:20:17 +08:00
Mingyan Jiang
8c62e50dbb
[doc] update amp document
2023-05-23 16:20:01 +08:00
digger yu
9265f2d4d7
[NFC] fix typo colossalai/auto_parallel nn utils etc. ( #3779 )
...
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
2023-05-23 15:28:20 +08:00
jiangmingyan
e871e342b3
[API] add docstrings and initialization to apex amp, naive amp ( #3783 )
...
* [mixed_precision] add naive amp demo
* [mixed_precision] add naive amp demo
* [api] add docstrings and initialization to apex amp, naive amp
* [api] add docstring to apex amp/ naive amp
* [api] add docstring to apex amp/ naive amp
* [api] add docstring to apex amp/ naive amp
* [api] add docstring to apex amp/ naive amp
* [api] add docstring to apex amp/ naive amp
* [api] add docstring to apex amp/ naive amp
* [api] fix
* [api] fix
2023-05-23 15:17:24 +08:00
Frank Lee
615e2e5fc1
[test] fixed lazy init test import error ( #3799 )
2023-05-23 11:57:15 +08:00
Frank Lee
ad93c736ea
[workflow] enable testing for develop & feature branch ( #3801 )
2023-05-23 11:21:15 +08:00
jiangmingyan
ef02d7ef6d
[doc] update gradient accumulation ( #3771 )
...
* [doc] update gradient accumulation
* [doc] update gradient accumulation
* [doc] update gradient accumulation
* [doc] update gradient accumulation
* [doc] update gradient accumulation, fix
* [doc] update gradient accumulation, fix
* [doc] update gradient accumulation, fix
* [doc] update gradient accumulation, add sidebars
* [doc] update gradient accumulation, fix
* [doc] update gradient accumulation, fix
* [doc] update gradient accumulation, fix
* [doc] update gradient accumulation, resolve comments
* [doc] update gradient accumulation, resolve comments
* fix
2023-05-23 10:52:30 +08:00
Frank Lee
f5c425c898
fixed the example docstring for booster ( #3795 )
2023-05-22 18:10:06 +08:00
Frank Lee
788e07dbc5
[workflow] fixed the docker build workflow ( #3794 )
...
* [workflow] fixed the docker build workflow
* polish code
2023-05-22 16:30:32 +08:00
liuzeming
4d29c0f8e0
Fix/docker action ( #3266 )
...
* [docker] Add ARG VERSION to determine the Tag
* [workflow] fixed the version in the release docker workflow
---------
Co-authored-by: liuzeming <liuzeming@4paradigm.com>
2023-05-22 15:04:00 +08:00
github-actions[bot]
62c7e67f9f
[format] applied code formatting on changed files in pull request 3786 ( #3787 )
...
Co-authored-by: github-actions <github-actions@github.com>
2023-05-22 14:42:09 +08:00
jiangmingyan
fe1561a884
[doc] update gradient clipping document ( #3778 )
...
* [doc] update gradient clipping document
* [doc] update gradient clipping document
* [doc] update gradient clipping document
* [doc] update gradient clipping document
* [doc] update gradient clipping document
* [doc] update gradient clipping document
* [doc] update gradient clipping doc, fix sidebars.json
* [doc] update gradient clipping doc, fix doc test
2023-05-22 14:13:15 +08:00
Yanjia0
d9393b85f1
[doc] add deprecated warning on doc Basics section ( #3754 )
...
* Update colotensor_concept.md
* Update configure_parallelization.md
* Update define_your_config.md
* Update engine_trainer.md
* Update initialize_features.md
* Update model_checkpoint.md
* Update colotensor_concept.md
* Update configure_parallelization.md
* Update define_your_config.md
* Update engine_trainer.md
* Update initialize_features.md
* Update model_checkpoint.md
2023-05-22 11:12:53 +08:00
Hongxin Liu
72688adb2f
[doc] add booster docstring and fix autodoc ( #3789 )
...
* [doc] add docstr for booster methods
* [doc] fix autodoc
2023-05-22 10:56:47 +08:00
Hongxin Liu
3c07a2846e
[plugin] a workaround for zero plugins' optimizer checkpoint ( #3780 )
...
* [test] refactor torch ddp checkpoint test
* [plugin] update low level zero optim checkpoint
* [plugin] update gemini optim checkpoint
2023-05-19 19:42:31 +08:00
Hongxin Liu
60e6a154bc
[doc] add tutorial for booster checkpoint ( #3785 )
...
* [doc] add checkpoint related docstr for booster
* [doc] add en checkpoint doc
* [doc] add zh checkpoint doc
* [doc] add booster checkpoint doc in sidebar
* [doc] add caution about ckpt for plugins
* [doc] add doctest placeholder
* [doc] add doctest placeholder
* [doc] add doctest placeholder
2023-05-19 18:05:08 +08:00
binmakeswell
ad2cf58f50
[chat] add performance and tutorial ( #3786 )
2023-05-19 18:03:56 +08:00
Hongxin Liu
b4788d63ed
[devops] fix doc test on pr ( #3782 )
2023-05-19 16:28:57 +08:00
digger yu
32f81f14d4
[NFC] fix typo colossalai/amp auto_parallel autochunk ( #3756 )
2023-05-19 13:50:00 +08:00
Hongxin Liu
21e29e2212
[doc] add tutorial for booster plugins ( #3758 )
...
* [doc] add en booster plugins doc
* [doc] add booster plugins doc in sidebar
* [doc] add zh booster plugins doc
* [doc] fix zh booster plugin translation
* [doc] reorganize tutorials order of basic section
* [devops] force sync to test ci
2023-05-19 12:12:42 +08:00
Hongxin Liu
5ce6c9d86f
[doc] add tutorial for cluster utils ( #3763 )
...
* [doc] add en cluster utils doc
* [doc] add zh cluster utils doc
* [doc] add cluster utils doc in sidebar
2023-05-19 12:12:20 +08:00
Hongxin Liu
5452df63c5
[plugin] torch ddp plugin supports sharded model checkpoint ( #3775 )
...
* [plugin] torch ddp plugin add save sharded model
* [test] fix torch ddp ckpt io test
* [test] fix torch ddp ckpt io test
* [test] fix low level zero plugin test
* [test] fix low level zero plugin test
* [test] add debug info
* [test] add debug info
* [test] add debug info
* [test] add debug info
* [test] add debug info
* [test] fix low level zero plugin test
* [test] fix low level zero plugin test
* [test] remove debug info
2023-05-18 20:05:59 +08:00
jiangmingyan
2703a37ac9
[amp] Add naive amp demo ( #3774 )
...
* [mixed_precision] add naive amp demo
* [mixed_precision] add naive amp demo
2023-05-18 16:33:14 +08:00
jiangmingyan
48bd056761
[doc] update hybrid parallelism doc ( #3770 )
2023-05-18 14:16:13 +08:00
binmakeswell
15024e40d9
[auto] fix install cmd ( #3772 )
2023-05-18 13:33:01 +08:00
jiangmingyan
d449525acf
[doc] update booster tutorials ( #3718 )
...
* [booster] update booster tutorials#3717
* [booster] update booster tutorials#3717, fix
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, rename colossalai booster.md
* [booster] update booster tutorials#3717, rename colossalai booster.md
* [booster] update booster tutorials#3717, rename colossalai booster.md
* [booster] update booster tutorials#3717, fix
* [booster] update booster tutorials#3717, fix
* [booster] update tutorials#3717, update booster api doc
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3713
* [booster] update tutorials#3713, modify file
2023-05-18 11:41:56 +08:00
Yuanchen
05759839bd
[chat] fix bugs in stage 3 training ( #3759 )
...
Co-authored-by: Yuanchen Xu <yuanchen.xu00@gmail.com>
2023-05-17 17:44:05 +08:00
Hongxin Liu
5dd573c6b6
[devops] fix ci for document check ( #3751 )
...
* [doc] add test info
* [devops] update doc check ci
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] add debug info
* [devops] remove debug info and update invalid doc
* [devops] add essential comments
2023-05-17 11:24:22 +08:00
Hongxin Liu
c03bd7c6b2
[devops] make build on PR run automatically ( #3748 )
...
* [devops] make build on PR run automatically
* [devops] update build on pr condition
2023-05-17 11:17:37 +08:00
digger yu
1baeb39c72
[NFC] fix typo with colossalai/auto_parallel/tensor_shard ( #3742 )
...
* fix typo applications/ and colossalai/ date 5.11
* fix typo colossalai/
2023-05-17 11:13:23 +08:00