Hongxin Liu
7f8b16635b
[misc] refactor launch API and tensor constructor ( #5666 )
...
* [misc] remove config arg from initialize
* [misc] remove old tensor constructor
* [plugin] add npu support for ddp
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* [devops] fix doc test ci
* [test] fix test launch
* [doc] update launch doc
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-04-29 10:40:11 +08:00
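For reference, a minimal sketch of what this launch refactor implies for user code, assuming the change is that the `config` argument is simply dropped from `colossalai.launch_from_torch` (exact signature is an assumption based on the commit bullets above):

```python
import colossalai

# Before #5666, launching required a (usually empty) config dict:
#   colossalai.launch_from_torch(config={})
# After the refactor, the config argument is removed and the distributed
# environment is read from the torchrun launcher (assumption).
colossalai.launch_from_torch()
```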
digger yu
bce9499ed3
fix some typos ( #5307 )
2024-01-25 13:56:27 +08:00
digger yu
fd3567e089
[nfc] fix typo in docs/ ( #4972 )
2023-11-21 22:06:20 +08:00
ppt0011
335cb105e2
[doc] add supported feature diagram for hybrid parallel plugin ( #4996 )
2023-10-31 19:56:42 +08:00
digger yu
11009103be
[nfc] fix some typos in colossalai/ docs/ etc. ( #4920 )
2023-10-18 15:44:04 +08:00
Zhongkai Zhao
db40e086c8
[test] modify model supporting part of low_level_zero plugin (including corresponding docs)
2023-10-05 15:10:31 +08:00
Hongxin Liu
da15fdb9ca
[doc] add lazy init docs ( #4808 )
2023-09-27 10:24:04 +08:00
Baizhou Zhang
64a08b2dc3
[checkpointio] support unsharded checkpointIO for hybrid parallel ( #4774 )
...
* support unsharded saving/loading for model
* support optimizer unsharded saving
* update doc
* support unsharded loading for optimizer
* small fix
2023-09-26 10:58:03 +08:00
Hongxin Liu
66f3926019
[doc] clean up outdated docs ( #4765 )
...
* [doc] clean up outdated docs
* [doc] fix linking
* [doc] fix linking
2023-09-21 11:36:20 +08:00
Pengtai Xu
4d7537ba25
[doc] put native colossalai plugins first in description section
2023-09-20 09:24:10 +08:00
Pengtai Xu
e10d9f087e
[doc] add model examples for each plugin
2023-09-19 18:01:23 +08:00
Pengtai Xu
a04337bfc3
[doc] put individual plugin explanation in front
2023-09-19 16:27:37 +08:00
Pengtai Xu
10513f203c
[doc] explain suitable use case for each plugin
2023-09-19 15:50:14 +08:00
Hongxin Liu
b5f9e37c70
[legacy] clean up legacy code ( #4743 )
...
* [legacy] remove outdated codes of pipeline (#4692 )
* [legacy] remove cli of benchmark and update optim (#4690 )
* [legacy] remove cli of benchmark and update optim
* [doc] fix cli doc test
* [legacy] fix engine clip grad norm
* [legacy] remove outdated colo tensor (#4694 )
* [legacy] remove outdated colo tensor
* [test] fix test import
* [legacy] move outdated zero to legacy (#4696 )
* [legacy] clean up utils (#4700 )
* [legacy] clean up utils
* [example] update examples
* [legacy] clean up amp
* [legacy] fix amp module
* [legacy] clean up gpc (#4742 )
* [legacy] clean up context
* [legacy] clean core, constants and global vars
* [legacy] refactor initialize
* [example] fix examples ci
* [example] fix examples ci
* [legacy] fix tests
* [example] fix gpt example
* [example] fix examples ci
* [devops] fix ci installation
* [example] fix examples ci
2023-09-18 16:31:06 +08:00
Baizhou Zhang
d151dcab74
[doc] explanation of loading large pretrained models ( #4741 )
2023-09-15 21:04:07 +08:00
Baizhou Zhang
f911d5b09d
[doc] Add user document for Shardformer ( #4702 )
...
* create shardformer doc files
* add docstring for seq-parallel
* update ShardConfig docstring
* add links to llama example
* add outdated message
* finish introduction & supporting information
* finish 'how shardformer works'
* finish shardformer.md English doc
* fix doctest fail
* add Chinese document
2023-09-15 10:56:39 +08:00
Baizhou Zhang
1d454733c4
[doc] Update booster user documents. ( #4669 )
...
* update booster_api.md
* update booster_checkpoint.md
* update booster_plugins.md
* move transformers importing inside function
* fix Dict typing
* fix autodoc bug
* small fix
2023-09-12 10:47:23 +08:00
Hongxin Liu
554aa9592e
[legacy] move communication and nn to legacy and refactor logger ( #4671 )
...
* [legacy] move communication to legacy (#4640 )
* [legacy] refactor logger and clean up legacy codes (#4654 )
* [legacy] make logger independent to gpc
* [legacy] make optim independent to registry
* [legacy] move test engine to legacy
* [legacy] move nn to legacy (#4656 )
* [legacy] move nn to legacy
* [checkpointio] fix save hf config
* [test] remove useless rpc pp test
* [legacy] fix nn init
* [example] skip tutorial hybrid parallel example
* [devops] test doc check
* [devops] test doc check
2023-09-11 16:24:28 +08:00
Hongxin Liu
89fe027787
[legacy] move trainer to legacy ( #4545 )
...
* [legacy] move trainer to legacy
* [doc] update docs related to trainer
* [test] ignore legacy test
2023-09-05 21:53:10 +08:00
Baizhou Zhang
c6f6005990
[checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin ( #4302 )
...
* sharded optimizer checkpoint for gemini plugin
* modify test to reduce testing time
* update doc
* fix bug when keep_gatherd is true under GeminiPlugin
2023-07-21 14:39:01 +08:00
Jianghai
711e2b4c00
[doc] update and revise some typos and errors in docs ( #4107 )
...
* fix some typos and problems in doc
* fix some typos and problems in doc
* add doc test
2023-06-28 19:30:37 +08:00
wukong1992
3229f93e30
[booster] add warning for torch fsdp plugin doc ( #3833 )
2023-05-25 14:00:02 +08:00
digger yu
e90fdb1000
fix typo in docs/
2023-05-24 13:57:43 +08:00
Hongxin Liu
19d153057e
[doc] add warning about fsdp plugin ( #3813 )
2023-05-23 17:16:10 +08:00
Yanjia0
d9393b85f1
[doc] add deprecated warning on doc Basics section ( #3754 )
...
* Update colotensor_concept.md
* Update configure_parallelization.md
* Update define_your_config.md
* Update engine_trainer.md
* Update initialize_features.md
* Update model_checkpoint.md
* Update colotensor_concept.md
* Update configure_parallelization.md
* Update define_your_config.md
* Update engine_trainer.md
* Update initialize_features.md
* Update model_checkpoint.md
2023-05-22 11:12:53 +08:00
Hongxin Liu
72688adb2f
[doc] add booster docstring and fix autodoc ( #3789 )
...
* [doc] add docstr for booster methods
* [doc] fix autodoc
2023-05-22 10:56:47 +08:00
Hongxin Liu
60e6a154bc
[doc] add tutorial for booster checkpoint ( #3785 )
...
* [doc] add checkpoint related docstr for booster
* [doc] add en checkpoint doc
* [doc] add zh checkpoint doc
* [doc] add booster checkpoint doc in sidebar
* [doc] add caution about ckpt for plugins
* [doc] add doctest placeholder
* [doc] add doctest placeholder
* [doc] add doctest placeholder
2023-05-19 18:05:08 +08:00
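A minimal sketch of the checkpointing flow this tutorial documents, assuming the standard `Booster.save_model` / `Booster.load_model` interface; the `shard` flag, file path, and toy model here are illustrative, not taken from the docs themselves:

```python
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch()  # distributed init; signature per current API (assumption)

model = torch.nn.Linear(16, 16).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Wrap model and optimizer with the booster so the chosen plugin controls
# how checkpoints are written and read.
booster = Booster(plugin=TorchDDPPlugin())
model, optimizer, *_ = booster.boost(model, optimizer)

# Save and reload through the booster rather than raw torch.save, so each
# plugin can apply its own (sharded or unsharded) checkpoint logic.
booster.save_model(model, "model.pt", shard=False)  # shard flag is an assumption
booster.load_model(model, "model.pt")
```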
Hongxin Liu
21e29e2212
[doc] add tutorial for booster plugins ( #3758 )
...
* [doc] add en booster plugins doc
* [doc] add booster plugins doc in sidebar
* [doc] add zh booster plugins doc
* [doc] fix zh booster plugin translation
* [doc] reorganize tutorials order of basic section
* [devops] force sync to test ci
2023-05-19 12:12:42 +08:00
jiangmingyan
d449525acf
[doc] update booster tutorials ( #3718 )
...
* [booster] update booster tutorials#3717
* [booster] update booster tutorials#3717, fix
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, update setup doc
* [booster] update booster tutorials#3717, rename colossalai booster.md
* [booster] update booster tutorials#3717, rename colossalai booster.md
* [booster] update booster tutorials#3717, rename colossalai booster.md
* [booster] update booster tutorials#3717, fix
* [booster] update booster tutorials#3717, fix
* [booster] update tutorials#3717, update booster api doc
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, modify file
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3717, fix reference link
* [booster] update tutorials#3713
* [booster] update tutorials#3713, modify file
2023-05-18 11:41:56 +08:00
digger-yu
b9a8dff7e5
[doc] Fix typo under colossalai and doc ( #3618 )
...
* Fixed several spelling errors under colossalai
* Fix the spelling error in colossalai and docs directory
* Cautiously changed the spelling error under the example folder
* Update runtime_preparation_pass.py
revert autograft to autograd
* Update search_chunk.py
utile to until
* Update check_installation.py
change misteach to mismatch in line 91
* Update 1D_tensor_parallel.md
revert to perceptron
* Update 2D_tensor_parallel.md
revert to perceptron in line 73
* Update 2p5D_tensor_parallel.md
revert to perceptron in line 71
* Update 3D_tensor_parallel.md
revert to perceptron in line 80
* Update README.md
revert to resnet in line 42
* Update reorder_graph.py
revert to indice in line 7
* Update p2p.py
revert to megatron in line 94
* Update initialize.py
revert to torchrun in line 198
* Update routers.py
change to detailed in line 63
* Update routers.py
change to detailed in line 146
* Update README.md
revert random number in line 402
2023-04-26 11:38:43 +08:00
Frank Lee
80eba05b0a
[test] refactor tests with spawn ( #3452 )
...
* [test] added spawn decorator
* polish code
* polish code
* polish code
* polish code
* polish code
* polish code
2023-04-06 14:51:35 +08:00
Frank Lee
85b2303b55
[doc] migrate the markdown files ( #2652 )
2023-02-09 14:21:38 +08:00