Commit Graph

21 Commits (c94a33579b7c70d96905ea8b2c3a4baf28451cb0)

Author SHA1 Message Date
jiangmingyan 281b33f362
[doc] update document of zero with chunk. (#3855)
* [doc] fix title of mixed precision

* [doc]update document of zero with chunk

* [doc] update document of zero with chunk, fix

* [doc] update document of zero with chunk, fix

* [doc] update document of zero with chunk, fix

* [doc] update document of zero with chunk, add doc test

* [doc] update document of zero with chunk, add doc test

* [doc] update document of zero with chunk, fix installation

* [doc] update document of zero with chunk, fix zero with chunk doc

* [doc] update document of zero with chunk, fix zero with chunk doc
2023-05-30 18:41:56 +08:00
jiangmingyan b0474878bf
[doc] update nvme offload documents. (#3850) 2023-05-26 01:22:01 +08:00
digger yu 518b31c059
[docs] change placememt_policy to placement_policy (#3829)
* fix typo colossalai/autochunk auto_parallel amp

* fix typo colossalai/auto_parallel nn utils etc.

* fix typo colossalai/auto_parallel autochunk fx/passes  etc.

* fix typo docs/

* change placememt_policy to placement_policy in docs/ and examples/
2023-05-24 14:51:49 +08:00
digger yu e90fdb1000 fix typo docs/ 2023-05-24 13:57:43 +08:00
jiangmingyan 278fcbc444 [doc]fix 2023-05-23 17:53:11 +08:00
jiangmingyan 8aa1fb2c7f [doc]fix 2023-05-23 17:50:30 +08:00
jiangmingyan 75272ef37b [doc] add removed warning 2023-05-23 16:34:30 +08:00
Mingyan Jiang a520610bd9 [doc] update amp document 2023-05-23 16:20:29 +08:00
Mingyan Jiang 8c62e50dbb [doc] update amp document 2023-05-23 16:20:01 +08:00
jiangmingyan ef02d7ef6d
[doc] update gradient accumulation (#3771)
* [doc]update gradient accumulation

* [doc]update gradient accumulation

* [doc]update gradient accumulation

* [doc]update gradient accumulation

* [doc]update gradient accumulation, fix

* [doc]update gradient accumulation, fix

* [doc]update gradient accumulation, fix

* [doc]update gradient accumulation, add sidebars

* [doc]update gradient accumulation, fix

* [doc]update gradient accumulation, fix

* [doc]update gradient accumulation, fix

* [doc]update gradient accumulation, resolve comments

* [doc]update gradient accumulation, resolve comments

* fix
2023-05-23 10:52:30 +08:00
jiangmingyan fe1561a884
[doc] update gradient clipping document (#3778)
* [doc] update gradient clipping document

* [doc] update gradient clipping document

* [doc] update gradient clipping document

* [doc] update gradient clipping document

* [doc] update gradient clipping document

* [doc] update gradient clipping document

* [doc] update gradient clipping doc, fix sidebars.json

* [doc] update gradient clipping doc, fix doc test
2023-05-22 14:13:15 +08:00
Hongxin Liu 72688adb2f
[doc] add booster docstring and fix autodoc (#3789)
* [doc] add docstr for booster methods

* [doc] fix autodoc
2023-05-22 10:56:47 +08:00
Hongxin Liu 5ce6c9d86f
[doc] add tutorial for cluster utils (#3763)
* [doc] add en cluster utils doc

* [doc] add zh cluster utils doc

* [doc] add cluster utils doc in sidebar
2023-05-19 12:12:20 +08:00
jiangmingyan 48bd056761
[doc] update hybrid parallelism doc (#3770) 2023-05-18 14:16:13 +08:00
Hongxin Liu 5dd573c6b6
[devops] fix ci for document check (#3751)
* [doc] add test info

* [devops] update doc check ci

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] add debug info

* [devops] remove debug info and update invalid doc

* [devops] add essential comments
2023-05-17 11:24:22 +08:00
digger-yu 1c7734bc94
[doc] Update 1D_tensor_parallel.md (#3563)
Display format optimization; fixes bug #3562.
Specific changes:
1. Translate "This is called a column-parallel fashion" into Chinese.
2. Use the ```math code block syntax to render the math expression as a display block; the formula content itself is unchanged.

Please check that the math formula displays correctly.
If it does, I will update the formatting of the English version of the formula in the same way.
2023-04-14 22:12:32 +08:00
ver217 26b7aac0be
[zero] reorganize zero/gemini folder structure (#3424)
* [zero] refactor low-level zero folder structure

* [zero] fix legacy zero import path

* [zero] fix legacy zero import path

* [zero] remove useless import

* [zero] refactor gemini folder structure

* [zero] refactor gemini folder structure

* [zero] refactor legacy zero import path

* [zero] refactor gemini folder structure

* [zero] refactor gemini folder structure

* [zero] refactor gemini folder structure

* [zero] refactor legacy zero import path

* [zero] fix test import path

* [zero] fix test

* [zero] fix circular import

* [zero] update import
2023-04-04 13:48:16 +08:00
Frank Lee 416a50dbd7
[doc] moved doc test command to bottom (#3075) 2023-03-09 18:10:45 +08:00
ver217 378d827c6b
[doc] update nvme offload doc (#3014)
* [doc] update nvme offload doc

* [doc] add doc testing cmd and requirements

* [doc] add api reference

* [doc] add dependencies
2023-03-07 17:49:01 +08:00
Frank Lee e0a1c1321c
[doc] added reference to related works (#2994)
* [doc] added reference to related works

* polish code
2023-03-04 17:32:22 +08:00
Frank Lee cd4f02bed8
[doc] fixed compatibility with docusaurus (#2657) 2023-02-09 17:06:29 +08:00