Commit Graph

135 Commits (3e05c07bb8921f2a8f9736b6f6673d4e9f1697d0)

Author               SHA1        Message  Date
Baizhou Zhang        493a5efeab  [doc] add shardformer doc to sidebar (#4768)  1 year ago
Hongxin Liu          66f3926019  [doc] clean up outdated docs (#4765)  1 year ago
Pengtai Xu           4d7537ba25  [doc] put native colossalai plugins first in description section  1 year ago
Pengtai Xu           e10d9f087e  [doc] add model examples for each plugin  1 year ago
Pengtai Xu           a04337bfc3  [doc] put individual plugin explanation in front  1 year ago
Pengtai Xu           10513f203c  [doc] explain suitable use case for each plugin  1 year ago
Hongxin Liu          b5f9e37c70  [legacy] clean up legacy code (#4743)  1 year ago
Baizhou Zhang        d151dcab74  [doc] explaination of loading large pretrained models (#4741)  1 year ago
Baizhou Zhang        451c3465fb  [doc] polish shardformer doc (#4735)  1 year ago
Bin Jia              6a03c933a0  [shardformer] update seq parallel document (#4730)  1 year ago
flybird11111         46162632e5  [shardformer] update pipeline parallel document (#4725)  1 year ago
Baizhou Zhang        50e5602c2d  [doc] add shardformer support matrix/update tensor parallel documents (#4728)  1 year ago
github-actions[bot]  8c2dda7410  [format] applied code formatting on changed files in pull request 4726 (#4727)  1 year ago
Baizhou Zhang        f911d5b09d  [doc] Add user document for Shardformer (#4702)  1 year ago
binmakeswell         ce97790ed7  [doc] fix llama2 code link (#4726)  1 year ago
Baizhou Zhang        1d454733c4  [doc] Update booster user documents. (#4669)  1 year ago
Hongxin Liu          554aa9592e  [legacy] move communication and nn to legacy and refactor logger (#4671)  1 year ago
Hongxin Liu          ac178ca5c1  [legacy] move builder and registry to legacy (#4603)  1 year ago
Hongxin Liu          8accecd55b  [legacy] move engine to legacy (#4560)  1 year ago
Hongxin Liu          89fe027787  [legacy] move trainer to legacy (#4545)  1 year ago
binmakeswell         7a978eb3d0  [DOC] hotfix/llama2news (#4595)  1 year ago
Hongxin Liu          27061426f7  [gemini] improve compatibility and add static placement policy (#4479)  1 year ago
binmakeswell         089c365fa0  [doc] add Series A Funding and NeurIPS news (#4377)  1 year ago
flybird1111          f40b718959  [doc] Fix gradient accumulation doc. (#4349)  1 year ago
Baizhou Zhang        c6f6005990  [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302)  1 year ago
binmakeswell         7ff11b5537  [example] add llama pretraining (#4257)  1 year ago
Jianghai             711e2b4c00  [doc] update and revise some typos and errs in docs (#4107)  1 year ago
digger yu            769cddcb2c  fix typo docs/ (#4033)  1 year ago
Baizhou Zhang        4da324cd60  [hotfix]fix argument naming in docs and examples (#4083)  1 year ago
Frank Lee            ddcf58cacf  Revert "[sync] sync feature/shardformer with develop"  1 year ago
FoolPlayer           24651fdd4f  Merge pull request #3931 from FrankLeeeee/sync/develop-to-shardformer  1 year ago
digger yu            33eef714db  fix typo examples and docs (#3932)  1 year ago
Hongxin Liu          12c90db3f3  [doc] add lazy init tutorial (#3922)  1 year ago
Baizhou Zhang        c1535ccbba  [doc] fix docs about booster api usage (#3898)  1 year ago
jiangmingyan         07cb21142f  [doc]update moe chinese document. (#3890)  1 year ago
jiangmingyan         281b33f362  [doc] update document of zero with chunk. (#3855)  2 years ago
jiangmingyan         b0474878bf  [doc] update nvme offload documents. (#3850)  2 years ago
jiangmingyan         a64df3fa97  [doc] update document of gemini instruction. (#3842)  2 years ago
Frank Lee            54e97ed7ea  [workflow] supported test on CUDA 10.2 (#3841)  2 years ago
wukong1992           3229f93e30  [booster] add warning for torch fsdp plugin doc (#3833)  2 years ago
digger yu            518b31c059  [docs] change placememt_policy to placement_policy (#3829)  2 years ago
digger yu            e90fdb1000  fix typo docs/  2 years ago
jiangmingyan         725365f297  Merge pull request #3810 from jiangmingyan/amp  2 years ago
jiangmingyan         278fcbc444  [doc]fix  2 years ago
jiangmingyan         8aa1fb2c7f  [doc]fix  2 years ago
Hongxin Liu          19d153057e  [doc] add warning about fsdp plugin (#3813)  2 years ago
jiangmingyan         c425a69d52  [doc] add removed change of config.py  2 years ago
jiangmingyan         75272ef37b  [doc] add removed warning  2 years ago
Mingyan Jiang        a520610bd9  [doc] update amp document  2 years ago
Mingyan Jiang        1167bf5b10  [doc] update amp document  2 years ago