Commit Graph

88 Commits (2899cfdabfdf205121c5ebc8a47494db3728c7af)

Author SHA1 Message Date
Wenhao Chen 7172459e74
[shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)
1 year ago
digger yu d5661f0f25
[nfc] fix typo change directoty to directory (#5111)
1 year ago
digger yu 2bdf76f1f2
fix typo change lazy_iniy to lazy_init (#5099)
1 year ago
digger yu 0d482302a1
[nfc] fix typo and author name (#5089)
1 year ago
digger yu fd3567e089
[nfc] fix typo in docs/ (#4972)
1 year ago
ppt0011 335cb105e2
[doc] add supported feature diagram for hybrid parallel plugin (#4996)
1 year ago
digger yu 11009103be
[nfc] fix some typo with colossalai/ docs/ etc. (#4920)
1 year ago
Baizhou Zhang 21ba89cab6
[gemini] support gradient accumulation (#4869)
1 year ago
flybird11111 6a21f96a87
[doc] update advanced tutorials, training gpt with hybrid parallelism (#4866)
1 year ago
Zhongkai Zhao db40e086c8
[test] modify model supporting part of low_level_zero plugin (including correspoding docs)
1 year ago
Hongxin Liu da15fdb9ca
[doc] add lazy init docs (#4808)
1 year ago
Baizhou Zhang 64a08b2dc3
[checkpointio] support unsharded checkpointIO for hybrid parallel (#4774)
1 year ago
Baizhou Zhang a2db75546d
[doc] polish shardformer doc (#4779)
1 year ago
Hongxin Liu 66f3926019
[doc] clean up outdated docs (#4765)
1 year ago
Pengtai Xu 4d7537ba25
[doc] put native colossalai plugins first in description section
1 year ago
Pengtai Xu e10d9f087e
[doc] add model examples for each plugin
1 year ago
Pengtai Xu a04337bfc3
[doc] put individual plugin explanation in front
1 year ago
Pengtai Xu 10513f203c
[doc] explain suitable use case for each plugin
1 year ago
Hongxin Liu b5f9e37c70
[legacy] clean up legacy code (#4743)
1 year ago
Baizhou Zhang d151dcab74
[doc] explaination of loading large pretrained models (#4741)
1 year ago
Baizhou Zhang 451c3465fb
[doc] polish shardformer doc (#4735)
1 year ago
Bin Jia 6a03c933a0
[shardformer] update seq parallel document (#4730)
1 year ago
flybird11111 46162632e5
[shardformer] update pipeline parallel document (#4725)
1 year ago
Baizhou Zhang 50e5602c2d
[doc] add shardformer support matrix/update tensor parallel documents (#4728)
1 year ago
Baizhou Zhang f911d5b09d
[doc] Add user document for Shardformer (#4702)
1 year ago
Baizhou Zhang 1d454733c4
[doc] Update booster user documents. (#4669)
1 year ago
Hongxin Liu 554aa9592e
[legacy] move communication and nn to legacy and refactor logger (#4671)
1 year ago
Hongxin Liu ac178ca5c1
[legacy] move builder and registry to legacy (#4603)
1 year ago
Hongxin Liu 8accecd55b
[legacy] move engine to legacy (#4560)
1 year ago
Hongxin Liu 89fe027787
[legacy] move trainer to legacy (#4545)
1 year ago
Hongxin Liu 27061426f7
[gemini] improve compatibility and add static placement policy (#4479)
1 year ago
flybird1111 f40b718959
[doc] Fix gradient accumulation doc. (#4349)
1 year ago
Baizhou Zhang c6f6005990
[checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302)
1 year ago
Jianghai 711e2b4c00
[doc] update and revise some typos and errs in docs (#4107)
1 year ago
digger yu 769cddcb2c
fix typo docs/ (#4033)
1 year ago
Baizhou Zhang 4da324cd60
[hotfix]fix argument naming in docs and examples (#4083)
1 year ago
Frank Lee ddcf58cacf
Revert "[sync] sync feature/shardformer with develop"
1 year ago
FoolPlayer 24651fdd4f
Merge pull request #3931 from FrankLeeeee/sync/develop-to-shardformer
1 year ago
digger yu 33eef714db
fix typo examples and docs (#3932)
1 year ago
Hongxin Liu 12c90db3f3
[doc] add lazy init tutorial (#3922)
2 years ago
Baizhou Zhang c1535ccbba
[doc] fix docs about booster api usage (#3898)
2 years ago
jiangmingyan 07cb21142f
[doc]update moe chinese document. (#3890)
2 years ago
jiangmingyan 281b33f362
[doc] update document of zero with chunk. (#3855)
2 years ago
jiangmingyan b0474878bf
[doc] update nvme offload documents. (#3850)
2 years ago
jiangmingyan a64df3fa97
[doc] update document of gemini instruction. (#3842)
2 years ago
Frank Lee 54e97ed7ea
[workflow] supported test on CUDA 10.2 (#3841)
2 years ago
wukong1992 3229f93e30
[booster] add warning for torch fsdp plugin doc (#3833)
2 years ago
digger yu 518b31c059
[docs] change placememt_policy to placement_policy (#3829)
2 years ago
digger yu e90fdb1000
fix typo docs/
2 years ago
jiangmingyan 725365f297
Merge pull request #3810 from jiangmingyan/amp
2 years ago