146 Commits (feature/lora)

Author               SHA1        Message                                                                                        Date
linsj20              fcf776ff1b  [Feature] LoRA rebased to main branch (#5622)                                                  7 months ago
linsj20              52a2dded36  [Feature] qlora support (#5586)                                                                7 months ago
digger yu            11009103be  [nfc] fix some typo with colossalai/ docs/ etc. (#4920)                                        1 year ago
Baizhou Zhang        21ba89cab6  [gemini] support gradient accumulation (#4869)                                                 1 year ago
flybird11111         6a21f96a87  [doc] update advanced tutorials, training gpt with hybrid parallelism (#4866)                  1 year ago
Zhongkai Zhao        db40e086c8  [test] modify model supporting part of low_level_zero plugin (including correspoding docs)     1 year ago
binmakeswell         822051d888  [doc] update slack link (#4823)                                                                1 year ago
Hongxin Liu          da15fdb9ca  [doc] add lazy init docs (#4808)                                                               1 year ago
Baizhou Zhang        64a08b2dc3  [checkpointio] support unsharded checkpointIO for hybrid parallel (#4774)                      1 year ago
Baizhou Zhang        a2db75546d  [doc] polish shardformer doc (#4779)                                                           1 year ago
binmakeswell         d512a4d38d  [doc] add llama2 domain-specific solution news (#4789)                                         1 year ago
Baizhou Zhang        493a5efeab  [doc] add shardformer doc to sidebar (#4768)                                                   1 year ago
Hongxin Liu          66f3926019  [doc] clean up outdated docs (#4765)                                                           1 year ago
Pengtai Xu           4d7537ba25  [doc] put native colossalai plugins first in description section                               1 year ago
Pengtai Xu           e10d9f087e  [doc] add model examples for each plugin                                                       1 year ago
Pengtai Xu           a04337bfc3  [doc] put individual plugin explanation in front                                               1 year ago
Pengtai Xu           10513f203c  [doc] explain suitable use case for each plugin                                                1 year ago
Hongxin Liu          b5f9e37c70  [legacy] clean up legacy code (#4743)                                                          1 year ago
Baizhou Zhang        d151dcab74  [doc] explaination of loading large pretrained models (#4741)                                  1 year ago
Baizhou Zhang        451c3465fb  [doc] polish shardformer doc (#4735)                                                           1 year ago
Bin Jia              6a03c933a0  [shardformer] update seq parallel document (#4730)                                             1 year ago
flybird11111         46162632e5  [shardformer] update pipeline parallel document (#4725)                                        1 year ago
Baizhou Zhang        50e5602c2d  [doc] add shardformer support matrix/update tensor parallel documents (#4728)                  1 year ago
github-actions[bot]  8c2dda7410  [format] applied code formatting on changed files in pull request 4726 (#4727)                 1 year ago
Baizhou Zhang        f911d5b09d  [doc] Add user document for Shardformer (#4702)                                                1 year ago
binmakeswell         ce97790ed7  [doc] fix llama2 code link (#4726)                                                             1 year ago
Baizhou Zhang        1d454733c4  [doc] Update booster user documents. (#4669)                                                   1 year ago
Hongxin Liu          554aa9592e  [legacy] move communication and nn to legacy and refactor logger (#4671)                       1 year ago
Hongxin Liu          ac178ca5c1  [legacy] move builder and registry to legacy (#4603)                                           1 year ago
Hongxin Liu          8accecd55b  [legacy] move engine to legacy (#4560)                                                         1 year ago
Hongxin Liu          89fe027787  [legacy] move trainer to legacy (#4545)                                                        1 year ago
binmakeswell         7a978eb3d0  [DOC] hotfix/llama2news (#4595)                                                                1 year ago
Hongxin Liu          27061426f7  [gemini] improve compatibility and add static placement policy (#4479)                         1 year ago
binmakeswell         089c365fa0  [doc] add Series A Funding and NeurIPS news (#4377)                                            1 year ago
flybird1111          f40b718959  [doc] Fix gradient accumulation doc. (#4349)                                                   1 year ago
Baizhou Zhang        c6f6005990  [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302)                          1 year ago
binmakeswell         7ff11b5537  [example] add llama pretraining (#4257)                                                        1 year ago
Jianghai             711e2b4c00  [doc] update and revise some typos and errs in docs (#4107)                                    1 year ago
digger yu            769cddcb2c  fix typo docs/ (#4033)                                                                         1 year ago
Baizhou Zhang        4da324cd60  [hotfix]fix argument naming in docs and examples (#4083)                                       1 year ago
Frank Lee            ddcf58cacf  Revert "[sync] sync feature/shardformer with develop"                                          1 year ago
digger yu            33eef714db  fix typo examples and docs (#3932)                                                             1 year ago
Hongxin Liu          12c90db3f3  [doc] add lazy init tutorial (#3922)                                                           1 year ago
Baizhou Zhang        c1535ccbba  [doc] fix docs about booster api usage (#3898)                                                 1 year ago
jiangmingyan         07cb21142f  [doc]update moe chinese document. (#3890)                                                      1 year ago
jiangmingyan         281b33f362  [doc] update document of zero with chunk. (#3855)                                              2 years ago
jiangmingyan         b0474878bf  [doc] update nvme offload documents. (#3850)                                                   2 years ago
jiangmingyan         a64df3fa97  [doc] update document of gemini instruction. (#3842)                                           2 years ago
Frank Lee            54e97ed7ea  [workflow] supported test on CUDA 10.2 (#3841)                                                 2 years ago
wukong1992           3229f93e30  [booster] add warning for torch fsdp plugin doc (#3833)                                        2 years ago