Commit Graph

158 Commits (7d8e0338a40da196436bef866ebdc43d5ed1c677)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Frank Lee | 8823cc4831 | Merge pull request #5310 from hpcaitech/feature/npu | 10 months ago |
| digger yu | bce9499ed3 | fix some typo (#5307) | 10 months ago |
| ver217 | 148469348a | Merge branch 'main' into sync/npu | 11 months ago |
| Hongxin Liu | d202cc28c0 | [npu] change device to accelerator api (#5239) | 11 months ago |
| binmakeswell | 7bc6969ce6 | [doc] SwiftInfer release (#5236) | 11 months ago |
| binmakeswell | b9b32b15e6 | [doc] add Colossal-LLaMA-2-13B (#5234) | 11 months ago |
| flybird11111 | 681d9b12ef | [doc] update pytorch version in documents. (#5177) | 12 months ago |
| binmakeswell | 177c79f2d1 | [doc] add moe news (#5128) | 1 year ago |
| Wenhao Chen | 7172459e74 | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| digger yu | d5661f0f25 | [nfc] fix typo change directoty to directory (#5111) | 1 year ago |
| digger yu | 2bdf76f1f2 | fix typo change lazy_iniy to lazy_init (#5099) | 1 year ago |
| digger yu | 0d482302a1 | [nfc] fix typo and author name (#5089) | 1 year ago |
| digger yu | fd3567e089 | [nfc] fix typo in docs/ (#4972) | 1 year ago |
| ppt0011 | 335cb105e2 | [doc] add supported feature diagram for hybrid parallel plugin (#4996) | 1 year ago |
| digger yu | 11009103be | [nfc] fix some typo with colossalai/ docs/ etc. (#4920) | 1 year ago |
| Baizhou Zhang | 21ba89cab6 | [gemini] support gradient accumulation (#4869) | 1 year ago |
| flybird11111 | 6a21f96a87 | [doc] update advanced tutorials, training gpt with hybrid parallelism (#4866) | 1 year ago |
| Zhongkai Zhao | db40e086c8 | [test] modify model supporting part of low_level_zero plugin (including correspoding docs) | 1 year ago |
| binmakeswell | 822051d888 | [doc] update slack link (#4823) | 1 year ago |
| Hongxin Liu | da15fdb9ca | [doc] add lazy init docs (#4808) | 1 year ago |
| Baizhou Zhang | 64a08b2dc3 | [checkpointio] support unsharded checkpointIO for hybrid parallel (#4774) | 1 year ago |
| Baizhou Zhang | a2db75546d | [doc] polish shardformer doc (#4779) | 1 year ago |
| binmakeswell | d512a4d38d | [doc] add llama2 domain-specific solution news (#4789) | 1 year ago |
| Baizhou Zhang | 493a5efeab | [doc] add shardformer doc to sidebar (#4768) | 1 year ago |
| Hongxin Liu | 66f3926019 | [doc] clean up outdated docs (#4765) | 1 year ago |
| Pengtai Xu | 4d7537ba25 | [doc] put native colossalai plugins first in description section | 1 year ago |
| Pengtai Xu | e10d9f087e | [doc] add model examples for each plugin | 1 year ago |
| Pengtai Xu | a04337bfc3 | [doc] put individual plugin explanation in front | 1 year ago |
| Pengtai Xu | 10513f203c | [doc] explain suitable use case for each plugin | 1 year ago |
| Hongxin Liu | b5f9e37c70 | [legacy] clean up legacy code (#4743) | 1 year ago |
| Baizhou Zhang | d151dcab74 | [doc] explaination of loading large pretrained models (#4741) | 1 year ago |
| Baizhou Zhang | 451c3465fb | [doc] polish shardformer doc (#4735) | 1 year ago |
| Bin Jia | 6a03c933a0 | [shardformer] update seq parallel document (#4730) | 1 year ago |
| flybird11111 | 46162632e5 | [shardformer] update pipeline parallel document (#4725) | 1 year ago |
| Baizhou Zhang | 50e5602c2d | [doc] add shardformer support matrix/update tensor parallel documents (#4728) | 1 year ago |
| github-actions[bot] | 8c2dda7410 | [format] applied code formatting on changed files in pull request 4726 (#4727) | 1 year ago |
| Baizhou Zhang | f911d5b09d | [doc] Add user document for Shardformer (#4702) | 1 year ago |
| binmakeswell | ce97790ed7 | [doc] fix llama2 code link (#4726) | 1 year ago |
| Baizhou Zhang | 1d454733c4 | [doc] Update booster user documents. (#4669) | 1 year ago |
| Hongxin Liu | 554aa9592e | [legacy] move communication and nn to legacy and refactor logger (#4671) | 1 year ago |
| Hongxin Liu | ac178ca5c1 | [legacy] move builder and registry to legacy (#4603) | 1 year ago |
| Hongxin Liu | 8accecd55b | [legacy] move engine to legacy (#4560) | 1 year ago |
| Hongxin Liu | 89fe027787 | [legacy] move trainer to legacy (#4545) | 1 year ago |
| binmakeswell | 7a978eb3d0 | [DOC] hotfix/llama2news (#4595) | 1 year ago |
| Hongxin Liu | 27061426f7 | [gemini] improve compatibility and add static placement policy (#4479) | 1 year ago |
| binmakeswell | 089c365fa0 | [doc] add Series A Funding and NeurIPS news (#4377) | 1 year ago |
| flybird1111 | f40b718959 | [doc] Fix gradient accumulation doc. (#4349) | 1 year ago |
| Baizhou Zhang | c6f6005990 | [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302) | 1 year ago |
| binmakeswell | 7ff11b5537 | [example] add llama pretraining (#4257) | 1 year ago |
| Jianghai | 711e2b4c00 | [doc] update and revise some typos and errs in docs (#4107) | 1 year ago |