Commit Graph

190 Commits (main)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Hongxin Liu | 070df689e6 | [devops] fix extention building (#5427) | 9 months ago |
| flybird11111 | 29695cf70c | [example]add gpt2 benchmark example script. (#5295) | 9 months ago |
| Hongxin Liu | d882d18c65 | [example] reuse flash attn patch (#5400) | 9 months ago |
| digger yu | 71321a07cf | fix typo change dosen't to doesn't (#5308) | 10 months ago |
| Frank Lee | 8823cc4831 | Merge pull request #5310 from hpcaitech/feature/npu | 10 months ago |
| flybird11111 | f7e3f82a7e | fix llama pretrain (#5287) | 10 months ago |
| ver217 | 148469348a | Merge branch 'main' into sync/npu | 10 months ago |
| Wenhao Chen | ef4f0ee854 | [hotfix]: add pp sanity check and fix mbs arg (#5268) | 10 months ago |
| binmakeswell | c174c4fc5f | [doc] fix doc typo (#5256) | 11 months ago |
| Hongxin Liu | d202cc28c0 | [npu] change device to accelerator api (#5239) | 11 months ago |
| Xuanlei Zhao | dd2c28a323 | [npu] use extension for op builder (#5172) | 11 months ago |
| Wenhao Chen | 3c0d82b19b | [pipeline]: support arbitrary batch size in forward_only mode (#5201) | 11 months ago |
| Wenhao Chen | 4fa689fca1 | [pipeline]: fix p2p comm, add metadata cache and support llama interleaved pp (#5134) | 11 months ago |
| flybird11111 | 21aa5de00b | [gemini] hotfix NaN loss while using Gemini + tensor_parallel (#5150) | 12 months ago |
| binmakeswell | 177c79f2d1 | [doc] add moe news (#5128) | 1 year ago |
| Wenhao Chen | 7172459e74 | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| digger yu | d5661f0f25 | [nfc] fix typo change directoty to directory (#5111) | 1 year ago |
| Xuanlei Zhao | 3acbf6d496 | [npu] add npu support for hybrid plugin and llama (#5090) | 1 year ago |
| flybird11111 | aae496631c | [shardformer]fix flash attention, when mask is casual, just don't unpad it (#5084) | 1 year ago |
| github-actions[bot] | 8921a73c90 | [format] applied code formatting on changed files in pull request 5067 (#5072) | 1 year ago |
| Hongxin Liu | e5ce4c8ea6 | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| flybird11111 | bc09b95f50 | [exampe] fix llama example' loss error when using gemini plugin (#5060) | 1 year ago |
| Elsa Granger | b2ad0d9e8f | [pipeline,shardformer] Fix p2p efficiency in pipeline, allow skipping loading weight not in weight_map when `strict=False`, fix llama flash attention forward, add flop estimation by megatron in llama benchmark (#5017) | 1 year ago |
| Wenhao Chen | 724441279b | [moe]: fix ep/tp tests, add hierarchical all2all (#4982) | 1 year ago |
| Xuanlei Zhao | f71e63b0f3 | [moe] support optimizer checkpoint (#5015) | 1 year ago |
| Xuanlei Zhao | dc003c304c | [moe] merge moe into main (#4978) | 1 year ago |
| Blagoy Simandoff | 8aed02b957 | [nfc] fix minor typo in README (#4846) | 1 year ago |
| Baizhou Zhang | df66741f77 | [bug] fix get_default_parser in examples (#4764) | 1 year ago |
| Wenhao Chen | 7b9b86441f | [chat]: update rm, add wandb and fix bugs (#4471) | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| github-actions[bot] | 3c6b831c26 | [format] applied code formatting on changed files in pull request 4743 (#4750) | 1 year ago |
| Hongxin Liu | b5f9e37c70 | [legacy] clean up legacy code (#4743) | 1 year ago |
| flybird11111 | 4c4482f3ad | [example] llama2 add fine-tune example (#4673) | 1 year ago |
| Bin Jia | 608cffaed3 | [example] add gpt2 HybridParallelPlugin example (#4653) | 1 year ago |
| binmakeswell | ce97790ed7 | [doc] fix llama2 code link (#4726) | 1 year ago |
| Baizhou Zhang | 068372a738 | [doc] add potential solution for OOM in llama2 example (#4699) | 1 year ago |
| Hongxin Liu | 554aa9592e | [legacy] move communication and nn to legacy and refactor logger (#4671) | 1 year ago |
| flybird11111 | 7486ed7d3a | [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645) | 1 year ago |
| Baizhou Zhang | 660eed9124 | [pipeline] set optimizer to optional in execute_pipeline (#4630) | 1 year ago |
| Hongxin Liu | fae6c92ead | Merge branch 'main' into feature/shardformer | 1 year ago |
| Hongxin Liu | ac178ca5c1 | [legacy] move builder and registry to legacy (#4603) | 1 year ago |
| Hongxin Liu | 89fe027787 | [legacy] move trainer to legacy (#4545) | 1 year ago |
| flybird11111 | ec0866804c | [shardformer] update shardformer readme (#4617) | 1 year ago |
| Hongxin Liu | a39a5c66fe | Merge branch 'main' into feature/shardformer | 1 year ago |
| flybird11111 | 0a94fcd351 | [shardformer] update bert finetune example with HybridParallelPlugin (#4584) | 1 year ago |
| binmakeswell | 8d7b02290f | [doc] add llama2 benchmark (#4604) | 1 year ago |
| Hongxin Liu | 0b00def881 | [example] add llama2 example (#4527) | 1 year ago |
| Hongxin Liu | 27061426f7 | [gemini] improve compatibility and add static placement policy (#4479) | 1 year ago |
| binmakeswell | ef4b99ebcd | add llama example CI | 1 year ago |
| binmakeswell | 7ff11b5537 | [example] add llama pretraining (#4257) | 1 year ago |
| digger yu | 2d40759a53 | fix #3852 path error (#4058) | 1 year ago |
| Baizhou Zhang | 4da324cd60 | [hotfix]fix argument naming in docs and examples (#4083) | 1 year ago |
| LuGY | 160c64c645 | [example] fix bucket size in example of gpt gemini (#4028) | 1 year ago |
| Baizhou Zhang | b3ab7fbabf | [example] update ViT example using booster api (#3940) | 1 year ago |
| digger yu | 33eef714db | fix typo examples and docs (#3932) | 1 year ago |
| Baizhou Zhang | e417dd004e | [example] update opt example using booster api (#3918) | 1 year ago |
| Liu Ziming | b306cecf28 | [example] Modify palm example with the new booster API (#3913) | 1 year ago |
| wukong1992 | a55fb00c18 | [booster] update bert example, using booster api (#3885) | 1 year ago |
| jiangmingyan | 5f79008c4a | [example] update gemini examples (#3868) | 2 years ago |
| digger yu | 518b31c059 | [docs] change placememt_policy to placement_policy (#3829) | 2 years ago |
| binmakeswell | 15024e40d9 | [auto] fix install cmd (#3772) | 2 years ago |
| digger-yu | b9a8dff7e5 | [doc] Fix typo under colossalai and doc(#3618) | 2 years ago |
| binmakeswell | f1b3d60cae | [example] reorganize for community examples (#3557) | 2 years ago |
| mandoxzhang | 8f2c55f9c9 | [example] remove redundant texts & update roberta (#3493) | 2 years ago |
| mandoxzhang | ab5fd127e3 | [example] update roberta with newer ColossalAI (#3472) | 2 years ago |
| Frank Lee | 80eba05b0a | [test] refactor tests with spawn (#3452) | 2 years ago |
| ver217 | 573af84184 | [example] update examples related to zero/gemini (#3431) | 2 years ago |
| ver217 | 26b7aac0be | [zero] reorganize zero/gemini folder structure (#3424) | 2 years ago |
| Yan Fang | 189347963a | [auto] fix requirements typo for issue #3125 (#3209) | 2 years ago |
| Zihao | 18dbe76cae | [auto-parallel] add auto-offload feature (#3154) | 2 years ago |
| binmakeswell | 360674283d | [example] fix redundant note (#3065) | 2 years ago |
| Tomek | af3888481d | [example] fixed opt model downloading from huggingface | 2 years ago |
| ramos | 2ef855c798 | support shardinit option to avoid OPT OOM initializing problem (#3037) | 2 years ago |
| Ziyue Jiang | 400f63012e | [pipeline] Add Simplified Alpa DP Partition (#2507) | 2 years ago |
| github-actions[bot] | da056285f2 | [format] applied code formatting on changed files in pull request 2922 (#2923) | 2 years ago |
| binmakeswell | 12bafe057f | [doc] update installation for GPT (#2922) | 2 years ago |
| Alex_996 | a4fc125c34 | Fix typos (#2863) | 2 years ago |
| dawei-wang | 55424a16a5 | [doc] fix GPT tutorial (#2860) | 2 years ago |
| Jiarui Fang | bf0204604f | [exmaple] add bert and albert (#2824) | 2 years ago |
| cloudhuang | 43dffdaba5 | [doc] fixed a typo in GPT readme (#2736) | 2 years ago |
| Jiatong (Julius) Han | a255a38f7f | [example] Polish README.md (#2658) | 2 years ago |
| HELSON | 6e0faa70e0 | [gemini] add profiler in the demo (#2534) | 2 years ago |
| HELSON | 66dfcf5281 | [gemini] update the gpt example (#2527) | 2 years ago |
| HELSON | 707b11d4a0 | [gemini] update ddp strict mode (#2518) | 2 years ago |
| HELSON | 2d1a7dfe5f | [zero] add strict ddp mode (#2508) | 2 years ago |
| Jiarui Fang | e327e95144 | [hotfix] gpt example titans bug #2493 (#2494) | 2 years ago |
| binmakeswell | fcc6d61d92 | [example] fix requirements (#2488) | 2 years ago |
| Jiarui Fang | 3a21485ead | [example] titans for gpt (#2484) | 2 years ago |
| Jiarui Fang | 7c31706227 | [CI] add test_ci.sh for palm, opt and gpt (#2475) | 2 years ago |
| ver217 | f525d1f528 | [example] update gpt gemini example ci test (#2477) | 2 years ago |
| Ziyue Jiang | fef5c949c3 | polish pp middleware (#2476) | 2 years ago |
| Jiarui Fang | 867c8c2d3a | [zero] low level optim supports ProcessGroup (#2464) | 2 years ago |
| YuliangLiu0306 | 2731531bc2 | [autoparallel] integrate device mesh initialization into autoparallelize (#2393) | 2 years ago |
| ZijianYY | fe0f7970a2 | [examples] adding tflops to PaLM (#2365) | 2 years ago |
| HELSON | d84e747975 | [hotfix] add DISTPAN argument for benchmark (#2412) | 2 years ago |
| HELSON | 498b5ca993 | [hotfix] fix gpt gemini example (#2404) | 2 years ago |
| Jiarui Fang | 12c8bf38d7 | [Pipeline] Refine GPT PP Example | 2 years ago |
| Ziyue Jiang | ad00894f7f | polish | 2 years ago |
| Jiarui Fang | 1aaeb596c6 | [example] gpt, shard init on all processes (#2366) | 2 years ago |
| Ziyue Jiang | 3a15b20421 | Move GPT PP Example | 2 years ago |