## 2136 Commits (258b43317c4a5cafb8d3da0ff63c8843443bc448)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| github-actions[bot] | da056285f2 | [format] applied code formatting on changed files in pull request 2922 (#2923) | 2 years ago |
| binmakeswell | 12bafe057f | [doc] update installation for GPT (#2922) | 2 years ago |
| binmakeswell | 0afb55fc5b | [doc] add os scope, update tutorial install and tips (#2914) | 2 years ago |
| YH | 7b13f7db18 | [zero] trivial zero optimizer refactoring (#2869) | 2 years ago |
| fastalgo | dbc01b9c04 | Update README.md | 2 years ago |
| Frank Lee | e33c043dec | [workflow] moved pre-commit to post-commit (#2895) | 2 years ago |
| Jiatong (Julius) Han | 8c8a39be95 | [hotfix]: Remove math.prod dependency (#2837) | 2 years ago |
| YuliangLiu0306 | 819e25d8b1 | [hotfix] fix autoparallel compatibility test issues (#2754) | 2 years ago |
| YuliangLiu0306 | 0f392d7403 | [autoparallel] find repeat blocks (#2854) | 2 years ago |
| BlueRum | 2e16f842a9 | [chatgpt]support opt & gpt for rm training (#2876) | 2 years ago |
| junxu | c52edcf0eb | Rename class method of ZeroDDP (#2692) | 2 years ago |
| HELSON | 6e4ac08172 | [hotfix] fix chunk size can not be divided (#2867) | 2 years ago |
| Alex_996 | a4fc125c34 | Fix typos (#2863) | 2 years ago |
| dawei-wang | 55424a16a5 | [doc] fix GPT tutorial (#2860) | 2 years ago |
| Boyuan Yao | eae77c831d | [autoparallel] Patch meta information for nodes that will not be handled by SPMD solver (#2823) | 2 years ago |
| Boyuan Yao | c7764d3f22 | [autoparallel] Patch meta information of `torch.where` (#2822) | 2 years ago |
| Boyuan Yao | fcc4097efa | [autoparallel] Patch meta information of `torch.tanh()` and `torch.nn.Dropout` (#2773) | 2 years ago |
| BlueRum | 34ca324b0d | [chatgpt] Support saving ckpt in examples (#2846) | 2 years ago |
| Zheng Zeng | 597914317b | [doc] fix typo in opt inference tutorial (#2849) | 2 years ago |
| Frank Lee | 935346430f | [cli] handled version check exceptions (#2848) | 2 years ago |
| BlueRum | 3eebc4dff7 | [chatgpt] fix rm eval (#2829) | 2 years ago |
| Frank Lee | 918bc94b6b | [triton] added copyright information for flash attention (#2835) | 2 years ago |
| Boyuan Yao | 7ea6bc7f69 | [autoparallel] Patch tensor related operations meta information (#2789) | 2 years ago |
| github-actions[bot] | a5721229d9 | Automated submodule synchronization (#2740) | 2 years ago |
| Haofan Wang | 47ecb22387 | [example] add LoRA support (#2821) | 2 years ago |
| ver217 | b6a108cb91 | [chatgpt] add test checkpoint (#2797) | 2 years ago |
| Michelle | c008d4ad0c | [NFC] polish `colossalai/engine/schedule/_pipeline_schedule.py` code style (#2744) | 2 years ago |
| mickogoin | 58abde2857 | Update README.md (#2791) | 2 years ago |
| Marco Rodrigues | 89f0017a9c | Typo (#2826) | 2 years ago |
| Jiarui Fang | bf0204604f | [exmaple] add bert and albert (#2824) | 2 years ago |
| YuliangLiu0306 | cf6409dd40 | Hotfix/auto parallel zh doc (#2820) | 2 years ago |
| YuliangLiu0306 | 2059fdd6b0 | [hotfix] add copyright for solver and device mesh (#2803) | 2 years ago |
| LuGY | dbd0fd1522 | [CI/CD] fix nightly release CD running on forked repo (#2812) | 2 years ago |
| Boyuan Yao | 8593ae1a3f | [autoparallel] rotor solver refactor (#2813) | 2 years ago |
| binmakeswell | 09f457479d | [doc] update OPT serving (#2804) | 2 years ago |
| HELSON | 56ddc9ca7a | [hotfix] add correct device for fake_param (#2796) | 2 years ago |
| ver217 | a619a190df | [chatgpt] update readme about checkpoint (#2792) | 2 years ago |
| ver217 | 4ee311c026 | [chatgpt] startegy add prepare method (#2766) | 2 years ago |
| Boyuan Yao | a2b43e393d | [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760) | 2 years ago |
| Boyuan Yao | 8e3f66a0d1 | [zero] fix wrong import (#2777) | 2 years ago |
| Fazzie-Maqianli | ba84cd80b2 | fix pip install colossal (#2764) | 2 years ago |
| Nikita Shulga | 01066152f1 | Don't use `torch._six` (#2775) | 2 years ago |
| ver217 | a88bc828d5 | [chatgpt] disable shard init for colossalai (#2767) | 2 years ago |
| binmakeswell | d6d6dec190 | [doc] update example and OPT serving link (#2769) | 2 years ago |
| Frank Lee | e376954305 | [doc] add opt service doc (#2747) | 2 years ago |
| BlueRum | 613efebc5c | [chatgpt] support colossalai strategy to train rm (#2742) | 2 years ago |
| BlueRum | 648183a960 | [chatgpt]fix train_rm bug with lora (#2741) | 2 years ago |
| fastalgo | b6e3b955c3 | Update README.md | 2 years ago |
| binmakeswell | 30aee9c45d | [NFC] polish code format | 2 years ago |
| YuliangLiu0306 | 1dc003c169 | [autoparallel] distinguish different parallel strategies (#2699) | 2 years ago |