2136 Commits (258b43317c4a5cafb8d3da0ff63c8843443bc448)
 

Author | SHA1 | Message | Date
HELSON | 552183bb74 | [polish] polish ColoTensor and its submodules (#2537) | 2 years ago
github-actions[bot] | 51d4d6e718 | Automated submodule synchronization (#2492) | 2 years ago
Frank Lee | 4af31d263d | [doc] updated the CHANGE_LOG.md for github release page (#2552) | 2 years ago
Frank Lee | 578374d0de | [doc] fixed the typo in pr template (#2556) | 2 years ago
Frank Lee | dd14783f75 | [kernel] fixed repeated loading of kernels (#2549) | 2 years ago
Frank Lee | 8438c35a5f | [doc] added pull request template (#2550) | 2 years ago
ver217 | 5b1854309a | [hotfix] fix zero ddp warmup check (#2545) | 2 years ago
oahzxl | fa3d66feb9 | support unet metainfo prop (#2544) | 2 years ago
oahzxl | c4b15661d7 | [autochunk] add benchmark for transformer and alphafold (#2543) | 2 years ago
binmakeswell | 9885ec2b2e | [git] remove invalid submodule (#2540) | 2 years ago
oahzxl | 05671fcb42 | [autochunk] support multi outputs chunk search (#2538) | 2 years ago
YuliangLiu0306 | f477a14f4a | [hotfix] fix autoparallel demo (#2533) | 2 years ago
oahzxl | 63199c6687 | [autochunk] support transformer (#2526) | 2 years ago
HELSON | 6e0faa70e0 | [gemini] add profiler in the demo (#2534) | 2 years ago
Fazzie-Maqianli | df437ca039 | Merge pull request #2532 from Fazziekey/fix | 2 years ago
Fazzie | f35326881c | fix README | 2 years ago
HELSON | a4ed9125ac | [hotfix] fix lightning error (#2529) | 2 years ago
Frank Lee | b55deb0662 | [workflow] only report coverage for changed files (#2524) | 2 years ago
HELSON | 66dfcf5281 | [gemini] update the gpt example (#2527) | 2 years ago
LuGY | ecbad93b65 | [example] Add fastfold tutorial (#2528) | 2 years ago
Frank Lee | af151032f2 | [workflow] fixed the precommit CI (#2525) | 2 years ago
HELSON | b528eea0f0 | [zero] add zero wrappers (#2523) | 2 years ago
Super Daniel | c198c7c0b0 | [hotfix] meta tensor default device. (#2510) | 2 years ago
HELSON | 077a5cdde4 | [zero] fix gradient clipping in hybrid parallelism (#2521) | 2 years ago
Jiarui Fang | fd8d19a6e7 | [example] update lightning dependency for stable diffusion (#2522) | 2 years ago
YuliangLiu0306 | aa0f6686f9 | [autoparallel] accelerate gpt2 training (#2495) | 2 years ago
binmakeswell | a360b9bc44 | [doc] update example link (#2520) | 2 years ago
HELSON | 707b11d4a0 | [gemini] update ddp strict mode (#2518) | 2 years ago
Frank Lee | 0af793836c | [workflow] fixed changed file detection (#2515) | 2 years ago
binmakeswell | a6a10616ec | [doc] update opt and tutorial links (#2509) | 2 years ago
HELSON | 2d1a7dfe5f | [zero] add strict ddp mode (#2508) | 2 years ago
oahzxl | c04f183237 | [autochunk] support parsing blocks (#2506) | 2 years ago
Super Daniel | 35c0c0006e | [utils] lazy init. (#2148) | 2 years ago
oahzxl | 72341e65f4 | [auto-chunk] support extramsa (#3) (#2504) | 2 years ago
Ziyue Jiang | 0f02b8c6e6 | add avg partition (#2483) | 2 years ago
アマデウス | 99d9713b02 | Revert "Update parallel_context.py (#2408)" | 2 years ago
oahzxl | ecccc91f21 | [autochunk] support autochunk on evoformer (#2497) | 2 years ago
Fazzie-Maqianli | 304f1ba124 | Merge pull request #2499 from feifeibear/dev0116_10 | 2 years ago
jiaruifang | 32390cbe8f | add test_ci.sh to dreambooth | 2 years ago
jiaruifang | 7f822a5c45 | Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into dev0116 | 2 years ago
jiaruifang | 025b482dc1 | [example] dreambooth example | 2 years ago
oahzxl | 5db3a5bf42 | [fx] allow control of ckpt_codegen init (#2498) | 2 years ago
Jiarui Fang | e327e95144 | [hotfix] gpt example titans bug #2493 (#2494) | 2 years ago
jiaruifang | e58cc441e2 | polish code and fix dataloader bugs | 2 years ago
jiaruifang | a4b75b78a0 | [hotfix] gpt example titans bug #2493 | 2 years ago
jiaruifang | 8208fd023a | Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into dev0116 | 2 years ago
HELSON | d565a24849 | [zero] add unit testings for hybrid parallelism (#2486) | 2 years ago
binmakeswell | fcc6d61d92 | [example] fix requirements (#2488) | 2 years ago
oahzxl | 4953b4ace1 | [autochunk] support evoformer tracer (#2485) | 2 years ago
YuliangLiu0306 | 67e1912b59 | [autoparallel] support origin activation ckpt on autoprallel system (#2468) | 2 years ago