Fazzie-Maqianli
304f1ba124
Merge pull request #2499 from feifeibear/dev0116_10
[example] check dreambooth example gradient accumulation must be 1
2 years ago
jiaruifang
32390cbe8f
add test_ci.sh to dreambooth
2 years ago
jiaruifang
7f822a5c45
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into dev0116
2 years ago
jiaruifang
025b482dc1
[example] dreambooth example
2 years ago
oahzxl
5db3a5bf42
[fx] allow control of ckpt_codegen init ( #2498 )
* [fx] allow control of ckpt_codegen init
Currently in ColoGraphModule, ActivationCheckpointCodeGen is set automatically in __init__, which prevents any other codegen from being set. So I added an arg to control whether ActivationCheckpointCodeGen is set in __init__.
* code style
* code style
2 years ago
Jiarui Fang
e327e95144
[hotfix] gpt example titans bug #2493 ( #2494 )
2 years ago
jiaruifang
e58cc441e2
polish code and fix dataloader bugs
2 years ago
jiaruifang
a4b75b78a0
[hotfix] gpt example titans bug #2493
2 years ago
jiaruifang
8208fd023a
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into dev0116
2 years ago
HELSON
d565a24849
[zero] add unit testings for hybrid parallelism ( #2486 )
2 years ago
binmakeswell
fcc6d61d92
[example] fix requirements ( #2488 )
2 years ago
oahzxl
4953b4ace1
[autochunk] support evoformer tracer ( #2485 )
Support the full Evoformer tracer, which is a main module of AlphaFold; previously we only supported a simplified version of it.
1. support some evoformer's op in fx
2. support evoformer test
3. add repos for test code
2 years ago
YuliangLiu0306
67e1912b59
[autoparallel] support original activation ckpt on autoparallel system ( #2468 )
2 years ago
Jiarui Fang
3a21485ead
[example] titans for gpt ( #2484 )
2 years ago
jiaruifang
438ea608f3
update readme
2 years ago
jiaruifang
38424db6ff
polish code
2 years ago
jiaruifang
92f65fbbe3
remove license
2 years ago
jiaruifang
315e1433ce
polish readme
2 years ago
jiaruifang
37baea20cb
[example] titans for gpt
2 years ago
jiaruifang
236b4195ff
Merge branch 'main' of https://github.com/hpcaitech/ColossalAI into dev0116
2 years ago
jiaruifang
e64a05b38b
polish code
2 years ago
Jiarui Fang
7c31706227
[CI] add test_ci.sh for palm, opt and gpt ( #2475 )
2 years ago
Jiarui Fang
e4c38ba367
[example] stable diffusion add roadmap ( #2482 )
2 years ago
jiaruifang
9cba38b492
add dummy test_ci.sh
2 years ago
jiaruifang
f78bad21ed
[example] stable diffusion add roadmap
2 years ago
Frank Lee
579dba572f
[workflow] fixed the skip condition of example weekly check workflow ( #2481 )
2 years ago
HELSON
21c88220ce
[zero] add unit test for low-level zero init ( #2474 )
2 years ago
ver217
f525d1f528
[example] update gpt gemini example ci test ( #2477 )
2 years ago
Ziyue Jiang
fef5c949c3
polish pp middleware ( #2476 )
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
2 years ago
HELSON
a5dc4253c6
[zero] polish low level optimizer ( #2473 )
2 years ago
Frank Lee
8b7495dd54
[example] integrate seq-parallel tutorial with CI ( #2463 )
2 years ago
ver217
8e85d2440a
[example] update vit ci script ( #2469 )
* [example] update vit ci script
* [example] update requirements
* [example] update requirements
2 years ago
Jiarui Fang
867c8c2d3a
[zero] low level optim supports ProcessGroup ( #2464 )
2 years ago
Frank Lee
e6943e2d11
[example] integrate autoparallel demo with CI ( #2466 )
* [example] integrate autoparallel demo with CI
* polish code
* polish code
* polish code
* polish code
2 years ago
Frank Lee
14d9299360
[cli] fixed hostname mismatch error ( #2465 )
2 years ago
YuliangLiu0306
c20529fe78
[examples] update autoparallel tutorial demo ( #2449 )
* [examples] update autoparallel tutorial demo
* add test_ci.sh
* polish
* add conda yaml
2 years ago
Haofan Wang
9358262992
Fix false warning in initialize.py ( #2456 )
* Update initialize.py
* pre-commit run check
2 years ago
Frank Lee
32c46e146e
[workflow] automated bdist wheel build ( #2459 )
* [workflow] automated bdist wheel build
* polish workflow
* polish readme
* polish readme
2 years ago
YuliangLiu0306
8221fd7485
[autoparallel] update binary elementwise handler ( #2451 )
* [autoparallel] update binary elementwise handler
* polish
2 years ago
Frank Lee
c9ec5190a0
[workflow] automated the compatibility test ( #2453 )
* [workflow] automated the compatibility test
* polish code
2 years ago
Frank Lee
483efdabc5
[workflow] fixed the on-merge condition check ( #2452 )
2 years ago
Haofan Wang
cfd1d5ee49
[example] fixed seed error in train_dreambooth_colossalai.py ( #2445 )
2 years ago
Frank Lee
ac18a445fa
[example] updated large-batch optimizer tutorial ( #2448 )
* [example] updated large-batch optimizer tutorial
* polish code
* polish code
2 years ago
HELSON
2bfeb24308
[zero] add warning for ignored parameters ( #2446 )
2 years ago
Frank Lee
39163417a1
[example] updated the hybrid parallel tutorial ( #2444 )
* [example] updated the hybrid parallel tutorial
* polish code
2 years ago
HELSON
5521af7877
[zero] fix state_dict and load_state_dict for ddp ignored parameters ( #2443 )
* [ddp] add is_ddp_ignored
[ddp] rename to is_ddp_ignored
* [zero] fix state_dict and load_state_dict
* fix bugs
* [zero] update unit test for ZeroDDP
2 years ago
YuliangLiu0306
2731531bc2
[autoparallel] integrate device mesh initialization into autoparallelize ( #2393 )
* [autoparallel] integrate device mesh initialization into autoparallelize
* add megatron solution
* update gpt autoparallel examples with latest api
* adapt beta value to fit the current computation cost
2 years ago
Frank Lee
c72c827e95
[cli] provided more details if colossalai run fail ( #2442 )
2 years ago
Super Daniel
c41e59e5ad
[fx] allow native ckpt trace and codegen. ( #2438 )
2 years ago
YuliangLiu0306
41429b9b28
[autoparallel] add shard option ( #2423 )
2 years ago