Author | Commit | Message | Date
Jiarui Fang | 0aab52301e | [hotfix] fix a bug in model data stats tracing (#655) | 3 years ago
YuliangLiu0306 | ade05a5d83 | [refactor] pipeline, put runtime schedule into engine. (#627) | 3 years ago
HELSON | e5d615aeee | [hotfix] fix bugs in testing (#659) | 3 years ago
    * remove hybrid adam in test_moe_zero_optim
    * fix activation checkpointing and its unitest
Jiarui Fang | 036404ca8a | Revert "[zero] polish init context (#645)" (#657) | 3 years ago
HELSON | b31daed4cf | fix bugs in CPU adam (#633) | 3 years ago
    * add cpu adam counter for all cpu adam
    * fixed updating error in adam kernel
LuGY | 1e2557e801 | [zero] fixed the activation offload (#647) | 3 years ago
    * fixed the activation offload
    * polish
Liang Bowen | 828e465622 | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago
binmakeswell | e0f875a8e2 | [GitHub] Add prefix and label in issue template (#652) | 3 years ago
Jiarui Fang | 67b4928244 | [zero] polish init context (#645) | 3 years ago
ver217 | f5d3a9c2b0 | polish checkpoint docstring (#637) | 3 years ago
HELSON | 055fbf5be6 | [zero] adapt zero for unsharded paramters (Optimizer part) (#601) | 3 years ago
KAIYUAN GAN | 229382c844 | [NFC] polish colossalai/kernel/cuda_native/csrc/kernels/cuda_util.cu code stype (#625) | 3 years ago
アマデウス | 354b7954d1 | [model checkpoint] added unit tests for checkpoint save/load (#599) | 3 years ago
アマデウス | 28b515d610 | [model checkpoint] updated checkpoint hook (#598) | 3 years ago
アマデウス | 77ad24bf94 | [model checkpoint] updated saving/loading for 3d layers (#597) | 3 years ago
アマデウス | 93089ed708 | [model checkpoint] updated saving/loading for 2.5d layers (#596) | 3 years ago
アマデウス | 6302069c0e | [model checkpoint] updated communication ops for cpu tensors (#590) | 3 years ago
アマデウス | c50bfb807b | [model checkpoint] updated saving/loading for 1d layers (#594) | 3 years ago
アマデウス | 7636d518e1 | [model checkpoint] updated saving/loading for 2d layers (#595) | 3 years ago
アマデウス | cd13b63832 | [model checkpoint] reworked unified layers for ease of save/load states (#593) | 3 years ago
アマデウス | acae68eb04 | [model checkpoint] updated checkpoint save/load utils (#592) | 3 years ago
Ziyue Jiang | 1c40ee8749 | [TP] add assert for tp1d (#621) | 3 years ago
ver217 | 369a288bf3 | polish utils docstring (#620) | 3 years ago
ver217 | e619a651fb | polish optimizer docstring (#619) | 3 years ago
ver217 | 8432dc7080 | polish moe docsrting (#618) | 3 years ago
ver217 | c5b488edf8 | polish amp docstring (#616) | 3 years ago
ver217 | f69507dd22 | update rst (#615) | 3 years ago
FredHuang99 | 93f14d2a33 | [zero] test zero tensor utils (#609) | 3 years ago
ver217 | 0ef8819c67 | polish docstring of zero (#612) | 3 years ago
LuGY | 02b187c14f | [zero] add sampling time for memstats collector (#610) | 3 years ago
ver217 | 9bee119104 | [hotfix] fix sharded optim zero grad (#604) | 3 years ago
    * fix sharded optim zero grad
    * polish comments
アマデウス | 297b8baae2 | [model checkpoint] add gloo groups for cpu tensor communication (#589) | 3 years ago
アマデウス | 54e688b623 | moved ensure_path_exists to utils.common (#591) | 3 years ago
Jiarui Fang | e956d93ac2 | [refactor] memory utils (#577) | 3 years ago
ver217 | 104cbbb313 | [hotfix] add hybrid adam to __init__ (#584) | 3 years ago
HELSON | e6d50ec107 | [zero] adapt zero for unsharded parameters (#561) | 3 years ago
    * support existing sharded and unsharded parameters in zero
    * add unitest for moe-zero model init
    * polish moe gradient handler
LuGY | 13ed4b6441 | [model zoo] add activation offload for gpt model (#582) | 3 years ago
Wesley | 46c9ba33da | update code format | 3 years ago
Wesley | 666cfd094a | fix parallel_input flag for Linear1D_Col gather_output | 3 years ago
BoxiangW | a9f778f1b1 | [tool] create .clang-format for pre-commit (#578) | 3 years ago
    Change the clang-format style to google style
ver217 | 7c6c427db1 | [zero] trace states of fp16/32 grad and fp32 param (#571) | 3 years ago
Jiarui Fang | 7675366fce | [polish] rename col_attr -> colo_attr (#558) | 3 years ago
Liang Bowen | 2c45efc398 | html refactor (#555) | 3 years ago
Jiarui Fang | d1211148a7 | [utils] update colo tensor moving APIs (#553) | 3 years ago
LuGY | c44d797072 | [docs] updatad docs of hybrid adam and cpu adam (#552) | 3 years ago
ver217 | 014bac0c49 | [zero] hijack p.grad in sharded model (#554) | 3 years ago
    * hijack p.grad in sharded model
    * polish comments
Jiarui Fang | f552b11294 | [zero] label state for param fp16 and grad (#551) | 3 years ago
github-actions[bot] | 92f4224867 | Automated submodule synchronization (#501) | 3 years ago
Jiarui Fang | 214da761d4 | [zero] add stateful tensor (#549) | 3 years ago
Jiarui Fang | 107b99ddb1 | [zero] dump memory stats for sharded model (#548) | 3 years ago