45 Commits (258b43317c4a5cafb8d3da0ff63c8843443bc448)

Author SHA1 Message Date
Frank Lee 8518263b80 [test] fixed the triton version for testing (#2608) 2 years ago
HELSON 077a5cdde4 [zero] fix gradient clipping in hybrid parallelism (#2521) 2 years ago
Frank Lee 40d376c566 [setup] support pre-build and jit-build of cuda kernels (#2374) 2 years ago
xyupeng b965585d05 [NFC] polish colossalai/amp/torch_amp/torch_amp.py code style (#2290) 2 years ago
Ziheng Qin 3041014089 [NFC] polish colossalai/amp/naive_amp/grad_scaler/dynamic_grad_scaler.py code style (#2299) 2 years ago
HELSON 5d3a2be3af [amp] add gradient clipping for unit tests (#2283) 2 years ago
YuliangLiu0306 f027ef7913 [hotfix] fix fp16 optimzier bug (#2273) 2 years ago
Jiarui Fang 355ffb386e [builder] unified cpu_optim fused_optim inferface (#2190) 2 years ago
Jiarui Fang d42afd30f8 [builder] runtime adam and fused_optim builder (#2184) 2 years ago
ver217 f8a7148dec [kernel] move all symlinks of kernel to `colossalai._C` (#1971) 2 years ago
Junming Wu 14a0b18305 [NFC] polish colossalai/amp/naive_amp/__init__.py code style (#1905) 2 years ago
LuGY 94329fc139 [NFC] polish colossalai/amp/apex_amp/__init__.py code style (#1853) 2 years ago
zbian 1559a09fb7 [NFC] polish amp.naive_amp.grad_scaler code style 2 years ago
Genghan Zhang b25030cc07 [NFC] polish ./colossalai/amp/torch_amp/__init__.py code style (#1836) 2 years ago
Ziyue Jiang 5da03c936d [NFC] polish colossalai/amp/torch_amp/_grad_scaler.py code style (#1823) 2 years ago
Fazzie-Maqianli 399f84d8f6 [NFC] polish colossalai/amp/naive_amp/_fp16_optimizer.py code style (#1819) 2 years ago
CsRic 9623ec1b02 [NFC] polish colossalai/amp/naive_amp/_utils.py code style (#1816) 2 years ago
ver217 d068af81a3 [doc] update rst and docstring (#1351) 2 years ago
YuliangLiu0306 e27645376d [hotfix]different overflow status lead to communication stuck. (#1175) 2 years ago
Frank Lee 72bd7c696b [amp] included dict for type casting of model output (#1102) 2 years ago
Frank Lee 9fdebadd69 [doc] improved docstring in the amp module (#857) 3 years ago
HELSON 4c4388c46e [hotfix] fix memory leak in zero (#781) 3 years ago
Frank Lee a4e91bc87f [bug] fixed grad scaler compatibility with torch 1.8 (#735) 3 years ago
Jiarui Fang 4d90a7b513 [refactor] zero directory (#724) 3 years ago
Kai Wang (Victor Kai) b0f708dfc1 fix format (#570) 3 years ago
ver217 c5b488edf8 polish amp docstring (#616) 3 years ago
Liang Bowen 2c45efc398 html refactor (#555) 3 years ago
Liang Bowen ec5086c49c Refactored docstring to google style 3 years ago
Jiarui Fang 496cbb0760 [hotfix] fix initialize bug with zero (#442) 3 years ago
Frank Lee 14a7094243 fixed fp16 optimizer none grad bug (#432) 3 years ago
Frank Lee e79ea44247 [fp16] refactored fp16 optimizer (#392) 3 years ago
Kai Wang (Victor Kai) 53bb3bcc0a fix format (#362) 3 years ago
Frank Lee 3d5d64bd10 refactored grad scaler (#338) 3 years ago
Frank Lee 6a3188167c set criterion as optional in colossalai initialize (#336) 3 years ago
Frank Lee e17e54e32a added buffer sync to naive amp model wrapper (#291) 3 years ago
Frank Lee f5ca88ec97 fixed apex import (#227) 3 years ago
アマデウス 9ee197d0e9 moved env variables to global variables; (#215) 3 years ago
HELSON 0f8c7f9804 Fixed docstring in colossalai (#171) 3 years ago
Frank Lee e2089c5c15 adapted for sequence parallel (#163) 3 years ago
puck_WCR 9473a1b9c8 AMP docstring/markdown update (#160) 3 years ago
ver217 96780e6ee4 Optimize pipeline schedule (#94) 3 years ago
ver217 8f02a88db2 add interleaved pipeline, fix naive amp and update pipeline model initializer (#80) 3 years ago
Frank Lee 91c327cb44 fixed zero level 3 dtype bug (#76) 3 years ago
Frank Lee 35813ed3c4 update examples and sphnix docs for the new api (#63) 3 years ago
Frank Lee da01c234e1 Develop/experiments (#59) 3 years ago