Commit Graph

55 Commits (365671be1043aa7a94e4c014d747d70b0a2f971a)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Hongxin Liu | e5ce4c8ea6 | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| littsk | 83b52c56cd | [feature] Add clip_grad_norm for hybrid_parallel_plugin (#4837) | 1 year ago |
| Baizhou Zhang | c0a033700c | [shardformer] fix master param sync for hybrid plugin/rewrite unwrapping logic (#4758) | 1 year ago |
| Hongxin Liu | 079bf3cb26 | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| Hongxin Liu | b5f9e37c70 | [legacy] clean up legacy code (#4743) | 1 year ago |
| Baizhou Zhang | 0ceec8f9a9 | [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) | 1 year ago |
| Hongxin Liu | 261eab02fb | [plugin] add 3d parallel plugin (#4295) | 1 year ago |
| Hongxin Liu | ae02d4e4f7 | [bf16] add bf16 support (#3882) | 1 year ago |
| digger yu | 32f81f14d4 | [NFC] fix typo colossalai/amp auto_parallel autochunk (#3756) | 2 years ago |
| lucasliunju | 4b95464994 | [NFC] polish colossalai/amp/__init__.py code style (#3272) | 2 years ago |
| Frank Lee | 8518263b80 | [test] fixed the triton version for testing (#2608) | 2 years ago |
| HELSON | 077a5cdde4 | [zero] fix gradient clipping in hybrid parallelism (#2521) | 2 years ago |
| Frank Lee | 40d376c566 | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2 years ago |
| xyupeng | b965585d05 | [NFC] polish colossalai/amp/torch_amp/torch_amp.py code style (#2290) | 2 years ago |
| Ziheng Qin | 3041014089 | [NFC] polish colossalai/amp/naive_amp/grad_scaler/dynamic_grad_scaler.py code style (#2299) | 2 years ago |
| HELSON | 5d3a2be3af | [amp] add gradient clipping for unit tests (#2283) | 2 years ago |
| YuliangLiu0306 | f027ef7913 | [hotfix] fix fp16 optimzier bug (#2273) | 2 years ago |
| Jiarui Fang | 355ffb386e | [builder] unified cpu_optim fused_optim inferface (#2190) | 2 years ago |
| Jiarui Fang | d42afd30f8 | [builder] runtime adam and fused_optim builder (#2184) | 2 years ago |
| ver217 | f8a7148dec | [kernel] move all symlinks of kernel to `colossalai._C` (#1971) | 2 years ago |
| Junming Wu | 14a0b18305 | [NFC] polish colossalai/amp/naive_amp/__init__.py code style (#1905) | 2 years ago |
| LuGY | 94329fc139 | [NFC] polish colossalai/amp/apex_amp/__init__.py code style (#1853) | 2 years ago |
| zbian | 1559a09fb7 | [NFC] polish amp.naive_amp.grad_scaler code style | 2 years ago |
| Genghan Zhang | b25030cc07 | [NFC] polish ./colossalai/amp/torch_amp/__init__.py code style (#1836) | 2 years ago |
| Ziyue Jiang | 5da03c936d | [NFC] polish colossalai/amp/torch_amp/_grad_scaler.py code style (#1823) | 2 years ago |
| Fazzie-Maqianli | 399f84d8f6 | [NFC] polish colossalai/amp/naive_amp/_fp16_optimizer.py code style (#1819) | 2 years ago |
| CsRic | 9623ec1b02 | [NFC] polish colossalai/amp/naive_amp/_utils.py code style (#1816) | 2 years ago |
| ver217 | d068af81a3 | [doc] update rst and docstring (#1351) | 2 years ago |
| YuliangLiu0306 | e27645376d | [hotfix]different overflow status lead to communication stuck. (#1175) | 2 years ago |
| Frank Lee | 72bd7c696b | [amp] included dict for type casting of model output (#1102) | 2 years ago |
| Frank Lee | 9fdebadd69 | [doc] improved docstring in the amp module (#857) | 3 years ago |
| HELSON | 4c4388c46e | [hotfix] fix memory leak in zero (#781) | 3 years ago |
| Frank Lee | a4e91bc87f | [bug] fixed grad scaler compatibility with torch 1.8 (#735) | 3 years ago |
| Jiarui Fang | 4d90a7b513 | [refactor] zero directory (#724) | 3 years ago |
| Kai Wang (Victor Kai) | b0f708dfc1 | fix format (#570) | 3 years ago |
| ver217 | c5b488edf8 | polish amp docstring (#616) | 3 years ago |
| Liang Bowen | 2c45efc398 | html refactor (#555) | 3 years ago |
| Liang Bowen | ec5086c49c | Refactored docstring to google style | 3 years ago |
| Jiarui Fang | 496cbb0760 | [hotfix] fix initialize bug with zero (#442) | 3 years ago |
| Frank Lee | 14a7094243 | fixed fp16 optimizer none grad bug (#432) | 3 years ago |
| Frank Lee | e79ea44247 | [fp16] refactored fp16 optimizer (#392) | 3 years ago |
| Kai Wang (Victor Kai) | 53bb3bcc0a | fix format (#362) | 3 years ago |
| Frank Lee | 3d5d64bd10 | refactored grad scaler (#338) | 3 years ago |
| Frank Lee | 6a3188167c | set criterion as optional in colossalai initialize (#336) | 3 years ago |
| Frank Lee | e17e54e32a | added buffer sync to naive amp model wrapper (#291) | 3 years ago |
| Frank Lee | f5ca88ec97 | fixed apex import (#227) | 3 years ago |
| アマデウス | 9ee197d0e9 | moved env variables to global variables; (#215) | 3 years ago |
| HELSON | 0f8c7f9804 | Fixed docstring in colossalai (#171) | 3 years ago |
| Frank Lee | e2089c5c15 | adapted for sequence parallel (#163) | 3 years ago |
| puck_WCR | 9473a1b9c8 | AMP docstring/markdown update (#160) | 3 years ago |