Frank Lee | 80eba05b0a | [test] refactor tests with spawn (#3452) | 2 years ago
* [test] added spawn decorator
* polish code
* polish code
* polish code
* polish code
* polish code
* polish code

ver217 | 933048ad3e | [test] reorganize zero/gemini tests (#3445) | 2 years ago

ver217 | 26b7aac0be | [zero] reorganize zero/gemini folder structure (#3424) | 2 years ago
* [zero] refactor low-level zero folder structure
* [zero] fix legacy zero import path
* [zero] fix legacy zero import path
* [zero] remove useless import
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] fix test import path
* [zero] fix test
* [zero] fix circular import
* [zero] update import

HELSON | a088022efc | [moe] fix moe bugs (#1633) | 2 years ago

HELSON | f7f2248771 | [moe] fix MoE bugs (#1628) | 2 years ago
* remove forced FP32 modules
* correct no_shard-contexts' positions

Frank Lee | 5a1a095b92 | [test] refactored with the new rerun decorator (#763) | 3 years ago
* [test] refactored with the new rerun decorator
* polish test case

HELSON | 22c4b88d56 | [zero] refactor ShardedParamV2 for convenience (#742) | 3 years ago

Jiarui Fang | 53cb584808 | [utils] correct cpu memory used and capacity in the context of multi-process (#726) | 3 years ago

HELSON | b9b469ea50 | [moe] add checkpoint for moe zero test (#729) | 3 years ago

Jiarui Fang | 193dc8dacb | [refactor] refactor the memory utils (#715) | 3 years ago

HELSON | a9b8300d54 | [zero] improve adaptability for not-shard parameters (#708) | 3 years ago
* adapt post grad hooks for not-shard parameters
* adapt optimizer for not-shard parameters
* offload gradients for not-replicated parameters

HELSON | ee112fe1da | [zero] adapt zero hooks for unsharded module (#699) | 3 years ago

HELSON | d7ecaf362b | [zero] fix init bugs in zero context (#686) | 3 years ago
* adapt model weight initialization for methods in PyTorch nn.init

Jiarui Fang | 0aab52301e | [hotfix] fix a bug in model data stats tracing (#655) | 3 years ago

HELSON | 055fbf5be6 | [zero] adapt zero for unsharded parameters (Optimizer part) (#601) | 3 years ago

HELSON | e6d50ec107 | [zero] adapt zero for unsharded parameters (#561) | 3 years ago
* support existing sharded and unsharded parameters in zero
* add unit test for moe-zero model init
* polish moe gradient handler

Jiarui Fang | 7675366fce | [polish] rename col_attr -> colo_attr (#558) | 3 years ago

HELSON | 8c90d4df54 | [zero] add zero context manager to change config during initialization (#546) | 3 years ago