ColossalAI/colossalai
Latest commit: de0468c7a8 by Jiarui Fang, 2022-03-11 15:50:28 +08:00 — [zero] zero init context (#321)

* add zero init context
* add more flags for zero init context;
  fix a bug where a param was repeatedly converted to ShardedParamV2
* polish code
Name                   Last commit message                                           Date
amp                    added buffer sync to naive amp model wrapper (#291)           2022-03-11 15:50:28 +08:00
builder                add pytorch hooks (#179)                                      2022-01-25 22:20:54 +08:00
communication          Added profiler communication operations                       2022-03-11 15:50:28 +08:00
context                moved env variables to global variables; (#215)               2022-02-15 11:31:13 +08:00
engine                 fix sharded param hook and unit test                          2022-03-11 15:50:28 +08:00
kernel                 [zero] cpu adam kernel (#288)                                 2022-03-11 15:50:28 +08:00
logging                fixed mkdir conflict and align yapf config with flake (#220)  2022-02-15 11:31:13 +08:00
nn                     [zero] cpu adam kernel (#288)                                 2022-03-11 15:50:28 +08:00
registry               add pytorch hooks (#179)                                      2022-01-25 22:20:54 +08:00
trainer                Added profiler communication operations                       2022-03-11 15:50:28 +08:00
utils                  Added profiler communication operations                       2022-03-11 15:50:28 +08:00
zero                   [zero] zero init context (#321)                               2022-03-11 15:50:28 +08:00
__init__.py            Develop/experiments (#59)                                     2021-12-09 15:08:29 +08:00
constants.py           moved env variables to global variables; (#215)               2022-02-15 11:31:13 +08:00
core.py                Develop/experiments (#59)                                     2021-12-09 15:08:29 +08:00
global_variables.py    Optimized MoE layer and fixed some bugs;                      2022-03-11 15:50:28 +08:00
initialize.py          added buffer sync to naive amp model wrapper (#291)           2022-03-11 15:50:28 +08:00