ColossalAI/colossalai/nn/layer/moe
Latest commit 26b7aac0be by ver217, 2023-04-04 13:48:16 +08:00
[zero] reorganize zero/gemini folder structure (#3424)
* [zero] refactor low-level zero folder structure
* [zero] fix legacy zero import path
* [zero] fix legacy zero import path
* [zero] remove useless import
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] fix test import path
* [zero] fix test
* [zero] fix circular import
* [zero] update import
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | [moe] add checkpoint for moe models (#3354) | 2023-03-31 09:20:33 +08:00 |
| _operation.py | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2023-01-06 20:50:26 +08:00 |
| checkpoint.py | [moe] add checkpoint for moe models (#3354) | 2023-03-31 09:20:33 +08:00 |
| experts.py | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00 |
| layers.py | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00 |
| routers.py | [moe] fix moe bugs (#1633) | 2022-09-23 15:33:57 +08:00 |
| utils.py | [gemini] add GeminiMemoryManger (#832) | 2022-04-24 13:08:48 +08:00 |
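Judging from the file names, this directory holds ColossalAI's Mixture-of-Experts (MoE) layer module: `routers.py` presumably implements token-to-expert routing, `experts.py` the expert networks, `layers.py` the composed MoE layers, `checkpoint.py` checkpointing for MoE models, and `_operation.py` the custom CUDA/communication ops. As a rough orientation only, the sketch below shows how a generic top-1-routed MoE layer composes a router with a set of expert FFNs. It is not ColossalAI's actual API; the names `SimpleRouter`, `SimpleMoE`, `hidden_size`, and `num_experts` are hypothetical.

```python
# Illustrative sketch of a generic top-1 MoE layer (NOT ColossalAI's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleRouter(nn.Module):
    """Scores each token and picks one expert per token (top-1 gating)."""

    def __init__(self, hidden_size: int, num_experts: int):
        super().__init__()
        self.gate = nn.Linear(hidden_size, num_experts)

    def forward(self, tokens: torch.Tensor):
        logits = self.gate(tokens)                  # (tokens, num_experts)
        probs = F.softmax(logits, dim=-1)
        weight, expert_idx = probs.max(dim=-1)      # top-1 expert per token
        return weight, expert_idx


class SimpleMoE(nn.Module):
    """Dispatches tokens to per-expert FFNs and combines the weighted outputs."""

    def __init__(self, hidden_size: int, num_experts: int):
        super().__init__()
        self.router = SimpleRouter(hidden_size, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(hidden_size, 4 * hidden_size),
                nn.GELU(),
                nn.Linear(4 * hidden_size, hidden_size),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        tokens = x.reshape(-1, x.size(-1))          # flatten to (tokens, hidden)
        weight, expert_idx = self.router(tokens)
        out = torch.zeros_like(tokens)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i                  # tokens routed to expert i
            if mask.any():
                out[mask] = weight[mask].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)


if __name__ == "__main__":
    moe = SimpleMoE(hidden_size=16, num_experts=4)
    y = moe(torch.randn(2, 8, 16))                  # (batch, seq, hidden)
    print(y.shape)                                  # torch.Size([2, 8, 16])
```

In a distributed setting such as ColossalAI's, the per-token dispatch in the loop above would typically be replaced by all-to-all communication across expert-parallel ranks; the sketch keeps everything on a single device for clarity.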