ColossalAI/colossalai/nn/layer
Last commit 26b7aac0be by ver217: [zero] reorganize zero/gemini folder structure (#3424)
* [zero] refactor low-level zero folder structure
* [zero] fix legacy zero import path
* [zero] fix legacy zero import path
* [zero] remove useless import
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] fix test import path
* [zero] fix test
* [zero] fix circular import
* [zero] update import
2023-04-04 13:48:16 +08:00
colossalai_layer | fixed using zero with tp cannot access weight correctly | 2023-02-28 10:52:30 +08:00
moe | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00
parallel_1d | [tensorparallel] fixed tp layers (#1938) | 2022-11-14 17:34:03 +08:00
parallel_2d | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2022-09-06 20:18:35 +08:00
parallel_2p5d | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2022-09-06 20:18:35 +08:00
parallel_3d | improved allgather & reducescatter for 3d | 2023-01-03 17:46:08 +08:00
parallel_sequence
utils
vanilla | added skip_bias_add for non-tp linear | 2022-11-09 15:41:08 +08:00
wrapper | [NFC] polish colossalai/nn/layer/wrapper/pipeline_wrapper.py code style (#1303) | 2022-07-13 19:01:07 +08:00
__init__.py
base_layer.py | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2022-09-06 20:18:35 +08:00