ColossalAI/colossalai/nn/layer

Latest commit f8a7148dec by ver217: [kernel] move all symlinks of kernel to `colossalai._C` (#1971), 2 years ago
Name                Last commit                                                                                 Age
colossalai_layer/   added skip_bias_add for non-tp linear                                                       2 years ago
moe/                [kernel] move all symlinks of kernel to `colossalai._C` (#1971)                             2 years ago
parallel_1d/        [tensorparallel] fixed tp layers (#1938)                                                    2 years ago
parallel_2d/        [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)   2 years ago
parallel_2p5d/      [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)   2 years ago
parallel_3d/        [tensorparallel] fixed tp layers (#1938)                                                    2 years ago
parallel_sequence/  [NFC] polish colossalai/nn/layer/parallel_sequence/layers.py code style (#1280)             2 years ago
utils/              [NFC] polish colossalai/nn/layer/utils/common.py code style (#983)                          3 years ago
vanilla/            added skip_bias_add for non-tp linear                                                       2 years ago
wrapper/            [NFC] polish colossalai/nn/layer/wrapper/pipeline_wrapper.py code style (#1303)             2 years ago
__init__.py
base_layer.py       [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)   2 years ago
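Two entries above (colossalai_layer/ and vanilla/) reference a `skip_bias_add` option for the non-tensor-parallel linear layer. As a minimal, hypothetical sketch of that general pattern only (the class name, signature, and defaults below are assumptions, not ColossalAI's actual `colossalai.nn` API): a linear layer can optionally return its bias separately instead of adding it, so a downstream kernel can fuse the bias add with the activation.

```python
import math

import torch
import torch.nn.functional as F
from torch import nn


class LinearSkipBiasAdd(nn.Module):
    """Illustrative (not ColossalAI's) linear layer with optional deferred bias.

    When skip_bias_add=True, forward() returns (output, bias) without adding
    the bias, so the caller can fuse bias + activation into one elementwise op.
    """

    def __init__(self, in_features: int, out_features: int, skip_bias_add: bool = False):
        super().__init__()
        self.skip_bias_add = skip_bias_add
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))

    def forward(self, x: torch.Tensor):
        if self.skip_bias_add:
            # Defer the bias: return it alongside the matmul result.
            return F.linear(x, self.weight), self.bias
        return F.linear(x, self.weight, self.bias)


# Usage: fuse the deferred bias with GELU in a single elementwise pass.
layer = LinearSkipBiasAdd(16, 32, skip_bias_add=True)
out, bias = layer(torch.randn(4, 16))
activated = F.gelu(out + bias)
```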