ColossalAI/colossalai/nn/layer
Latest commit: 622f863291 by アマデウス, "[hotfix] Jit type hint #2161 (#2164)", 2 years ago
colossalai_layer    added skip_bias_add for non-tp linear                                                      2 years ago
moe                 [kernel] move all symlinks of kernel to `colossalai._C` (#1971)                            2 years ago
parallel_1d         [tensorparallel] fixed tp layers (#1938)                                                   2 years ago
parallel_2d         [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)  2 years ago
parallel_2p5d       [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)  2 years ago
parallel_3d         [hotfix] Jit type hint #2161 (#2164)                                                       2 years ago
parallel_sequence
utils
vanilla             added skip_bias_add for non-tp linear                                                      2 years ago
wrapper
__init__.py
base_layer.py       [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)  2 years ago