ColossalAI/colossalai/nn/layer

Latest commit: 2b2dc1c86b by Frank Lee, "[pipeline] refactor the pipeline module (#1087)", 3 years ago
Directory contents (all entries last modified 3 years ago):

colossalai_layer/     [TP] allow layernorm without bias (#750)
moe/                  [gemini] add GeminiMemoryManger (#832)
parallel_1d/          [Tensor] 1d row embedding (#1075)
parallel_2d/          [NFC] polish colossalai/nn/layer/parallel_2d/layers.py code style (#976)
parallel_2p5d/        [NFC] polish colossalai/nn/layer/parallel_2p5d/layers.py code style (#972)
parallel_3d/          [NFC] polish colossalai/nn/layer/parallel_3d/layers.py code style (#966)
parallel_sequence/    Refactored docstring to google style
utils/                [NFC] polish colossalai/nn/layer/utils/common.py code style (#983)
vanilla/              [TP] allow layernorm without bias (#750)
wrapper/              [pipeline] refactor the pipeline module (#1087)
__init__.py           [MOE] changed parallelmode to dist process group (#460)
base_layer.py         [model checkpoint] reworked unified layers for ease of save/load states (#593)
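
The parallel_1d/, parallel_2d/, parallel_2p5d/, and parallel_3d/ subpackages hold tensor-parallel variants of common layers, in which weight matrices are sharded across ranks. As a rough illustration of the 1D column-parallel idea only, not ColossalAI's actual code, here is a minimal plain-PyTorch sketch; the class name ColParallelLinear and its constructor signature are hypothetical.

# Minimal, forward-only sketch of a column-parallel linear layer in plain
# PyTorch, illustrating the idea behind the parallel_1d/ package above.
# This is NOT ColossalAI's implementation; the class name and constructor
# signature are hypothetical. Assumes torch.distributed has already been
# initialized (dist.init_process_group) with world_size ranks.
import torch
import torch.distributed as dist
import torch.nn as nn
import torch.nn.functional as F


class ColParallelLinear(nn.Module):
    """Shards the output dimension of a linear layer across ranks."""

    def __init__(self, in_features: int, out_features: int, world_size: int):
        super().__init__()
        assert out_features % world_size == 0, "out_features must divide evenly"
        self.world_size = world_size
        local_out = out_features // world_size
        # Each rank stores only its (out_features / world_size) weight slice.
        self.weight = nn.Parameter(torch.empty(local_out, in_features))
        self.bias = nn.Parameter(torch.zeros(local_out))
        nn.init.kaiming_uniform_(self.weight)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each rank computes its slice of the output features.
        local_y = F.linear(x, self.weight, self.bias)
        # All-gather the slices and concatenate into the full output.
        # (A gradient-aware gather would need an autograd wrapper.)
        shards = [torch.empty_like(local_y) for _ in range(self.world_size)]
        dist.all_gather(shards, local_y)
        return torch.cat(shards, dim=-1)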