ColossalAI/colossalai/nn/layer

Latest commit: fix format parallel_2p5d (#357) by Yuer867 (4a0f8c2c50), 3 years ago
Name                Last commit                                                                        Last updated
colossalai_layer/   moved env variables to global variables; (#215)                                    3 years ago
moe/                Added TPExpert for special situation                                               3 years ago
parallel_1d/        Qifan formated file ColossalAI\colossalai\nn\layer\parallel_1d\layers.py (#342)   3 years ago
parallel_2d/        fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial   3 years ago
parallel_2p5d/      fix format parallel_2p5d (#357)                                                    3 years ago
parallel_3d/        fix format for dir-[parallel_3d] (#333)                                            3 years ago
parallel_sequence/  adapted for sequence parallel (#163)                                               3 years ago
utils/              flake8 style (#352)                                                                3 years ago
vanilla/            flake8 style (#352)                                                                3 years ago
wrapper/            flake8 style (#352)                                                                3 years ago
__init__.py         Hotfix/Colossalai layers (#92)                                                     3 years ago
base_layer.py       Migrated project                                                                   3 years ago
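
The subpackages above mirror one another: parallel_1d, parallel_2d, parallel_2p5d, parallel_3d, and parallel_sequence each implement the common layer APIs for their tensor-parallel mode, vanilla holds the non-parallel fallbacks, and colossalai_layer dispatches among them based on the configured parallel context. As a rough illustration, here is a minimal sketch of constructing a 1D tensor-parallel linear layer; the Linear1D export and the launch config shown are assumptions from this era of the codebase, not confirmed against this exact revision.

```python
# Minimal sketch (assumptions, unverified against this revision): Linear1D is
# exported from parallel_1d, and colossalai.launch_from_torch initializes the
# 1D tensor-parallel context. Run with e.g. `torchrun --nproc_per_node=2 sketch.py`.
import torch
import colossalai
from colossalai.nn.layer.parallel_1d import Linear1D

# The parallel context must exist before parallel layers are constructed,
# since each rank only materializes its shard of the weight matrix.
colossalai.launch_from_torch(
    config=dict(parallel=dict(tensor=dict(mode='1d', size=2))))

layer = Linear1D(in_features=1024, out_features=4096, bias=True)
out = layer(torch.randn(8, 1024).cuda())  # forward runs on the local shard
```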