ColossalAI/colossalai/nn/layer
Latest commit cbb6436ff0 by DouJS: fix format for dir-[parallel_3d] (#333), 3 years ago
Name               Last commit message                                                                                                                 Age
colossalai_layer   moved env variables to global variables; (#215)                                                                                     3 years ago
moe                Added TPExpert for special situation                                                                                                3 years ago
parallel_1d        fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial     3 years ago
parallel_2d        fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial     3 years ago
parallel_2p5d      fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial     3 years ago
parallel_3d        fix format for dir-[parallel_3d] (#333)                                                                                             3 years ago
parallel_sequence  adapted for sequence parallel (#163)                                                                                                3 years ago
utils              moved env variables to global variables; (#215)                                                                                     3 years ago
vanilla            moved env variables to global variables; (#215)                                                                                     3 years ago
wrapper            Fixed docstring in colossalai (#171)                                                                                                3 years ago
__init__.py        Hotfix/Colossalai layers (#92)                                                                                                      3 years ago
base_layer.py      Migrated project                                                                                                                    3 years ago