ColossalAI/colossalai/nn/layer/parallel_2d

Latest commit: zbian 3dba070580 "fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial" (3 years ago)
File           Last commit message                                                                                                              Age
__init__.py    moved env variables to global variables (#215)                                                                                   3 years ago
_operation.py  moved env variables to global variables (#215)                                                                                   3 years ago
_utils.py      moved env variables to global variables (#215)                                                                                   3 years ago
layers.py      fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial  3 years ago
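The latest commit on layers.py mentions a padding-index fix for vocab parallel embedding layers. As a hedged sketch of the underlying idea (this is illustrative only, not ColossalAI's actual code): when the vocabulary is sharded across ranks, a global `padding_idx` is only meaningful on the rank whose vocabulary partition contains it, and it must be translated into that shard's local coordinates; every other rank should apply no padding handling at all.

```python
def local_padding_idx(padding_idx, vocab_start, vocab_end):
    """Map a global padding index into a shard's local index space.

    Hypothetical helper illustrating the vocab-parallel padding issue;
    not ColossalAI's API. The shard owns global vocabulary indices in
    the half-open range [vocab_start, vocab_end).

    Returns the shard-local padding index, or None when this shard's
    partition does not contain the padding index (so the shard skips
    padding handling entirely).
    """
    if padding_idx is None:
        return None
    if vocab_start <= padding_idx < vocab_end:
        # Translate global index into this shard's local coordinates.
        return padding_idx - vocab_start
    # Padding index lives on another shard; do nothing here.
    return None


# Example with a 1000-word vocabulary split across two ranks:
# rank 0 owns [0, 500), rank 1 owns [500, 1000).
print(local_padding_idx(0, 0, 500))      # rank 0 owns index 0 -> local 0
print(local_padding_idx(0, 500, 1000))   # rank 1 does not own it -> None
```

The bug class this guards against is applying the global `padding_idx` unchanged on every shard, which either indexes out of the local embedding table or zeroes the wrong row.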