ColossalAI/colossalai/nn/layer

Latest commit d7ea63992b by HELSON: [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480), 3 years ago
Name               Last commit                                                       Age
colossalai_layer   moved env variables to global variables; (#215)                   3 years ago
moe                [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)       3 years ago
parallel_1d        fix format (#376)                                                 3 years ago
parallel_2d        fixed padding index issue for vocab parallel embedding layers;
                   updated 3D linear to be compatible with examples in the tutorial  3 years ago
parallel_2p5d      fix format parallel_2p5d (#357)                                   3 years ago
parallel_3d        fix format for dir-[parallel_3d] (#333)                           3 years ago
parallel_sequence  adapted for sequence parallel (#163)                              3 years ago
utils              Added activation offload (#331)                                   3 years ago
vanilla            flake8 style (#352)                                               3 years ago
wrapper            flake8 style (#352)                                               3 years ago
__init__.py        [MOE] changed parallelmode to dist process group (#460)           3 years ago
base_layer.py      Migrated project                                                  3 years ago