ColossalAI/tests/test_layers
Latest commit: 3dba070580 by zbian, 3 years ago
"fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial"
Directory        Last commit message                                                                                                              Last updated
test_1d          moved env variables to global variables; (#215)                                                                                  3 years ago
test_2d          moved env variables to global variables; (#215)                                                                                  3 years ago
test_2p5d        moved env variables to global variables; (#215)                                                                                  3 years ago
test_3d          fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial  3 years ago
test_sequence    adapted for sequence parallel (#163)                                                                                             3 years ago
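These subdirectories hold the layer tests for the 1D, 2D, 2.5D, 3D, and sequence parallel modes. As a minimal sketch only, one of these suites could be invoked programmatically as shown below; this assumes pytest is the repository's test runner and that a suitable GPU/distributed environment is available, neither of which is confirmed by this listing.

```python
# Minimal sketch: run one of the layer test suites via pytest's Python API.
# Assumptions: pytest is the project's test runner and this is executed from
# the repository root with the required GPU/distributed setup in place.
import sys

import pytest

if __name__ == "__main__":
    # "-v" prints per-test results; the path is one of the directories listed above.
    exit_code = pytest.main(["-v", "tests/test_layers/test_1d"])
    sys.exit(exit_code)
```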