ColossalAI/colossalai/nn/layer

Latest commit: a445e118cf by Jiarui Fang, "[polish] polish singleton and global context (#500)", 3 years ago
colossalai_layer/
moe/                 [polish] polish singleton and global context (#500)   3 years ago
parallel_1d/         fix format (#376)                                     3 years ago
parallel_2d/
parallel_2p5d/
parallel_3d/
parallel_sequence/
utils/
vanilla/
wrapper/
__init__.py          [MOE] changed parallelmode to dist process group (#460)   3 years ago
base_layer.py