ColossalAI/colossalai/nn/layer/colossalai_layer

Latest commit: zbian 61e687831d "fixed using zero with tp cannot access weight correctly" (2 years ago)
..
__init__.py
_utils.py          fixed using zero with tp cannot access weight correctly (2 years ago)
dropout.py         fixed using zero with tp cannot access weight correctly (2 years ago)
embedding.py
linear.py          added skip_bias_add for non-tp linear (2 years ago)
normalization.py
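
Judging only from the file names listed above, this directory appears to hold drop-in layer wrappers (dropout, embedding, linear, normalization) that ColossalAI re-exports for building models. A minimal usage sketch follows, assuming the wrappers are exposed as colossalai.nn.Linear, colossalai.nn.Embedding, colossalai.nn.Dropout, and colossalai.nn.LayerNorm with torch-like signatures; these import paths and signatures are assumptions based on the listing, not confirmed by it, and may differ between ColossalAI versions.

```python
# Sketch only: assumes colossalai.nn re-exports the wrappers implemented in
# this directory (Linear, Embedding, Dropout, LayerNorm) with torch-like
# signatures. Verify against the installed ColossalAI version before use.
import torch
import colossalai.nn as col_nn


class TinyBlock(torch.nn.Module):
    """Toy module composed from the wrapper layers listed in this directory."""

    def __init__(self, vocab_size: int = 1000, hidden: int = 64):
        super().__init__()
        self.embed = col_nn.Embedding(vocab_size, hidden)  # embedding.py
        self.norm = col_nn.LayerNorm(hidden)               # normalization.py
        self.proj = col_nn.Linear(hidden, hidden)          # linear.py
        self.drop = col_nn.Dropout(0.1)                    # dropout.py

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(token_ids)
        x = self.norm(x)
        return self.drop(self.proj(x))
```

The commit note on linear.py ("added skip_bias_add for non-tp linear") suggests the linear wrapper also accepts a skip_bias_add option even when tensor parallelism is not used, though the exact parameter name and behavior should be checked in linear.py itself.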