ColossalAI/colossalai/nn/layer/colossalai_layer
Latest commit: 61e687831d by zbian, "fixed using zero with tp cannot access weight correctly" (2 years ago)
File              Last commit                                                                      Last update
__init__.py       moved env variables to global variables; (#215)                                  3 years ago
_utils.py         fixed using zero with tp cannot access weight correctly                          2 years ago
dropout.py        fixed using zero with tp cannot access weight correctly                          2 years ago
embedding.py      [model checkpoint] reworked unified layers for ease of save/load states (#593)   3 years ago
linear.py         added skip_bias_add for non-tp linear                                            2 years ago
normalization.py  [TP] allow layernorm without bias (#750)                                         3 years ago
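The files above appear to hold ColossalAI's unified layer wrappers (Linear, LayerNorm, Embedding, Dropout). Below is a minimal usage sketch; it assumes these classes are importable from colossalai.nn and that Linear accepts a skip_bias_add argument and LayerNorm a bias argument, as the commit messages above suggest, but the exact signatures are not verified against any specific release.

    # Hedged sketch: class names and constructor parameters are inferred from
    # the file names and commit messages in this directory, not from a
    # verified ColossalAI release.
    from colossalai.nn import Embedding, LayerNorm, Linear

    hidden_size = 512

    # "added skip_bias_add for non-tp linear": with skip_bias_add enabled, the
    # layer is expected to return the bias separately rather than adding it
    # inside the forward pass.
    proj = Linear(hidden_size, 4 * hidden_size, skip_bias_add=True)

    # "[TP] allow layernorm without bias (#750)": LayerNorm built with no bias term.
    norm = LayerNorm(hidden_size, bias=False)

    # embedding.py provides the unified Embedding wrapper.
    emb = Embedding(num_embeddings=32000, embedding_dim=hidden_size)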