ColossalAI/colossalai/nn/layer/colossalai_layer
Latest commit: zbian 653b0a620e | added skip_bias_add for non-tp linear | 2 years ago
__init__.py      | moved env variables to global variables; (#215) | 3 years ago
_utils.py        | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago
dropout.py       | [NFC] polish colossalai/nn/layer/colossalai_layer/dropout.py code style (#1568) | 2 years ago
embedding.py     | [model checkpoint] reworked unified layers for ease of save/load states (#593) | 3 years ago
linear.py        | added skip_bias_add for non-tp linear | 2 years ago
normalization.py | [TP] allow layernorm without bias (#750) | 3 years ago
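
For context on the most recent change to `linear.py`: a `skip_bias_add` flag on a (non-tensor-parallel) linear layer typically means the layer returns its bias separately instead of adding it during the matmul, so a downstream fused kernel (e.g. bias + activation or bias + dropout) can apply it later. The sketch below is a minimal plain-PyTorch illustration of that idea under those assumptions, not ColossalAI's actual implementation; the class name `LinearSkipBiasAdd` and its arguments are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LinearSkipBiasAdd(nn.Module):
    """Illustrative (non-ColossalAI) linear layer with an optional skip_bias_add flag.

    When skip_bias_add=True, the bias is NOT added inside forward(); it is
    returned alongside the output so a later fused op can apply it instead.
    """

    def __init__(self, in_features: int, out_features: int, skip_bias_add: bool = False):
        super().__init__()
        self.skip_bias_add = skip_bias_add
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, x: torch.Tensor):
        if self.skip_bias_add:
            # Return the un-biased matmul output plus the bias for later fusion.
            return F.linear(x, self.weight), self.bias
        return F.linear(x, self.weight, self.bias)


if __name__ == "__main__":
    layer = LinearSkipBiasAdd(16, 32, skip_bias_add=True)
    out, bias = layer(torch.randn(4, 16))
    print(out.shape, bias.shape)  # torch.Size([4, 32]) torch.Size([32])
```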