ColossalAI/colossalai/nn/layer/colossalai_layer

Latest commit: b8899e0905 by アマデウス · [TP] allow layernorm without bias (#750) · 3 years ago
| File | Last commit message | Age |
| --- | --- | --- |
| __init__.py | moved env variables to global variables; (#215) | 3 years ago |
| _utils.py | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago |
| dropout.py | [model checkpoint] reworked unified layers for ease of save/load states (#593) | 3 years ago |
| embedding.py | [model checkpoint] reworked unified layers for ease of save/load states (#593) | 3 years ago |
| linear.py | [model checkpoint] reworked unified layers for ease of save/load states (#593) | 3 years ago |
| normalization.py | [TP] allow layernorm without bias (#750) | 3 years ago |