ColossalAI/colossalai/nn/layer

Latest commit: 828e465622 by Liang Bowen, "[hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)", 3 years ago
| Name | Last commit | Age |
| --- | --- | --- |
| colossalai_layer | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago |
| moe | polish moe docsrting (#618) | 3 years ago |
| parallel_1d | [model checkpoint] updated saving/loading for 1d layers (#594) | 3 years ago |
| parallel_2d | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago |
| parallel_2p5d | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago |
| parallel_3d | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago |
| parallel_sequence | Refactored docstring to google style | 3 years ago |
| utils | Refactored docstring to google style | 3 years ago |
| vanilla | Refactored docstring to google style | 3 years ago |
| wrapper | Refactored docstring to google style | 3 years ago |
| __init__.py | [MOE] changed parallelmode to dist process group (#460) | 3 years ago |
| base_layer.py | [model checkpoint] reworked unified layers for ease of save/load states (#593) | 3 years ago |
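The hotfix referenced in several entries above raises explicit error messages when a batch size cannot be evenly split across tensor-parallel ranks. A minimal sketch of what such a guard looks like; the function name, message text, and call site are illustrative assumptions, not the actual ColossalAI implementation:

```python
def assert_divisible(numerator: int, denominator: int, label: str) -> None:
    """Hypothetical helper: fail early with a readable message instead of a
    cryptic shape-mismatch error deep inside a tensor-parallel layer."""
    if numerator % denominator != 0:
        raise ValueError(
            f"{label} ({numerator}) is not divisible by the "
            f"tensor parallel size ({denominator})."
        )

# A batch of 32 splits evenly across 4 tensor-parallel ranks; 30 does not.
assert_divisible(32, 4, "batch size")
```

Checks of this kind typically run once per forward pass, before the tensor is partitioned, so misconfigured batch sizes surface as a clear `ValueError` rather than a downstream reshape failure.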