ColossalAI/colossalai/nn

Latest commit: eda30a058e by Frank Lee, "[compatibility] fixed tensor parallel compatibility with torch 1.9 (#700)", 3 years ago
Name          Last commit                                                                          Last updated
layer         [compatibility] fixed tensor parallel compatibility with torch 1.9 (#700)            3 years ago
loss          [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   3 years ago
lr_scheduler  Refactored docstring to google style                                                 3 years ago
metric        [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   3 years ago
model         Develop/experiments (#59)                                                            3 years ago
optimizer     [zero] improve adaptability for not-shard parameters (#708)                          3 years ago
__init__.py   Layer integration (#83)                                                              3 years ago
init.py       Refactored docstring to google style                                                 3 years ago
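For orientation, the subpackages above map onto the pieces of a typical training setup: layer/ holds tensor-parallel-aware building blocks, loss/ the matching loss functions, optimizer/ fused optimizers, lr_scheduler/ the schedulers, and metric/ evaluation metrics. A minimal sketch of how they compose, assuming the tutorial-era API of this package (col_nn.Linear, col_nn.CrossEntropyLoss, HybridAdam, CosineAnnealingLR); exact class names and signatures may differ between releases:

```python
import torch
import colossalai.nn as col_nn
from colossalai.nn.lr_scheduler import CosineAnnealingLR
from colossalai.nn.optimizer import HybridAdam

# layer/: drop-in modules that shard themselves under tensor parallelism
model = torch.nn.Sequential(
    col_nn.Linear(784, 256),
    torch.nn.ReLU(),
    col_nn.Linear(256, 10),
)

# loss/: losses that understand the tensor-parallel output layout
criterion = col_nn.CrossEntropyLoss()

# optimizer/ and lr_scheduler/: fused optimizer plus a matching scheduler
# (metric/ provides evaluation metrics such as an Accuracy class)
optimizer = HybridAdam(model.parameters(), lr=1e-3)
scheduler = CosineAnnealingLR(optimizer, total_steps=100)
```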