ColossalAI/colossalai/nn

Latest commit: ccf3c58c89 by ver217: embedding op use gather_out (#1143), 2 years ago
Name          Last commit                                                                          Last updated
_ops          embedding op use gather_out (#1143)                                                  2 years ago
layer         [pipeline] refactor the pipeline module (#1087)                                      3 years ago
loss          [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   3 years ago
lr_scheduler  Refactored docstring to google style                                                 3 years ago
metric        [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   3 years ago
optimizer     [optim] refactor fused sgd (#1134)                                                   2 years ago
parallel      [zero] avoid zero hook spam by changing log to debug level (#1137)                   2 years ago
__init__.py   [pipeline] refactor the pipeline module (#1087)                                      3 years ago
init.py       Refactored docstring to google style                                                 3 years ago