ColossalAI/colossalai/nn
Latest commit: ver217 ccf3c58c89 "embedding op use gather_out (#1143)", 2022-06-21 13:21:20 +08:00
_ops           embedding op use gather_out (#1143)                                                  2022-06-21 13:21:20 +08:00
layer          [pipeline] refactor the pipeline module (#1087)                                      2022-06-10 11:27:38 +08:00
loss           [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   2022-04-02 16:12:04 +08:00
lr_scheduler   Refactored docstring to google style                                                 2022-03-29 17:17:47 +08:00
metric         [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)   2022-04-02 16:12:04 +08:00
optimizer      [optim] refactor fused sgd (#1134)                                                   2022-06-20 11:19:38 +08:00
parallel       [zero] avoid zero hook spam by changing log to debug level (#1137)                   2022-06-21 10:44:01 +08:00
__init__.py    [pipeline] refactor the pipeline module (#1087)                                      2022-06-10 11:27:38 +08:00
init.py        Refactored docstring to google style                                                 2022-03-29 17:17:47 +08:00