ColossalAI/colossalai/nn
Latest commit abba4d84e1 by HELSON: [hotfix] fix bert model test in unitests (#1272), 2 years ago
_ops          [hotfix] fix bert model test in unitests (#1272)  2 years ago
graph         [refactor] move process group from _DistSpec to ColoTensor. (#1203)  2 years ago
layer
loss          [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230)  2 years ago
lr_scheduler  [NFC] polish <colossalai/nn/lr_scheduler/poly.py> code style (#1267)  2 years ago
metric
optimizer     [optim] refactor fused sgd (#1134)  2 years ago
parallel      [tensor] a shorter shard and replicate spec (#1245)  2 years ago
__init__.py
init.py