ColossalAI/colossalai/nn

Latest commit: 0e06f62160 by Geng Zhang, 2 years ago: [NFC] polish colossalai/nn/layer/parallel_sequence/_operation.py code style (#1266)
| Entry | Last commit | Age |
| --- | --- | --- |
| `_ops` | [hotfix] fix bert model test in unitests (#1272) | 2 years ago |
| `graph` | [refactor] move process group from _DistSpec to ColoTensor. (#1203) | 2 years ago |
| `layer` | [NFC] polish colossalai/nn/layer/parallel_sequence/_operation.py code style (#1266) | 2 years ago |
| `loss` | [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230) | 2 years ago |
| `lr_scheduler` | [NFC] polish colossalai/nn/lr_scheduler/__init__.py (#1255) | 2 years ago |
| `metric` | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago |
| `optimizer` | [optim] refactor fused sgd (#1134) | 2 years ago |
| `parallel` | [tensor] a shorter shard and replicate spec (#1245) | 2 years ago |
| `__init__.py` | [pipeline] refactor the pipeline module (#1087) | 2 years ago |
| `init.py` | [NFC] polish colossalai/nn/init.py code style (#1292) | 2 years ago |
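For orientation, below is a minimal sketch of how these subpackages might be pulled together in a training script. The class names `FusedSGD` and `LinearWarmupLR` are assumptions inferred from the `optimizer/` and `lr_scheduler/` entries above (e.g. the "refactor fused sgd (#1134)" commit), not verified exports of this revision; check the actual `__init__.py` files before relying on them.

```python
# Sketch only: subpackage paths mirror the directory listing above.
# FusedSGD and LinearWarmupLR are assumed names, not verified API.
import torch
from colossalai.nn.optimizer import FusedSGD           # fused optimizer (optimizer/)
from colossalai.nn.lr_scheduler import LinearWarmupLR  # warmup scheduler (lr_scheduler/)

model = torch.nn.Linear(128, 10).cuda()  # fused optimizer kernels assume CUDA tensors
optimizer = FusedSGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = LinearWarmupLR(optimizer, total_steps=1_000, warmup_steps=100)

for step in range(1_000):
    optimizer.zero_grad()
    loss = model(torch.randn(32, 128, device="cuda")).sum()
    loss.backward()
    optimizer.step()   # fused parameter update
    scheduler.step()   # advance the warmup/decay schedule
```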