ColossalAI/colossalai/nn

Latest commit: ae1b58cd16 by Frank Lee, "[tensor] added linear implementation for the new sharding spec (#1416)", 2 years ago
Name           Last commit message                                                               Last commit time
_ops           [tensor] added linear implementation for the new sharding spec (#1416)            2 years ago
graph
layer          [NFC] polish colossalai/nn/layer/wrapper/pipeline_wrapper.py code style (#1303)  2 years ago
loss           [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230)                   2 years ago
lr_scheduler   [NFC] polish colossalai/nn/lr_scheduler/onecycle.py code style (#1269)            2 years ago
metric
optimizer      [hotfix] fix CPUAdam kernel nullptr (#1410)                                       2 years ago
parallel       [FAW] export FAW in _ops (#1438)                                                  2 years ago
__init__.py
init.py        [NFC] polish colossalai/nn/init.py code style (#1292)                             2 years ago