ColossalAI/colossalai/nn
Latest commit: ab54fed292 by Tongping Liu, [hotfix] add kwargs for colo_addmm (#2171), 2 years ago
_ops           [hotfix] add kwargs for colo_addmm (#2171)                            2 years ago
layer          [hotfix] Jit type hint #2161 (#2164)                                  2 years ago
loss           [NFC] polish colossalai/nn/loss/loss_2p5d.py code style (#1553)       2 years ago
lr_scheduler   [NFC] polish colossalai/nn/lr_scheduler/linear.py code style (#1716)  2 years ago
metric         [NFC] polish colossalai/nn/metric/_utils.py code style (#1727)        2 years ago
optimizer      [optimizer] add div_scale for optimizers (#2117)                      2 years ago
parallel       [Gemini] GeminiDPP convert to PyTorch Module. (#2151)                 2 years ago
__init__.py    [kernel] added jit warmup (#1792)                                     2 years ago
init.py        [NFC] polish colossalai/nn/init.py code style (#1292)                 2 years ago
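For orientation, the minimal sketch below shows how components from two of the subpackages listed above (optimizer and lr_scheduler) are typically wired together. It assumes that HybridAdam and CosineAnnealingWarmupLR are exported from colossalai.nn.optimizer and colossalai.nn.lr_scheduler at this commit; verify the exact names and signatures against the package's __init__.py files before relying on them.

```python
# Minimal sketch: combining an optimizer and LR scheduler from colossalai.nn.
# HybridAdam and CosineAnnealingWarmupLR are assumed exports of the
# optimizer/ and lr_scheduler/ subpackages shown in the listing above.
import torch
import torch.nn as nn

from colossalai.nn.optimizer import HybridAdam
from colossalai.nn.lr_scheduler import CosineAnnealingWarmupLR

model = nn.Linear(128, 10)                       # any torch.nn.Module
optimizer = HybridAdam(model.parameters(), lr=1e-3)
scheduler = CosineAnnealingWarmupLR(optimizer, total_steps=1000, warmup_steps=100)

for step in range(1000):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 128)).sum()      # placeholder forward/loss
    loss.backward()
    optimizer.step()
    scheduler.step()                             # advance warmup/cosine schedule
```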