ColossalAI/colossalai/nn
Latest commit: 2827f41898 by Jiarui Fang, "[Gemini] GeminiDPP convert to PyTorch Module. (#2151)", 2 years ago
_ops          [hotfix] add bert test for gemini fwd bwd (#2035)                2 years ago
layer         [kernel] move all symlinks of kernel to `colossalai._C` (#1971)  2 years ago
loss
lr_scheduler
metric        [NFC] polish colossalai/nn/metric/_utils.py code style (#1727)   2 years ago
optimizer     [optimizer] add div_scale for optimizers (#2117)                 2 years ago
parallel      [Gemini] GeminiDPP convert to PyTorch Module. (#2151)            2 years ago
__init__.py   [kernel] added jit warmup (#1792)                                2 years ago
init.py