ColossalAI/colossalai/nn

Latest commit: 986f8cbaa7 by Jiarui Fang, "[inference] overlap comm and compute in Linear1D_Row when stream_chunk_num > 1" (#1876), 2 years ago
_ops          [hotfix] fix colotensor.type() raise NotImplementedError (#1682)  2 years ago
graph
layer         [inference] overlap comm and compute in Linear1D_Row when stream_chunk_num > 1 (#1876)  2 years ago
loss          [NFC] polish colossalai/nn/loss/loss_2p5d.py code style (#1553)  2 years ago
lr_scheduler  [NFC] polish colossalai/nn/lr_scheduler/linear.py code style (#1716)  2 years ago
metric        [NFC] polish colossalai/nn/metric/_utils.py code style (#1727)  2 years ago
optimizer     add optimizer README for tutorials (#1707)  2 years ago
parallel      [Gemini] make gemini usage simple (#1821)  2 years ago
__init__.py   [kernel] added jit warmup (#1792)  2 years ago
init.py
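The commit on the `layer` directory (#1876) names a common tensor-parallel optimization: when `stream_chunk_num > 1`, `Linear1D_Row` splits its work into chunks so that the all-reduce of one chunk overlaps with the matmul of the next. Below is a minimal, hypothetical sketch of that pattern in plain PyTorch; the function name `row_parallel_linear`, the shapes, and the `torchrun` launch are illustrative assumptions, not ColossalAI's actual implementation.

```python
import torch
import torch.distributed as dist


def row_parallel_linear(x, weight_shard, stream_chunk_num=2):
    """Row-parallel linear: each rank holds a shard of the weight along the
    input dimension, so local matmuls produce partial sums that must be
    all-reduced across the tensor-parallel group.

    Sketch of the overlap pattern suggested by commit #1876: the weight
    shard is split into chunks, each chunk's matmul runs on its own CUDA
    stream, and the all-reduce of one chunk overlaps the next matmul.
    """
    # Split the weight shard along the output dimension so each chunk
    # yields an independent slice of the output features.
    w_chunks = weight_shard.chunk(stream_chunk_num, dim=0)  # (out, in/world)
    streams = [torch.cuda.Stream() for _ in range(stream_chunk_num)]

    # The inputs were produced on the current stream; make the side
    # streams wait for them before reading.
    current = torch.cuda.current_stream()
    for stream in streams:
        stream.wait_stream(current)

    outputs, handles = [], []
    for w, stream in zip(w_chunks, streams):
        with torch.cuda.stream(stream):
            y = x @ w.t()  # partial result for this chunk on this rank
            # async_op=True returns a handle immediately, so this chunk's
            # all-reduce can proceed while the next chunk's matmul runs.
            handles.append(dist.all_reduce(y, async_op=True))
            outputs.append(y)

    for handle in handles:
        handle.wait()  # join every in-flight all-reduce
    torch.cuda.synchronize()
    return torch.cat(outputs, dim=-1)  # reassemble the full output features


if __name__ == "__main__":
    # Assumes a launch like `torchrun --nproc_per_node=<N> this_script.py`.
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())
    world = dist.get_world_size()
    in_features, out_features = 1024, 1024
    x = torch.randn(8, in_features // world, device="cuda")
    w_shard = torch.randn(out_features, in_features // world, device="cuda")
    out = row_parallel_linear(x, w_shard, stream_chunk_num=2)
    print(out.shape)  # torch.Size([8, 1024])
    dist.destroy_process_group()
```

With a single stream the all-reduce serializes behind every matmul; chunking trades one large collective for several smaller ones that can hide behind the remaining compute, which is the point of raising stream_chunk_num above 1.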