ColossalAI/colossalai/nn

Latest commit: binmakeswell 73e9eb13b7, "[NFC] polish colossalai/nn/lr_scheduler/cosine.py code style", 2 years ago
_ops           [embedding] tablewise sharding polish (#1535)                  2 years ago
graph          [NFC] polish doc style for ColoTensor (#1457)                  2 years ago
layer          add gather_output for VocabParallelClassifier1D (#1569)        2 years ago
loss
lr_scheduler   [NFC] polish colossalai/nn/lr_scheduler/cosine.py code style   2 years ago
metric
optimizer      fix nvme docstring (#1450)                                     2 years ago
parallel       [embedding] cache_embedding small improvement (#1564)          2 years ago
__init__.py
init.py        [NFC] polish colossalai/nn/init.py code style (#1292)          2 years ago