ColossalAI/colossalai/nn/_ops

Latest commit: ab54fed292 "[hotfix] add kwargs for colo_addmm (#2171)" by Tongping Liu, 2 years ago
__init__.py       [Gemini] patch for supporting orch.add_ function for ColoTensor (#2003)   2 years ago
_utils.py         [embedding] tablewise sharding polish (#1535)                             2 years ago
addmm.py          [hotfix] add kwargs for colo_addmm (#2171)                                2 years ago
batch_norm.py     [Gemini] patch for supporting orch.add_ function for ColoTensor (#2003)   2 years ago
element_wise.py   [hotfix] add bert test for gemini fwd bwd (#2035)                         2 years ago
embedding.py      [NFC] polish colossalai/nn/_ops/embedding.py code style (#1561)           2 years ago
embedding_bag.py  [NFC] polish colossalai/nn/_ops/embedding_bag.py code style (#1552)       2 years ago
layernorm.py      [NFC] polish colossalai/nn/_ops/layernorm.py code style (#1555)           2 years ago
linear.py         [tensor] added linear implementation for the new sharding spec (#1416)    2 years ago
loss.py
view.py
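Each module in this directory supplies a ColoTensor-aware implementation of one torch operator (addmm, batch norm, element-wise ops, embeddings, layernorm, linear, loss, view). Below is a minimal sketch of the general mechanism such per-op overrides rely on: PyTorch's `__torch_function__` protocol, which lets a tensor subclass intercept torch operators and reroute them to custom implementations. The names `MyColoTensor`, `_OP_TABLE`, `register_op`, and `colo_addmm` are hypothetical illustrations, not ColossalAI's actual API; the real files here go through ColossalAI's own registration machinery.

```python
import torch

# Hypothetical registry mapping torch ops to custom implementations,
# analogous in spirit to the per-op modules in this directory.
_OP_TABLE = {}

def register_op(torch_op):
    """Decorator: route `torch_op` to the decorated function for MyColoTensor."""
    def wrapper(impl):
        _OP_TABLE[torch_op] = impl
        return impl
    return wrapper

class MyColoTensor(torch.Tensor):
    """Toy tensor subclass. PyTorch invokes __torch_function__ whenever a
    torch operator receives an instance of this class as an argument."""

    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        kwargs = kwargs or {}
        if func in _OP_TABLE:
            # Forwarding **kwargs here matters: silently dropping them is the
            # kind of bug the "[hotfix] add kwargs for colo_addmm (#2171)"
            # commit above addresses.
            return _OP_TABLE[func](*args, **kwargs)
        # Fall back to PyTorch's default behavior for unregistered ops.
        return super().__torch_function__(func, types, args, kwargs)

@register_op(torch.addmm)
def colo_addmm(input, mat1, mat2, *, beta=1, alpha=1):
    # A real sharded implementation would reconcile sharding specs and
    # communicate between ranks; this sketch unwraps to plain tensors
    # (so the inner call does not re-dispatch) and computes densely.
    def unwrap(t):
        return t.as_subclass(torch.Tensor) if isinstance(t, MyColoTensor) else t
    return torch.addmm(unwrap(input), unwrap(mat1), unwrap(mat2),
                       beta=beta, alpha=alpha)

# Usage: any torch.addmm call involving a MyColoTensor is rerouted.
x = torch.randn(2, 3).as_subclass(MyColoTensor)
w = torch.randn(3, 4)
b = torch.randn(2, 4)
out = torch.addmm(b, x, w, beta=1, alpha=1)  # dispatches through colo_addmm
```

The registry-plus-decorator layout mirrors how a directory like this one stays maintainable: one file per operator, each registering itself against the torch function it replaces, with the subclass's `__torch_function__` acting as the single dispatch point.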