ColossalAI/colossalai/nn/_ops
Latest commit: Jiarui Fang · d209aff684 · Add FreqAwareEmbeddingBag (#1421) · 2 years ago
cache_embedding/   Add FreqAwareEmbeddingBag (#1421)                                    2 years ago
__init__.py        [colotensor] add Tensor.view op and its unit test (#1343)            2 years ago
_utils.py          [refactor] move process group from _DistSpec to ColoTensor. (#1203)  2 years ago
addmm.py           [colotensor] add Tensor.view op and its unit test (#1343)            2 years ago
element_wise.py    [tensor] add zero_like colo op, important for Optimizer (#1236)      2 years ago
embedding.py       [colotensor] add Tensor.view op and its unit test (#1343)            2 years ago
embedding_bag.py   [colotensor] add Tensor.view op and its unit test (#1343)            2 years ago
layernorm.py       [colotensor] add Tensor.view op and its unit test (#1343)            2 years ago
linear.py          [hotfix] fix shape error in backward when using ColoTensor (#1298)   2 years ago
loss.py            [tensor] fix a assertion in colo_tensor cross_entropy (#1232)        2 years ago
view.py            [colotensor] add Tensor.view op and its unit test (#1343)            2 years ago