ColossalAI/colossalai/nn/_ops

Latest commit: ae1b58cd16 by Frank Lee, "[tensor] added linear implementation for the new sharding spec (#1416)", 2 years ago
File               Last commit                                                              Age
__init__.py        [FAW] export FAW in _ops (#1438)                                         2 years ago
_utils.py          [FAW] parallel FreqAwareEmbedding (#1424)                                2 years ago
addmm.py           [colotensor] add Tensor.view op and its unit test (#1343)                2 years ago
element_wise.py    [tensor] add zero_like colo op, important for Optimizer (#1236)          2 years ago
embedding.py       [colotensor] add Tensor.view op and its unit test (#1343)                2 years ago
embedding_bag.py   [colotensor] add Tensor.view op and its unit test (#1343)                2 years ago
layernorm.py       [colotensor] add Tensor.view op and its unit test (#1343)                2 years ago
linear.py          [tensor] added linear implementation for the new sharding spec (#1416)   2 years ago
loss.py            [tensor] fix a assertion in colo_tensor cross_entropy (#1232)            2 years ago
view.py            [colotensor] add Tensor.view op and its unit test (#1343)                2 years ago