ColossalAI/colossalai/tensor/_ops (last commit a3b66f6def by ver217: [tensor] refactor parallel action (#1007), 3 years ago)
__init__.py add DistSpec for loss and test_model (#947) 3 years ago
_utils.py [tensor] refactor colo-tensor (#992) 3 years ago
addmm.py [tensor] refactor parallel action (#1007) 3 years ago
element_wise.py [tensor] refactor colo-tensor (#992) 3 years ago
embedding.py [tensor] refactor parallel action (#1007) 3 years ago
layernorm.py [tensor] refactor colo-tensor (#992) 3 years ago
linear.py [tensor] refactor parallel action (#1007) 3 years ago
loss.py [tensor] refactor parallel action (#1007) 3 years ago
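The files above provide distributed (tensor-parallel) versions of common PyTorch ops such as linear, addmm, embedding, layernorm, and loss. As a rough illustration only, and not ColossalAI's actual implementation, a row-parallel linear forward could be sketched as below: each rank holds a shard of the weight along the input-feature dimension, computes a partial matmul, and an all-reduce combines the partial results. The function name and shapes are assumptions for this sketch.

```python
# Illustrative sketch only -- not ColossalAI's code.
# Assumes torch.distributed is already initialized (e.g. via torchrun) and
# that each rank holds shards split along the input-feature dimension.
from typing import Optional

import torch
import torch.distributed as dist


def row_parallel_linear(x_shard: torch.Tensor,
                        weight_shard: torch.Tensor,
                        bias: Optional[torch.Tensor] = None) -> torch.Tensor:
    """Row-parallel linear forward.

    x_shard:      (batch, in_features // world_size) local input shard
    weight_shard: (out_features, in_features // world_size) local weight shard
    bias:         (out_features,) full bias, added once after the reduction
    """
    partial = x_shard @ weight_shard.t()             # local partial result
    dist.all_reduce(partial, op=dist.ReduceOp.SUM)   # sum partials across ranks
    if bias is not None:
        partial = partial + bias
    return partial
```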