ColossalAI/colossalai/nn/_ops

Latest commit abba4d84e1 by HELSON, "[hotfix] fix bert model test in unitests (#1272)", 2 years ago.
File               Last commit                                                            Age
__init__.py        [tensor] add embedding bag op (#1156)                                  2 years ago
_utils.py          [refactor] move process group from _DistSpec to ColoTensor. (#1203)    2 years ago
addmm.py           [tensor] redistribute among different process groups (#1247)           2 years ago
element_wise.py    [tensor] add zero_like colo op, important for Optimizer (#1236)        2 years ago
embedding.py       [hotfix] fix bert model test in unitests (#1272)                       2 years ago
embedding_bag.py   [tensor] a shorter shard and replicate spec (#1245)                    2 years ago
layernorm.py       [tensor] a shorter shard and replicate spec (#1245)                    2 years ago
linear.py          [tensor] redistribute among different process groups (#1247)           2 years ago
loss.py            [tensor] fix a assertion in colo_tensor cross_entropy (#1232)          2 years ago
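
As the commit messages suggest, these modules provide ColoTensor-aware versions of common torch.nn.functional ops (linear, addmm, embedding, embedding_bag, layernorm, cross_entropy, and element-wise ops). The sketch below illustrates the general __torch_function__ registry pattern that op directories like this are typically built on: a tensor subclass intercepts torch function calls and reroutes registered ones to custom implementations. Every name in it (_COLO_OP_REGISTRY, register_colo_op, SketchColoTensor, colo_linear) is a hypothetical illustration, not ColossalAI's actual API.

# Minimal sketch of a __torch_function__ op registry. All names here are
# hypothetical illustrations, not ColossalAI's real API.
import torch
import torch.nn.functional as F

_COLO_OP_REGISTRY = {}  # maps a torch function to its custom implementation


def register_colo_op(torch_func):
    """Decorator that registers a custom implementation for torch_func."""
    def decorator(impl):
        _COLO_OP_REGISTRY[torch_func] = impl
        return impl
    return decorator


class SketchColoTensor(torch.Tensor):
    """Tensor subclass that reroutes registered torch functions."""

    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        kwargs = kwargs or {}
        if func in _COLO_OP_REGISTRY:
            # Dispatch to the custom (e.g. sharding-aware) implementation.
            return _COLO_OP_REGISTRY[func](*args, **kwargs)
        # Anything unregistered falls back to stock torch behavior.
        return super().__torch_function__(func, types, args, kwargs)


@register_colo_op(F.linear)
def colo_linear(input, weight, bias=None):
    # A real distributed op would consult the tensors' placement specs and
    # issue collective communication here; this sketch computes the same
    # result using ops that are not in the registry, avoiding re-dispatch.
    out = input @ weight.t()
    if bias is not None:
        out = out + bias
    return out


x = torch.randn(4, 8).as_subclass(SketchColoTensor)
w = torch.randn(16, 8)
out = F.linear(x, w)  # routed to colo_linear via __torch_function__
print(type(out).__name__, tuple(out.shape))  # SketchColoTensor (4, 16)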