ColossalAI/colossalai/tensor
Latest commit: a3b66f6def by ver217, "[tensor] refactor parallel action (#1007)", 3 years ago
Name               Last commit                                                        Last updated
_ops               [tensor] refactor parallel action (#1007)                          3 years ago
graph              [Graph] building computing graph with ColoTensor, Linear only (#917)  3 years ago
optim              [Tensor] initialize the ColoOptimizer (#898)                       3 years ago
__init__.py        [tensor] refactor colo-tensor (#992)                               3 years ago
colo_parameter.py  [tensor] refactor parallel action (#1007)                          3 years ago
colo_tensor.py     [tensor] refactor parallel action (#1007)                          3 years ago
const.py           [Tensor] init ColoParameter (#914)                                 3 years ago
dist_spec_mgr.py   [tensor] refactor colo-tensor (#992)                               3 years ago
distspec.py        [tensor] refactor colo-tensor (#992)                               3 years ago
op_wrapper.py      [tensor] refactor colo-tensor (#992)                               3 years ago
utils.py           [Tensor] get named parameters for model using ColoTensors (#874)   3 years ago