ColossalAI/colossalai/tensor

Latest commit: 32291dd73f by Ziyue Jiang, "[Tensor] add module handler for linear (#1021)", 3 years ago
Name               Last commit                                                            Age
_ops               [tensor] refactor parallel action (#1007)                              3 years ago
graph              [Graph] building computing graph with ColoTensor, Linear only (#917)   3 years ago
modules            [Tensor] add module handler for linear (#1021)                         3 years ago
optim              [Tensor] initialize the ColoOptimizer (#898)                           3 years ago
__init__.py        [Tensor] add module handler for linear (#1021)                         3 years ago
colo_parameter.py  [tensor] refactor parallel action (#1007)                              3 years ago
colo_tensor.py     [tensor] refactor parallel action (#1007)                              3 years ago
const.py           [Tensor] init ColoParameter (#914)                                     3 years ago
dist_spec_mgr.py   [tensor] refactor colo-tensor (#992)                                   3 years ago
distspec.py        [tensor] refactor colo-tensor (#992)                                   3 years ago
module_utils.py    [Tensor] add module handler for linear (#1021)                         3 years ago
op_wrapper.py      [tensor] refactor colo-tensor (#992)                                   3 years ago
spec.py            [tensor] refactor parallel action (#1007)                              3 years ago
utils.py