ColossalAI/colossalai/tensor

Latest commit: 0f3042363c by YuliangLiu0306 — [tensor] shape consistency generate transform path and communication cost (#1435), 2 years ago
File                  Last commit                                                                              Age
__init__.py           [hotfix] fix no optimizer in save/load (#1363)                                           2 years ago
colo_parameter.py     [Optimizer] Remove useless ColoOptimizer (#1312)                                         2 years ago
colo_tensor.py        [colotensor] add Tensor.view op and its unit test (#1343)                                2 years ago
compute_spec.py       [colotensor] add Tensor.view op and its unit test (#1343)                                2 years ago
const.py              [Tensor] init ColoParameter (#914)                                                       3 years ago
dist_spec_mgr.py      [hotfix] Dist Mgr gather torch version (#1284)                                           2 years ago
distspec.py           [tensor] distributed checkpointing for parameters (#1240)                                2 years ago
op_wrapper.py         [doc] update rst and docstring (#1351)                                                   2 years ago
param_op_hook.py      [doc] update rst and docstring (#1351)                                                   2 years ago
process_group.py      [hotfix] adapt ProcessGroup and Optimizer to ColoTensor (#1388)                          2 years ago
shape_consistency.py  [tensor] shape consistency generate transform path and communication cost (#1435)       2 years ago
sharding_spec.py      [tensor] shape consistency generate transform path and communication cost (#1435)       2 years ago
tensor_spec.py        [refactor] move process group from _DistSpec to ColoTensor. (#1203)                      2 years ago
utils.py              [tensor] shape consistency generate transform path and communication cost (#1435)       2 years ago