ColossalAI/colossalai/tensor
Latest commit: Frank Lee ae1b58cd16 — [tensor] added linear implementation for the new sharding spec (#1416), 2 years ago
__init__.py           [hotfix] fix no optimizer in save/load (#1363), 2 years ago
colo_parameter.py
colo_tensor.py
compute_spec.py
const.py
dist_spec_mgr.py
distspec.py
op_wrapper.py         [doc] update rst and docstring (#1351), 2 years ago
param_op_hook.py      [doc] update rst and docstring (#1351), 2 years ago
process_group.py      [hotfix] adapt ProcessGroup and Optimizer to ColoTensor (#1388), 2 years ago
shape_consistency.py  [tensor] add shape consistency feature to support auto spec transform (#1418), 2 years ago
sharding_spec.py      [tensor] added linear implementation for the new sharding spec (#1416), 2 years ago
tensor_spec.py
utils.py              [doc] update rst and docstring (#1351), 2 years ago