ColossalAI/colossalai/tensor

Latest commit: YuliangLiu0306 258b43317c — [hotfix] layout converting issue (#3188), 2 years ago
File                   Last commit                                                    Last updated
d_tensor               [hotfix] layout converting issue (#3188)                       2 years ago
__init__.py            [Gemini] ParamOpHook -> ColoParamOpHook (#2080)                2 years ago
colo_parameter.py      [polish] polish ColoTensor and its submodules (#2537)          2 years ago
colo_tensor.py         [hotfix]: Remove math.prod dependency (#2837)                  2 years ago
comm_spec.py           [hotfix] add shard dim to avoid backward communication error (#2954)  2 years ago
compute_spec.py        [polish] polish ColoTensor and its submodules (#2537)          2 years ago
const.py
dist_spec_mgr.py       [autoparallel] fix bugs caused by negative dim key (#1808)     2 years ago
distspec.py            [polish] polish ColoTensor and its submodules (#2537)          2 years ago
op_wrapper.py
param_op_hook.py       [hotfix] fix implement error in diffusers                      2 years ago
process_group.py       [polish] polish ColoTensor and its submodules (#2537)          2 years ago
shape_consistency.py   [autoparallel] fix runtime apply memory estimation (#2281)     2 years ago
sharding_spec.py       [autoparallel] fix bugs caused by negative dim key (#1808)     2 years ago
tensor_spec.py         [autoparallel] fix bugs caused by negative dim key (#1808)     2 years ago
utils.py               [autoparallel] mix gather (#1977)                              2 years ago