ColossalAI/colossalai/tensor

Latest commit: YuliangLiu0306 · 81330b0352 · [autoparallel] add experimental permute handler (#2029) · 2 years ago
| File | Last commit | Age |
| --- | --- | --- |
| __init__.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| colo_parameter.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| colo_tensor.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| comm_spec.py | [autoparallel] add experimental permute handler (#2029) | 2 years ago |
| compute_spec.py | [NFC] polish doc style for ColoTensor (#1457) | 2 years ago |
| const.py | [Tensor] init ColoParameter (#914) | 3 years ago |
| dist_spec_mgr.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| distspec.py | [Doc] add more doc for ColoTensor. (#1458) | 2 years ago |
| op_wrapper.py | [doc] update rst and docstring (#1351) | 2 years ago |
| param_op_hook.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| process_group.py | [doc] update docstring in ProcessGroup (#1468) | 2 years ago |
| shape_consistency.py | [autoparallel] mix gather (#1977) | 2 years ago |
| sharding_spec.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| tensor_spec.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| utils.py | [autoparallel] mix gather (#1977) | 2 years ago |