ColossalAI/colossalai/tensor

Latest commit: 27061426f7 by Hongxin Liu, "[gemini] improve compatibility and add static placement policy (#4479)", 1 year ago
| File | Last commit | Age |
|------|-------------|-----|
| d_tensor | [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) | 1 year ago |
| __init__.py | [Gemini] ParamOpHook -> ColoParamOpHook (#2080) | 2 years ago |
| colo_parameter.py | [gemini] improve compatibility and add static placement policy (#4479) | 1 year ago |
| colo_tensor.py | [gemini] improve compatibility and add static placement policy (#4479) | 1 year ago |
| comm_spec.py | [test] fixed tests failed due to dtensor change (#4082) | 1 year ago |
| compute_spec.py | [doc] Fix typo under colossalai and doc (#3618) | 2 years ago |
| const.py | | |
| dist_spec_mgr.py | [tensor] Refactor handle_trans_spec in DistSpecManager | 2 years ago |
| distspec.py | [doc] Fix typo under colossalai and doc (#3618) | 2 years ago |
| op_wrapper.py | [doc] update rst and docstring (#1351) | 2 years ago |
| param_op_hook.py | [gemini] improve compatibility and add static placement policy (#4479) | 1 year ago |
| process_group.py | [nfc] fix typo colossalai/pipeline tensor nn (#3899) | 2 years ago |
| shape_consistency.py | [test] fixed tests failed due to dtensor change (#4082) | 1 year ago |
| sharding_spec.py | [test] fixed tests failed due to dtensor change (#4082) | 1 year ago |
| tensor_spec.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago |
| utils.py | [nfc] fix typo colossalai/pipeline tensor nn (#3899) | 2 years ago |