ColossalAI/colossalai/tensor
Latest commit: 8f72b6f8fb by Jiarui Fang — "[hotfix] fix implement error in diffusers" (2023-01-07 07:56:39 +08:00)
| File | Latest commit | Date |
| --- | --- | --- |
| __init__.py | [Gemini] ParamOpHook -> ColoParamOpHook (#2080) | 2022-12-05 17:11:06 +08:00 |
| colo_parameter.py | [zero] fix error for BEiT models (#2169) | 2022-12-26 15:03:54 +08:00 |
| colo_tensor.py | [example] gpt, shard init on all processes (#2366) | 2023-01-06 15:44:50 +08:00 |
| comm_spec.py | [autoparallel] add experimental permute handler (#2029) | 2022-11-27 20:26:52 +08:00 |
| compute_spec.py | [NFC] polish doc style for ColoTensor (#1457) | 2022-08-16 09:21:05 +08:00 |
| const.py | [Tensor] init ColoParameter (#914) | 2022-05-06 12:57:14 +08:00 |
| dist_spec_mgr.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2022-11-08 17:03:50 +08:00 |
| distspec.py | [Doc] add more doc for ColoTensor. (#1458) | 2022-08-16 10:38:41 +08:00 |
| op_wrapper.py | [doc] update rst and docstring (#1351) | 2022-07-21 15:54:53 +08:00 |
| param_op_hook.py | [hotfix] fix implement error in diffusers | 2023-01-06 18:37:18 +08:00 |
| process_group.py | [doc] update docstring in ProcessGroup (#1468) | 2022-08-19 13:41:57 +08:00 |
| shape_consistency.py | [autoparallel] fix runtime apply memory estimation (#2281) | 2023-01-03 17:18:07 +08:00 |
| sharding_spec.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2022-11-08 17:03:50 +08:00 |
| tensor_spec.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2022-11-08 17:03:50 +08:00 |
| utils.py | [autoparallel] mix gather (#1977) | 2022-11-23 21:49:17 +08:00 |