ColossalAI/colossalai/tensor
Latest commit: [autoparallel] accelerate gpt2 training (#2495) by YuliangLiu0306 (aa0f6686f9), 2023-01-29 11:13:15 +08:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | [Gemini] ParamOpHook -> ColoParamOpHook (#2080) | 2022-12-05 17:11:06 +08:00 |
| colo_parameter.py | [zero] fix error for BEiT models (#2169) | 2022-12-26 15:03:54 +08:00 |
| colo_tensor.py | [gemini] update ddp strict mode (#2518) | 2023-01-28 14:35:25 +08:00 |
| comm_spec.py | [autoparallel] accelerate gpt2 training (#2495) | 2023-01-29 11:13:15 +08:00 |
| compute_spec.py | [NFC] polish doc style for ColoTensor (#1457) | 2022-08-16 09:21:05 +08:00 |
| const.py | [Tensor] init ColoParameter (#914) | 2022-05-06 12:57:14 +08:00 |
| dist_spec_mgr.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2022-11-08 17:03:50 +08:00 |
| distspec.py | [Doc] add more doc for ColoTensor. (#1458) | 2022-08-16 10:38:41 +08:00 |
| op_wrapper.py | [doc] update rst and docstring (#1351) | 2022-07-21 15:54:53 +08:00 |
| param_op_hook.py | [hotfix] fix implement error in diffusers | 2023-01-06 18:37:18 +08:00 |
| process_group.py | [doc] update docstring in ProcessGroup (#1468) | 2022-08-19 13:41:57 +08:00 |
| shape_consistency.py | [autoparallel] fix runtime apply memory estimation (#2281) | 2023-01-03 17:18:07 +08:00 |
| sharding_spec.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2022-11-08 17:03:50 +08:00 |
| tensor_spec.py | [autoparallel] fix bugs caused by negative dim key (#1808) | 2022-11-08 17:03:50 +08:00 |
| utils.py | [autoparallel] mix gather (#1977) | 2022-11-23 21:49:17 +08:00 |