ColossalAI/colossalai/tensor

Latest commit: 702dbc5288 by YuliangLiu0306, 2022-09-23 13:31:15 +08:00
[tensor] use communication autograd func (#1617)

* [tensor] use communication autograd func
* change all-to-all comm spec info
* rename pattern and distinguish fwd/bwd
* polish code
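The commit message above describes wrapping collective communication in autograd functions, so that the forward-pass collective is explicitly paired with the matching collective in the backward pass. Below is a minimal sketch of that idea, not the ColossalAI implementation: the collective is mocked as an identity clone (no process group is initialized here); in a real distributed setting the commented `dist.all_reduce` calls would run instead.

```python
import torch


class _AllReduceSum(torch.autograd.Function):
    """Sketch of a 'communication autograd function': the forward pass
    performs a collective, and the backward pass performs the matching
    collective on the gradient, so fwd/bwd communication patterns are
    distinguished explicitly instead of being implicit side effects."""

    @staticmethod
    def forward(ctx, tensor):
        # Real setting: torch.distributed.all_reduce(tensor, op=ReduceOp.SUM)
        # Mocked as identity here since no process group is initialized.
        return tensor.clone()

    @staticmethod
    def backward(ctx, grad_output):
        # The gradient of a sum all-reduce is itself an all-reduce,
        # so the backward comm mirrors the forward comm.
        # Real setting: torch.distributed.all_reduce(grad_output)
        return grad_output.clone()


x = torch.ones(3, requires_grad=True)
y = _AllReduceSum.apply(x)
y.sum().backward()
print(x.grad.tolist())  # gradients flow through the wrapped comm op
```

Because the communication lives inside an `autograd.Function`, shape-consistency code can reason about forward and backward communication costs separately, which is what the "distinguish fwd/bwd" bullet in the commit refers to.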
File                   Last commit                                                                             Date
__init__.py            [Doc] add more doc for ColoTensor. (#1458)                                              2022-08-16 10:38:41 +08:00
colo_parameter.py      [Optimizer] Remove useless ColoOptimizer (#1312)                                        2022-07-14 16:57:48 +08:00
colo_tensor.py         [NFC] polish doc style for ColoTensor (#1457)                                           2022-08-16 09:21:05 +08:00
compute_spec.py        [NFC] polish doc style for ColoTensor (#1457)                                           2022-08-16 09:21:05 +08:00
const.py               [Tensor] init ColoParameter (#914)                                                      2022-05-06 12:57:14 +08:00
dist_spec_mgr.py       [hotfix] Dist Mgr gather torch version (#1284)                                          2022-07-13 00:18:56 +08:00
distspec.py            [Doc] add more doc for ColoTensor. (#1458)                                              2022-08-16 10:38:41 +08:00
op_wrapper.py          [doc] update rst and docstring (#1351)                                                  2022-07-21 15:54:53 +08:00
param_op_hook.py       [doc] update rst and docstring (#1351)                                                  2022-07-21 15:54:53 +08:00
process_group.py       [doc] update docstring in ProcessGroup (#1468)                                          2022-08-19 13:41:57 +08:00
shape_consistency.py   [tensor] use communication autograd func (#1617)                                        2022-09-23 13:31:15 +08:00
sharding_spec.py       [autoparallel] Add conv handler to generate strategies and costs info for conv (#1467)  2022-08-19 14:57:23 +08:00
tensor_spec.py         [NFC] polish doc style for ColoTensor (#1457)                                           2022-08-16 09:21:05 +08:00
utils.py               [tensor] shape consistency generate transform path and communication cost (#1435)      2022-08-12 14:02:32 +08:00