ColossalAI/colossalai/tensor

Latest commit: 0ceec8f9a9 by Baizhou Zhang
[pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354)
* add naive optimizer for 3DPlugin/refactor gpt2 shardformer test

* merge tests of PP/DP/TP combinations into one test file

* fix bug when sync grad for dp in HybridPlugin

* update supported precisions for 3DPlugin/fix bug when shifting tp_degree

* improve the passing of lazy_init

* modify lazy_init/use sync_shared_params
Committed: 2023-08-15 23:25:14 +08:00
Name                    Last commit                                                                                                Last updated
d_tensor/               [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354)   2023-08-15 23:25:14 +08:00
__init__.py
colo_parameter.py
colo_tensor.py
comm_spec.py            [test] fixed tests failed due to dtensor change (#4082)                                                   2023-07-04 16:05:01 +08:00
compute_spec.py
const.py
dist_spec_mgr.py
distspec.py
op_wrapper.py
param_op_hook.py        [nfc]fix typo colossalai/pipeline tensor nn (#3899)                                                       2023-06-06 14:07:36 +08:00
process_group.py        [nfc]fix typo colossalai/pipeline tensor nn (#3899)                                                       2023-06-06 14:07:36 +08:00
shape_consistency.py    [test] fixed tests failed due to dtensor change (#4082)                                                   2023-07-04 16:05:01 +08:00
sharding_spec.py        [test] fixed tests failed due to dtensor change (#4082)                                                   2023-07-04 16:05:01 +08:00
tensor_spec.py
utils.py                [nfc]fix typo colossalai/pipeline tensor nn (#3899)                                                       2023-06-06 14:07:36 +08:00