ColossalAI/colossalai/tensor
Hongxin Liu d921ce8391 [shardformer] support inplace sharding (#4251) 2023-08-15 23:25:14 +08:00
* [shardformer] embedding support inplace sharding
* [shardformer] linear support inplace sharding
* [shardformer] layernorm support inplace sharding
* [shardformer] qkv support inplace sharding
* [test] update shardformer layer test
* [shardformer] fix shared param sharding
* [shardformer] fix bert policy
* [shardformer] fix bloom policy
* [shardformer] fix llama policy
* [shardformer] fix opt policy
* [shardformer] fix t5 policy
* [shardformer] fix fused qkv linear
* [shardformer] fix bugs
* force sync
* [test] fix bugs
* [test] fix transformer version
d_tensor              [shardformer] support inplace sharding (#4251)            2023-08-15 23:25:14 +08:00
__init__.py
colo_parameter.py
colo_tensor.py
comm_spec.py          [test] fixed tests failed due to dtensor change (#4082)   2023-07-04 16:05:01 +08:00
compute_spec.py
const.py
dist_spec_mgr.py
distspec.py
op_wrapper.py
param_op_hook.py      [nfc]fix typo colossalai/pipeline tensor nn (#3899)       2023-06-06 14:07:36 +08:00
process_group.py      [nfc]fix typo colossalai/pipeline tensor nn (#3899)       2023-06-06 14:07:36 +08:00
shape_consistency.py  [test] fixed tests failed due to dtensor change (#4082)   2023-07-04 16:05:01 +08:00
sharding_spec.py      [test] fixed tests failed due to dtensor change (#4082)   2023-07-04 16:05:01 +08:00
tensor_spec.py
utils.py              [nfc]fix typo colossalai/pipeline tensor nn (#3899)       2023-06-06 14:07:36 +08:00