ColossalAI/colossalai/shardformer/layer
Hongxin Liu d921ce8391 [shardformer] support inplace sharding (#4251) 2023-08-15 23:25:14 +08:00
* [shardformer] embedding support inplace sharding
* [shardformer] linear support inplace sharding
* [shardformer] layernorm support inplace sharding
* [shardformer] qkv support inplace sharding
* [test] update shardformer layer test
* [shardformer] fix shared param sharding
* [shardformer] fix bert policy
* [shardformer] fix bloom policy
* [shardformer] fix llama policy
* [shardformer] fix opt policy
* [shardformer] fix t5 policy
* [shardformer] fix fused qkv linear
* [shardformer] fix bugs
* force sync
* [test] fix bugs
* [test] fix transformer version
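The commits above make each layer shard its parameters in place: presumably, rather than building a replacement parallel module, the existing module's weight tensors are swapped for this rank's shard, which keeps outside references to the module valid (notably shared/tied parameters, per the "fix shared param sharding" commit). Below is a minimal sketch of the idea in plain PyTorch; `shard_rows_`, `TP_RANK`, and `TP_WORLD_SIZE` are hypothetical stand-ins, not ColossalAI's API.

```python
import torch
import torch.nn as nn

TP_RANK, TP_WORLD_SIZE = 0, 2  # stand-ins for the tensor-parallel rank / group size

def shard_rows_(linear: nn.Linear) -> nn.Linear:
    """Shard the output dimension of `linear` across ranks without
    allocating a new module (the trailing underscore marks it inplace)."""
    out_features = linear.out_features
    assert out_features % TP_WORLD_SIZE == 0
    chunk = out_features // TP_WORLD_SIZE
    start = TP_RANK * chunk
    # Swap the parameter data for this rank's slice; the module object, and
    # any outside references to it (e.g. tied weights), remain valid.
    linear.weight.data = linear.weight.data[start:start + chunk].clone()
    if linear.bias is not None:
        linear.bias.data = linear.bias.data[start:start + chunk].clone()
    linear.out_features = chunk
    return linear

layer = nn.Linear(8, 8)
shard_rows_(layer)
print(layer.weight.shape)  # torch.Size([4, 8]) with TP_WORLD_SIZE == 2
```

Contrast with out-of-place conversion, where a fresh tensor-parallel module replaces the original: any reference still pointing at the old module's weight would silently desynchronize from the sharded copy.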
__init__.py           [shardformer] support inplace sharding (#4251)  2023-08-15 23:25:14 +08:00
_operation.py         [format] applied code formatting on changed files in pull request 4152 (#4157)  2023-07-04 16:07:47 +08:00
dropout.py            [shardformer] supported bloom model (#4098)  2023-07-04 16:05:01 +08:00
embedding.py          [shardformer] support inplace sharding (#4251)  2023-08-15 23:25:14 +08:00
linear.py             [shardformer] support inplace sharding (#4251)  2023-08-15 23:25:14 +08:00
loss.py               fix some typo colossalai/shardformer (#4160)  2023-07-04 17:53:39 +08:00
normalization.py      [shardformer] support inplace sharding (#4251)  2023-08-15 23:25:14 +08:00
parallel_module.py    [shardformer] supported fused qkv checkpoint (#4073)  2023-07-04 16:05:01 +08:00
qkv_fused_linear.py   [shardformer] support inplace sharding (#4251)  2023-08-15 23:25:14 +08:00
utils.py              [shardformer] supported bloom model (#4098)  2023-07-04 16:05:01 +08:00
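qkv_fused_linear.py covers attention projections whose Q, K, and V weights are fused into a single matrix. A contiguous split of such a weight would hand one rank only Q rows and another only V rows, so each of the three segments must be split separately and the rank-local pieces re-fused. Here is a hedged sketch of that splitting rule, assuming the fused dimension is dim 0 with layout [q; k; v]; `split_fused_qkv` is a hypothetical helper, not the module's actual function, and real implementations may fuse along the other axis or interleave by attention head.

```python
import torch

def split_fused_qkv(weight: torch.Tensor, rank: int, world_size: int) -> torch.Tensor:
    """Take this rank's slice of a fused [q; k; v] weight (fused along dim 0)."""
    q, k, v = weight.chunk(3, dim=0)  # recover the three projections
    # Split each projection across ranks, then re-fuse the local pieces.
    local = [proj.chunk(world_size, dim=0)[rank] for proj in (q, k, v)]
    return torch.cat(local, dim=0)

fused = torch.arange(12.0).reshape(6, 2)  # toy fused weight: 3 * 2 rows, hidden size 2
print(split_fused_qkv(fused, rank=0, world_size=2))
# -> rows 0, 2, 4 of the fused weight: one row from each of q, k, v,
#    rather than the contiguous rows 0-2 a naive split would take
```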