ColossalAI/colossalai/shardformer/layer
flybird11111 59e252ecdb
[shardformer] chatglm support sequence parallel (#4482)
* [shardformer] chatglm support sequence parallel

* fix
2023-08-22 23:59:31 +08:00
__init__.py [shardformer] fix import 2023-08-15 23:25:14 +08:00
_operation.py [shardformer] bert support sequence parallel. (#4455) 2023-08-18 18:04:55 +08:00
dropout.py [shardformer] supported bloom model (#4098) 2023-07-04 16:05:01 +08:00
embedding.py [shardformer] fix embedding 2023-08-15 23:25:14 +08:00
linear.py [shardformer] chatglm support sequence parallel (#4482) 2023-08-22 23:59:31 +08:00
loss.py fix some typo colossalai/shardformer (#4160) 2023-07-04 17:53:39 +08:00
normalization.py [shardformer] support inplace sharding (#4251) 2023-08-15 23:25:14 +08:00
parallel_module.py [shardformer] supported fused qkv checkpoint (#4073) 2023-07-04 16:05:01 +08:00
qkv_fused_linear.py [shardformer/sequence parallel] Cherry pick commit to new branch (#4450) 2023-08-16 15:41:20 +08:00
utils.py [misc] resolve code factor issues (#4433) 2023-08-15 23:25:14 +08:00