ColossalAI/colossalai/shardformer/layer
Latest commit: bbb2c21f16 by Hongxin Liu, 2024-04-25 14:41:17 +08:00
[shardformer] fix chatglm implementation (#5644)
* [shardformer] fix chatglm policy
* [shardformer] fix chatglm flash attn
* [shardformer] update readme
* [shardformer] fix chatglm init
* [shardformer] fix chatglm test
* [pipeline] fix chatglm merge batch
File                  Last commit                                               Date
__init__.py           [shardformer] refactor embedding resize (#5603)           2024-04-18 16:10:18 +08:00
_operation.py         [shardformer] Sequence Parallelism Optimization (#5533)   2024-04-03 17:15:47 +08:00
attn.py               [coloattention]modify coloattention (#5627)               2024-04-25 10:47:14 +08:00
dropout.py            [misc] update pre-commit and run all files (#4752)        2023-09-19 14:20:26 +08:00
embedding.py          [shardformer] refactor embedding resize (#5603)           2024-04-18 16:10:18 +08:00
linear.py             [shardformer] refactor embedding resize (#5603)           2024-04-18 16:10:18 +08:00
loss.py               [shardformer] refactor embedding resize (#5603)           2024-04-18 16:10:18 +08:00
normalization.py      [shardformer] fix chatglm implementation (#5644)          2024-04-25 14:41:17 +08:00
parallel_module.py    [shardformer] refactor embedding resize (#5603)           2024-04-18 16:10:18 +08:00
qkv_fused_linear.py   [shardformer] Sequence Parallelism Optimization (#5533)   2024-04-03 17:15:47 +08:00
utils.py              [shardformer] Sequence Parallelism Optimization (#5533)   2024-04-03 17:15:47 +08:00
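The files above implement ColossalAI's tensor-parallel building blocks (parallel linear, embedding, fused QKV projections). As a hedged illustration of the column-parallel linear idea behind files like linear.py and qkv_fused_linear.py, here is a minimal NumPy sketch; every name in it is illustrative and not ColossalAI's actual API:

```python
import numpy as np

# Sketch of column-parallel linear sharding (illustrative, not the real
# shardformer implementation): the weight matrix is split column-wise
# across tensor-parallel ranks, each rank computes a slice of the output,
# and the slices are gathered along the feature dimension.

rng = np.random.default_rng(0)
batch, in_dim, out_dim, world_size = 4, 8, 6, 2

x = rng.standard_normal((batch, in_dim))
w = rng.standard_normal((in_dim, out_dim))

# Shard the weight column-wise: each "rank" owns out_dim // world_size columns.
shards = np.split(w, world_size, axis=1)

# Each rank multiplies the full input by its own weight shard; the forward
# matmul itself needs no communication between ranks.
partial_outputs = [x @ w_shard for w_shard in shards]

# An all-gather along the feature dimension reassembles the full output.
y_parallel = np.concatenate(partial_outputs, axis=1)

# The sharded computation matches the unsharded linear layer exactly.
y_reference = x @ w
assert np.allclose(y_parallel, y_reference)
```

In a real deployment the gather is a collective (all-gather) across devices, which is where optimizations such as the sequence-parallelism work referenced in the commit messages come in.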