ColossalAI/colossalai/shardformer/layer
Latest commit 641b1ee71a by Hongxin Liu, 2024-04-08 15:09:40 +08:00
[devops] remove post commit ci (#5566)

* [devops] remove post commit ci
* [misc] run pre-commit on all files
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
File                  Last commit message                                                               Last commit date
__init__.py           [devops] remove post commit ci (#5566)                                            2024-04-08 15:09:40 +08:00
_operation.py         [shardformer] Sequence Parallelism Optimization (#5533)                           2024-04-03 17:15:47 +08:00
attn.py               [shardformer] update colo attention to support custom mask (#5510)                2024-03-27 11:19:32 +08:00
dropout.py            [misc] update pre-commit and run all files (#4752)                                2023-09-19 14:20:26 +08:00
embedding.py          [gemini] gemini support tensor parallelism. (#4942)                               2023-11-10 10:15:16 +08:00
linear.py             [shardformer] Sequence Parallelism Optimization (#5533)                           2024-04-03 17:15:47 +08:00
loss.py               [shardformer] llama support DistCrossEntropy (#5176)                              2023-12-13 01:39:14 +08:00
normalization.py      [hotfix] fix typo change enabel to enable under colossalai/shardformer/ (#5317)   2024-03-05 21:48:46 +08:00
parallel_module.py    [misc] update pre-commit and run all files (#4752)                                 2023-09-19 14:20:26 +08:00
qkv_fused_linear.py   [shardformer] Sequence Parallelism Optimization (#5533)                           2024-04-03 17:15:47 +08:00
utils.py              [shardformer] Sequence Parallelism Optimization (#5533)                           2024-04-03 17:15:47 +08:00
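
The files above are the parallel building blocks that shardformer swaps into a model: sharded and fused linear layers, parallel embedding and dropout, fused normalization, and a distributed cross-entropy loss. As a rough illustration of the column-parallel split behind linear.py and qkv_fused_linear.py, the sketch below mimics the technique in plain PyTorch on a single process. The class name, shard layout, and shapes are illustrative assumptions, not the ColossalAI API, which shards weights across real process groups and assembles outputs with collective communication.

    # Conceptual sketch only (plain PyTorch, single process): illustrates the
    # column-parallel split that linear.py and qkv_fused_linear.py implement with
    # real process groups and collective communication. Names and layout here are
    # illustrative assumptions, not the ColossalAI API.
    import torch
    import torch.nn as nn


    class NaiveColumnParallelLinear(nn.Module):
        """Split the output dimension of a Linear layer across `world_size` shards."""

        def __init__(self, in_features: int, out_features: int, world_size: int):
            super().__init__()
            assert out_features % world_size == 0, "out_features must be divisible by world_size"
            shard_size = out_features // world_size
            # Keep one shard per simulated rank; a real implementation holds only
            # its own shard and relies on an all-gather to assemble the output.
            self.shards = nn.ModuleList(
                [nn.Linear(in_features, shard_size, bias=False) for _ in range(world_size)]
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Each simulated rank computes a partial output; concatenating along the
            # last dimension stands in for the all-gather used in tensor parallelism.
            return torch.cat([shard(x) for shard in self.shards], dim=-1)


    if __name__ == "__main__":
        layer = NaiveColumnParallelLinear(in_features=16, out_features=32, world_size=4)
        out = layer(torch.randn(2, 16))
        print(out.shape)  # torch.Size([2, 32])

Splitting along the output dimension lets the subsequent row-parallel layer consume the sharded activations without an intermediate gather, which is the usual pairing (column-parallel then row-parallel) in tensor-parallel transformer blocks.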