ColossalAI/colossalai/shardformer/layer
Latest commit: 351a1c269b by pre-commit-ci[bot] (2024-06-20 06:50:40 +00:00)
    [pre-commit.ci] auto fixes from pre-commit.com hooks
    for more information, see https://pre-commit.ci
File                   Last commit                                                                       Date
__init__.py            [shardformer] update colo attention to support custom mask (#5510)               2024-03-27 11:19:32 +08:00
_operation.py          [hotfix] fix typo change enabel to enable under colossalai/shardformer/ (#5317)  2024-03-05 21:48:46 +08:00
attn.py                [shardformer] update colo attention to support custom mask (#5510)               2024-03-27 11:19:32 +08:00
dropout.py             [misc] update pre-commit and run all files (#4752)                               2023-09-19 14:20:26 +08:00
embedding.py           [gemini] gemini support tensor parallelism. (#4942)                              2023-11-10 10:15:16 +08:00
linear.py              support linear accumulation fusion (#5199)                                       2023-12-29 18:22:42 +08:00
loss.py                [pre-commit.ci] auto fixes from pre-commit.com hooks                             2024-06-20 06:50:40 +00:00
normalization.py       fix: modify model config and add Qwen2RMSNorm                                    2024-06-14 16:27:46 +08:00
parallel_module.py     [misc] update pre-commit and run all files (#4752)                               2023-09-19 14:20:26 +08:00
qkv_fused_linear.py    [misc] update pre-commit and run all files (#4752)                               2023-09-19 14:20:26 +08:00
utils.py               [npu] change device to accelerator api (#5239)                                   2024-01-09 10:20:05 +08:00
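
Together these modules provide the tensor-parallel building blocks (sharded linear, embedding, loss, normalization, dropout, and fused attention/QKV layers) that shardformer policies swap into a model in place of the native torch layers. Below is a minimal sketch of that substitution pattern; the exported names Linear1D_Col and Linear1D_Row and the from_native_module entry point come from this directory's modules, but the exact keyword arguments and the single-group setup are assumptions, not a definitive recipe.

# Minimal sketch: swapping native torch.nn.Linear layers for the
# tensor-parallel replacements defined in this directory. Launch with
# torchrun (e.g. torchrun --nproc_per_node=2 demo.py) so a default
# process group exists.
import torch
import torch.distributed as dist
import torch.nn as nn

from colossalai.shardformer.layer import Linear1D_Col, Linear1D_Row

dist.init_process_group(backend="gloo")  # gloo keeps the sketch CPU-only
pg = dist.group.WORLD  # assumption: the whole world is one tensor-parallel group


class MLP(nn.Module):
    """Toy stand-in for a transformer MLP block."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.up = nn.Linear(hidden, 4 * hidden)
        self.down = nn.Linear(4 * hidden, hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(self.up(x))


mlp = MLP()

# Megatron-style pairing: the column shard splits the output dimension and
# keeps its partial activations local (gather_output=False); the row shard
# splits the input dimension and all-reduces, restoring the full hidden size.
mlp.up = Linear1D_Col.from_native_module(mlp.up, process_group=pg, gather_output=False)
mlp.down = Linear1D_Row.from_native_module(mlp.down, process_group=pg)

x = torch.randn(2, 8, 64)
print(mlp(x).shape)  # torch.Size([2, 8, 64]) on every rank

The column-then-row pairing matters: because the column shard's partial outputs are exactly the inputs the row shard expects, only one all-reduce per block is needed instead of a gather after each layer.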