ColossalAI/colossalai/shardformer/layer
Latest commit: d2e05a99b3 by duanjunwen: [feat] support no tensor parallel Linear in shardformer; add tests with and without WeightGradStore (2024-10-30 02:54:32 +00:00)
File | Last commit | Date
__init__.py | [feat] support no tensor parallel Linear in shardformer; add tests with and without WeightGradStore | 2024-10-30 02:54:32 +00:00
_operation.py | [feat] support no tensor parallel Linear in shardformer; add tests with and without WeightGradStore | 2024-10-30 02:54:32 +00:00
attn.py | [zerobubble] rebase main (#6075) | 2024-10-08 15:58:00 +08:00
dropout.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
embedding.py | [zerobubble] rebase main (#6075) | 2024-10-08 15:58:00 +08:00
linear.py | [feat] support no tensor parallel Linear in shardformer; add tests with and without WeightGradStore | 2024-10-30 02:54:32 +00:00
loss.py | [zerobubble] rebase main (#6075) | 2024-10-08 15:58:00 +08:00
normalization.py | [Hotfix] Avoid fused RMSNorm import error without apex (#5985) | 2024-08-09 18:17:09 +08:00
parallel_module.py | [shardformer] refactor embedding resize (#5603) | 2024-04-18 16:10:18 +08:00
qkv_fused_linear.py | [zerobubble] rebase main (#6075) | 2024-10-08 15:58:00 +08:00
utils.py | [zerobubble] rebase main (#6075) | 2024-10-08 15:58:00 +08:00
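The 2024-10-30 commit touching __init__.py, _operation.py, and linear.py adds a Linear path without tensor parallelism and tests it both with and without a WeightGradStore. As background, the sketch below illustrates the general weight-gradient-store pattern used by zero-bubble pipeline schedules: the backward pass of a linear layer computes the input gradient immediately (it sits on the critical path of backpropagation) but queues the weight gradient to be computed later, when the pipeline would otherwise sit idle. This is a minimal conceptual sketch, not ColossalAI's actual implementation; the names `WeightGradStore.put`/`pop` and `DeferredLinearFn` are hypothetical.

```python
import queue

import torch


class WeightGradStore:
    """Hypothetical helper: a queue of deferred weight-gradient computations."""

    grad_queue = queue.Queue()

    @classmethod
    def put(cls, inp, grad_output, weight):
        # Stash everything needed to compute dW = grad_output^T @ inp later.
        cls.grad_queue.put((inp, grad_output, weight))

    @classmethod
    def pop(cls):
        # Drain the queue, accumulating each deferred dW into weight.grad.
        with torch.no_grad():
            while not cls.grad_queue.empty():
                inp, grad_output, weight = cls.grad_queue.get()
                grad_w = grad_output.t() @ inp  # assumes 2D (batch, feature) tensors
                weight.grad = grad_w if weight.grad is None else weight.grad + grad_w


class DeferredLinearFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, inp, weight):
        ctx.save_for_backward(inp, weight)
        return inp @ weight.t()

    @staticmethod
    def backward(ctx, grad_output):
        inp, weight = ctx.saved_tensors
        # dX sits on the critical path of backprop, so compute it immediately ...
        grad_input = grad_output @ weight
        # ... while dW can wait: queue it for when the pipeline would idle.
        WeightGradStore.put(inp, grad_output, weight)
        return grad_input, None  # weight grad is filled in later by pop()


if __name__ == "__main__":
    x = torch.randn(4, 8, requires_grad=True)
    w = torch.randn(16, 8, requires_grad=True)
    DeferredLinearFn.apply(x, w).sum().backward()
    assert w.grad is None  # dW was deferred, not computed during backward()
    WeightGradStore.pop()  # flush deferred weight gradients
    assert w.grad is not None
```

Testing both paths, as the commit message describes, then amounts to checking that the deferred schedule produces the same `weight.grad` as an ordinary backward pass that computes dW eagerly.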