ColossalAI/colossalai/shardformer/layer

Latest commit 126cf180bc by アマデウス: [hotfix] fixed memory usage of shardformer module replacement (#5122), 2023-11-28 15:38:26 +08:00
| File | Last commit | Date |
|------|-------------|------|
| __init__.py | [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) | 2023-11-03 13:32:43 +08:00 |
| _operation.py | [hotfix] fixed memory usage of shardformer module replacement (#5122) | 2023-11-28 15:38:26 +08:00 |
| dropout.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| embedding.py | [gemini] gemini support tensor parallelism. (#4942) | 2023-11-10 10:15:16 +08:00 |
| linear.py | [Inference] Fix bug in ChatGLM2 Tensor Parallelism (#5014) | 2023-11-07 15:01:50 +08:00 |
| loss.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| normalization.py | [npu] add npu support for gemini and zero (#5067) | 2023-11-20 16:12:41 +08:00 |
| parallel_module.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| qkv_fused_linear.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| utils.py | [npu] add npu support for hybrid plugin and llama (#5090) | 2023-11-22 19:23:21 +08:00 |
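
The files above provide tensor-parallel replacements for common layers, and the #5122 hotfix in the latest commit concerns how much memory is used when swapping a native module for its sharded counterpart. As a rough illustration of the column-parallel sharding that `linear.py` performs, here is a minimal single-process sketch. This is not ColossalAI's actual `Linear1D_Col` implementation; the class name `ColumnShardedLinear` and its constructor signature are hypothetical.

```python
# Illustrative sketch only: demonstrates column-parallel linear sharding,
# i.e. splitting a weight matrix along the output dimension across ranks.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ColumnShardedLinear(nn.Module):
    """Holds one shard of a linear layer's weight, split along the output dim."""

    def __init__(self, full_linear: nn.Linear, rank: int, world_size: int):
        super().__init__()
        out_features, _ = full_linear.weight.shape
        shard = out_features // world_size
        lo, hi = rank * shard, (rank + 1) * shard
        # Slice the pretrained weight rather than keeping the full tensor,
        # so each rank only materializes its own shard (the kind of copy
        # behavior the #5122 memory hotfix is concerned with).
        self.weight = nn.Parameter(full_linear.weight[lo:hi].detach().clone())
        self.bias = (
            nn.Parameter(full_linear.bias[lo:hi].detach().clone())
            if full_linear.bias is not None
            else None
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each rank computes a slice of the output; concatenating (or
        # all-gathering) the slices reconstructs the full output.
        return F.linear(x, self.weight, self.bias)

# Single-process check: four shards together reproduce the full layer.
full = nn.Linear(8, 16)
shards = [ColumnShardedLinear(full, r, world_size=4) for r in range(4)]
x = torch.randn(2, 8)
out = torch.cat([s(x) for s in shards], dim=-1)
assert torch.allclose(out, full(x), atol=1e-6)
```

In a real multi-GPU run, each rank would hold exactly one shard and the concatenation would be an all-gather over the tensor-parallel process group.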
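The #4926 commit on `__init__.py` refers to all-reducing LayerNorm gradients under sequence parallelism: each rank processes a different slice of the sequence, so the replicated norm weights accumulate different partial gradients that must be summed across the group. A hedged sketch of that technique using a plain PyTorch gradient hook follows; `sync_layernorm_grads` is a hypothetical helper, not ColossalAI's API.

```python
# Illustrative sketch only: sum LayerNorm parameter gradients across a
# sequence-parallel group. Assumes torch.distributed is already initialized.
import torch.distributed as dist
import torch.nn as nn

def sync_layernorm_grads(model: nn.Module, process_group=None):
    """Register hooks so LayerNorm weight/bias gradients are all-reduced
    (summed) across the sequence-parallel group during backward."""
    for module in model.modules():
        if isinstance(module, nn.LayerNorm):
            for param in module.parameters():
                def _all_reduce(grad, group=process_group):
                    # Fires on the gradient tensor before it is accumulated
                    # into param.grad; summing it makes all ranks agree.
                    dist.all_reduce(grad, group=group)
                    return grad
                param.register_hook(_all_reduce)
```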