ColossalAI/colossalai/shardformer/layer

Latest commit: Yuanheng Zhao, df6747603f, 6 months ago
[Colossal-Inference] (v0.1.0) Merge pull request #5739 from hpcaitech/feature/colossal-infer
__init__.py          [shardformer] refactor embedding resize (#5603)                                    7 months ago
_operation.py
attn.py              [coloattention]modify coloattention (#5627)                                        7 months ago
dropout.py
embedding.py         [Inference] Fix bugs and docs for feat/online-server (#5598)                       7 months ago
linear.py            [shardformer] refactor embedding resize (#5603)                                    7 months ago
loss.py              [Shardformer] Add parallel output for shardformer models(bloom, falcon) (#5702)    6 months ago
normalization.py     [shardformer] fix chatglm implementation (#5644)                                   7 months ago
parallel_module.py   [shardformer] refactor embedding resize (#5603)                                    7 months ago
qkv_fused_linear.py
utils.py