ColossalAI/colossalai/shardformer/layer

Latest commit ef4c14a5e2 by Jianghai: [Inference] Fix bug in ChatGLM2 Tensor Parallelism (#5014), 1 year ago
File                  Last commit                                                                  Age
__init__.py           [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)   1 year ago
_operation.py
dropout.py
embedding.py
linear.py             [Inference] Fix bug in ChatGLM2 Tensor Parallelism (#5014)                   1 year ago
loss.py
normalization.py      [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)   1 year ago
parallel_module.py
qkv_fused_linear.py
utils.py              [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)   1 year ago