ColossalAI/colossalai/kernel/__init__.py


# Re-export the CUDA-native fused kernels at the package level,
# so callers can import them directly from `colossalai.kernel`.
from .cuda_native import LayerNorm, FusedScaleMaskSoftmax, MultiHeadAttention

__all__ = ["LayerNorm", "FusedScaleMaskSoftmax", "MultiHeadAttention"]
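The `__all__` list above controls which names a wildcard import (`from colossalai.kernel import *`) exposes. A minimal standalone sketch of that behavior, using a hypothetical in-memory module (`demo_kernel`, not ColossalAI itself) so it runs without the CUDA extensions installed:

```python
import sys
import types

# Hypothetical stand-in module demonstrating that `__all__`
# restricts the names exported by a wildcard import.
mod = types.ModuleType("demo_kernel")
mod.LayerNorm = object()          # listed in __all__: exported
mod._private_helper = object()    # not listed: hidden from `import *`
mod.__all__ = ["LayerNorm"]
sys.modules["demo_kernel"] = mod

ns = {}
exec("from demo_kernel import *", ns)
print("LayerNorm" in ns)        # True
print("_private_helper" in ns)  # False
```

Explicit imports (`from demo_kernel import _private_helper`) still work regardless of `__all__`; the list only governs the wildcard form and signals the module's intended public API.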