ColossalAI/colossalai/kernel/cuda_native

Latest commit: 7a3dfd0c64 by flybird1111 — [shardformer] update shardformer to use flash attention 2 (#4392), 1 year ago
Name                      Last commit message                                                 Age
csrc                      [bf16] add bf16 support (#3882)                                     2 years ago
mha                       [coloattention] fix import error (#4380)                            1 year ago
__init__.py               [shardformer] update shardformer to use flash attention 2 (#4392)   1 year ago
layer_norm.py             [kernel] fixed repeated loading of kernels (#2549)                  2 years ago
multihead_attention.py    [nfc] fix typo colossalai/cli fx kernel (#3847)                     2 years ago
scaled_softmax.py         [fix] coloattention support flash attention 2 (#4347)               1 year ago