Making large AI models cheaper, faster and more accessible
Latest commit: updated flash attention api (zbian, 6877121377, 2 years ago)
csrc                     [hotfix] fix build error when torch version >= 1.13 (#1803)                     2 years ago
__init__.py              updated flash attention api                                                     2 years ago
flash_attention.py       updated flash attention api                                                     2 years ago
layer_norm.py            [NFC] polish colossalai/kernel/cuda_native/layer_norm.py code style (#980)      3 years ago
multihead_attention.py
scaled_softmax.py        [NFC] polish colossalai/kernel/cuda_native/scaled_softmax.py code style (#955)  3 years ago