Commits (822241a99cca799e1fca250ff2fb7f54ea0f8dcd)

Author        SHA1        Message                                                             Date
Frank Lee     7cfed5f076  [feat] refactored extension module (#5298)                          10 months ago
Xuanlei Zhao  dd2c28a323  [npu] use extension for op builder (#5172)                          11 months ago
Hongxin Liu   079bf3cb26  [misc] update pre-commit and run all files (#4752)                  1 year ago
flybird1111   7a3dfd0c64  [shardformer] update shardformer to use flash attention 2 (#4392)   1 year ago
flybird1111   25c57b9fb4  [fix] coloattention support flash attention 2 (#4347)               1 year ago
Frank Lee     dd14783f75  [kernel] fixed repeated loading of kernels (#2549)                  2 years ago
zbian         6877121377  updated flash attention api                                         2 years ago
ver217        f68eddfb3d  refactor kernel (#142)                                              3 years ago
shenggan      5c3843dc98  add colossalai kernel module (#55)                                  3 years ago