Frank Lee | 7cfed5f076 | 10 months ago
[feat] refactored extension module (#5298)
* [feat] refactored extension module
* polish
Xuanlei Zhao | dd2c28a323 | 11 months ago
[npu] use extension for op builder (#5172)
* update extension
* update cpu adam
* update is
* add doc for cpu adam
* update kernel
* update commit
* update flash
* update memory efficient
* update flash attn
* update flash attention loader
* update api
* fix
* update doc
* update example time limit
* reverse change
* fix doc
* remove useless kernel
* fix
* not use warning
* update
Hongxin Liu | 079bf3cb26 | 1 year ago
[misc] update pre-commit and run all files (#4752)
* [misc] update pre-commit
* [misc] run pre-commit
* [misc] remove useless configuration files
* [misc] ignore cuda for clang-format
flybird1111 | 7a3dfd0c64 | 1 year ago
[shardformer] update shardformer to use flash attention 2 (#4392)
* cherry-pick flash attention 2
* [shardformer] update shardformer to use flash attention 2, fix
flybird1111 | 25c57b9fb4 | 1 year ago
[fix] coloattention support flash attention 2 (#4347)
Improved the ColoAttention interface to support flash attention 2. Solves #4322.
Frank Lee | dd14783f75 | 2 years ago
[kernel] fixed repeated loading of kernels (#2549)
* [kernel] fixed repeated loading of kernels
* polish code
zbian | 6877121377 | 2 years ago
updated flash attention api
ver217 | f68eddfb3d | 3 years ago
refactor kernel (#142)
shenggan | 5c3843dc98 | 3 years ago
add colossalai kernel module (#55)