Commit Graph

2 Commits (4f68b3f10ce55a3563f943f8163b460d8c9fbb19)

Author SHA1 Message Date
Zian (Andy) Zheng 7768afbad0 Update flash_attention_patch.py
Makes the patch compatible with a recent change in the Transformers library, where a new argument `padding_mask` was added to the forward function of the attention layer.
https://github.com/huggingface/transformers/pull/25598
2023-10-16 14:00:45 +08:00
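The compatibility fix above boils down to accepting the new keyword argument. A minimal sketch of the idea, assuming hypothetical names (this is not the actual ColossalAI patch): a monkey-patched attention forward simply adds `padding_mask` to its signature so newer Transformers versions can pass it without raising a `TypeError`.

```python
# Illustrative sketch of the compatibility pattern: accept the `padding_mask`
# keyword added by https://github.com/huggingface/transformers/pull/25598.
# `patched_attention_forward` and its argument types are hypothetical stand-ins
# for the real patched flash-attention forward.
from typing import Any, Optional


def patched_attention_forward(
    hidden_states: Any,
    attention_mask: Optional[Any] = None,
    padding_mask: Optional[Any] = None,  # new kwarg; accepted for compatibility
    **kwargs: Any,
) -> Any:
    # A real implementation would dispatch to flash attention here;
    # this stub just passes the input through so the signature can be tested.
    return hidden_states
```

Because the extra argument is accepted (and any further additions are absorbed by `**kwargs`), the same patch works with both older and newer Transformers call sites.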
Tong Li 74aa7d964a initial commit: add colossal llama 2 (#4784)
2023-09-24 23:12:26 +08:00