From 982a3d6813d485c26578c6c46c78e7181937b6cc Mon Sep 17 00:00:00 2001
From: ChenQiaoling00
Date: Wed, 12 Jul 2023 18:47:48 +0800
Subject: [PATCH] fix/fix_submodule_err

---
 doc/en/install.md           | 3 ++-
 doc/install.md              | 3 ++-
 third_party/apex            | 2 +-
 third_party/flash-attention | 2 +-
 4 files changed, 6 insertions(+), 4 deletions(-)

diff --git a/doc/en/install.md b/doc/en/install.md
index 5ff70fb..2799449 100644
--- a/doc/en/install.md
+++ b/doc/en/install.md
@@ -8,7 +8,8 @@ The required packages and corresponding version are shown as follows:
 - CUDA == 11.7
 - Pytorch == 1.13.1+cu117
 - Transformers >= 4.25.1
-- Flash-Attention == 23.05
+- Flash-Attention == v1.0.5
+- Apex == 23.05
 - GPU with Ampere or Hopper architecture (such as H100, A100)
 - Linux OS

diff --git a/doc/install.md b/doc/install.md
index c0d6434..c8eceeb 100644
--- a/doc/install.md
+++ b/doc/install.md
@@ -8,7 +8,8 @@
 - CUDA == 11.7
 - Pytorch == 1.13.1+cu117
 - Transformers >= 4.25.1
-- Flash-Attention == 23.05
+- Flash-Attention == v1.0.5
+- Apex == 23.05
 - Ampere或者Hopper架构的GPU (例如H100, A100)
 - Linux OS

diff --git a/third_party/apex b/third_party/apex
index 8ffc901..0da3ffb 160000
--- a/third_party/apex
+++ b/third_party/apex
@@ -1 +1 @@
-Subproject commit 8ffc901e50bbf740fdb6d5bccb17f66a6ec8604e
+Subproject commit 0da3ffb92ee6fbe5336602f0e3989db1cd16f880
diff --git a/third_party/flash-attention b/third_party/flash-attention
index d2f4324..eff9fe6 160000
--- a/third_party/flash-attention
+++ b/third_party/flash-attention
@@ -1 +1 @@
-Subproject commit d2f4324f4c56e017fbf22dc421943793a8ca6c3b
+Subproject commit eff9fe6b8076df59d64d7a3f464696738a3c7c24
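
Note for anyone pulling this change into an existing checkout: the patch moves the two gitlink pointers above to new commits, so the local working tree has to be re-synced or the previously checked-out apex/flash-attention sources will still be used. A minimal sketch of the usual workflow (standard git commands; the submodule paths are taken from the diff above, everything else is assumed setup):

    # Re-read submodule config, then check out the commits pinned by this patch
    git submodule sync --recursive
    git submodule update --init --recursive third_party/apex third_party/flash-attention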