mirror of https://github.com/InternLM/InternLM
fix/fix_submodule_err
parent c7287e2584
commit 982a3d6813
@@ -8,7 +8,8 @@ The required packages and corresponding version are shown as follows:
 - CUDA == 11.7
 - Pytorch == 1.13.1+cu117
 - Transformers >= 4.25.1
-- Flash-Attention == 23.05
+- Flash-Attention == v1.0.5
+- Apex == 23.05
 - GPU with Ampere or Hopper architecture (such as H100, A100)
 - Linux OS
@@ -8,7 +8,8 @@
 - CUDA == 11.7
 - Pytorch == 1.13.1+cu117
 - Transformers >= 4.25.1
-- Flash-Attention == 23.05
+- Flash-Attention == v1.0.5
+- Apex == 23.05
 - GPU with Ampere or Hopper architecture (e.g., H100, A100)
 - Linux OS
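Both hunks pin the same environment (the second is the Chinese copy of the same list, translated above). As a sanity check against these pins, a minimal sketch in Python, assuming torch and transformers are installed in the environment under test; the check() helper is illustrative and not part of the commit:

import torch
import transformers

def check() -> None:
    # Pytorch == 1.13.1+cu117, built against CUDA == 11.7
    assert torch.__version__ == "1.13.1+cu117", torch.__version__
    assert torch.version.cuda == "11.7", torch.version.cuda
    # Transformers >= 4.25.1; compare numeric components, since a plain
    # string comparison mis-orders versions such as 4.9 vs 4.25
    parts = tuple(int(p) for p in transformers.__version__.split(".")[:3])
    assert parts >= (4, 25, 1), transformers.__version__
    # Ampere (sm_8x) or Hopper (sm_9x) GPU, e.g. A100 / H100
    major, _minor = torch.cuda.get_device_capability()
    assert major >= 8, f"need Ampere/Hopper, got sm_{major}x"
    print("environment matches the pinned requirements")

if __name__ == "__main__":
    check()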
@@ -1 +1 @@
-Subproject commit 8ffc901e50bbf740fdb6d5bccb17f66a6ec8604e
+Subproject commit 0da3ffb92ee6fbe5336602f0e3989db1cd16f880
@@ -1 +1 @@
-Subproject commit d2f4324f4c56e017fbf22dc421943793a8ca6c3b
+Subproject commit eff9fe6b8076df59d64d7a3f464696738a3c7c24
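The two hunks above only move submodule pointers, matching the branch name fix/fix_submodule_err. A minimal sketch of bringing a checkout in line with such updated pointers, assuming git is on PATH and the working directory is the superproject root; the sync_submodules() helper is illustrative, not part of the repository:

import subprocess

def sync_submodules(repo_root: str = ".") -> None:
    # Check out exactly the "Subproject commit" SHAs recorded in the
    # superproject, cloning any submodules not yet initialized.
    subprocess.run(
        ["git", "submodule", "update", "--init", "--recursive"],
        cwd=repo_root,
        check=True,
    )
    # Print the pinned SHAs so they can be compared against the diff above.
    subprocess.run(["git", "submodule", "status"], cwd=repo_root, check=True)

if __name__ == "__main__":
    sync_submodules()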