mirror of https://github.com/InternLM/InternLM
docs(install.md): update dependency package transformers version to >= 4.28.0 (#124)
Co-authored-by: 黄婷 <huangting3@CN0014010744M.local>
parent 1095263082
commit acea4554ec
@@ -7,7 +7,7 @@ The required packages and corresponding version are shown as follows:
 - MPFR == 4.1.0
 - CUDA == 11.7
 - Pytorch == 1.13.1+cu117
-- Transformers >= 4.25.1
+- Transformers >= 4.28.0
 - Flash-Attention == v1.0.5
 - Apex == 23.05
 - GPU with Ampere or Hopper architecture (such as H100, A100)
@@ -7,7 +7,7 @@
 - MPFR == 4.1.0
 - CUDA == 11.7
 - Pytorch == 1.13.1+cu117
-- Transformers >= 4.25.1
+- Transformers >= 4.28.0
 - Flash-Attention == v1.0.5
 - Apex == 23.05
 - GPU with Ampere or Hopper architecture (e.g. H100, A100)
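Whether a local environment meets the raised floor (Transformers >= 4.28.0) can be checked with the standard library. A minimal sketch, not part of the repo: the `requirements` mapping and the `parse` helper below are illustrative assumptions, and the `+cu117` local-suffix handling mirrors the Pytorch pin above.

```python
# Illustrative sketch: compare installed package versions against the
# minimums listed in this commit. Not taken from the InternLM repo.
from importlib.metadata import version, PackageNotFoundError

def parse(v: str):
    # "1.13.1+cu117" -> (1, 13, 1): drop the local suffix, keep numeric parts.
    return tuple(int(p) for p in v.split("+")[0].split(".") if p.isdigit())

# Assumed mapping of pip names to the pins in the diff above.
requirements = {
    "transformers": (">=", "4.28.0"),
    "torch": ("==", "1.13.1"),
}

for pkg, (op, want) in requirements.items():
    try:
        have = version(pkg)
    except PackageNotFoundError:
        print(f"{pkg}: not installed (need {op} {want})")
        continue
    ok = parse(have) >= parse(want) if op == ">=" else parse(have) == parse(want)
    print(f"{pkg}: {have} ({'ok' if ok else f'need {op} {want}'})")
```

Tuple comparison makes `>=` behave numerically (4.28.0 beats 4.9.0 even though "4.28.0" < "4.9.0" as strings), which is why the versions are parsed rather than compared as text.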