
## Installation of InternLM Dependencies
### Environment Preparation
First, install the required dependency packages at the following versions:
- Python == 3.10
- GCC == 10.2.0
- MPFR == 4.1.0
- CUDA == 11.7
- PyTorch == 1.13.1+cu117
- Transformers >= 4.25.1
- Flash-Attention == v1.0.5
- Apex == 23.05
- GPU with Ampere or Hopper architecture (e.g. H100, A100)
- Linux OS

Once the above dependencies are installed, update the system environment variables accordingly:
```bash
export CUDA_PATH={path_of_cuda_11.7}
export GCC_HOME={path_of_gcc_10.2.0}
export MPFR_HOME={path_of_mpfr_4.1.0}
export LD_LIBRARY_PATH=${GCC_HOME}/lib64:${MPFR_HOME}/lib:${CUDA_PATH}/lib64:$LD_LIBRARY_PATH
export PATH=${GCC_HOME}/bin:${CUDA_PATH}/bin:$PATH
export CC=${GCC_HOME}/bin/gcc
export CXX=${GCC_HOME}/bin/c++
```
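Before building anything against this toolchain, it can be worth confirming that the exported directories actually exist. A minimal sketch (`check_dir` is a hypothetical helper, not part of the project):

```bash
# Sanity check (a sketch): confirm the exported toolchain directories exist
# before attempting any builds. check_dir is a hypothetical helper.
check_dir() {
  if [ -d "$1" ]; then echo "ok: $1"; else echo "missing: $1"; fi
}
check_dir "${GCC_HOME}/bin"
check_dir "${MPFR_HOME}/lib"
check_dir "${CUDA_PATH}/lib64"
```

Once `PATH` is updated, `gcc --version` and `nvcc --version` should report 10.2.0 and 11.7 respectively.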
### Environment Installation
Clone the `internlm` project together with its dependent submodules from the GitHub repository:
```bash
git clone git@github.com:InternLM/InternLM.git --recurse-submodules
```
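If the repository was cloned without `--recurse-submodules`, the `third_party` submodules can still be fetched afterwards with standard git commands, run from inside the checkout:

```bash
# Fetch submodules for a clone made without --recurse-submodules.
# The guard only prevents running this outside a git checkout.
if git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
  git submodule update --init --recursive
else
  echo "run this inside the InternLM checkout"
fi
```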
It is recommended to use conda to create a Python 3.10 virtual environment, then install the dependencies listed in the `requirements/` files:
```bash
conda create --name internlm-env python=3.10 -y
conda activate internlm-env
cd InternLM
pip install -r requirements/torch.txt
pip install -r requirements/runtime.txt
```
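A hedged check that pip picked up the pinned PyTorch build (the expected version string comes from the dependency list above; adjust it if your requirements pin differs):

```bash
# Check (a sketch) that the installed torch matches the pinned build.
expected="1.13.1+cu117"
installed=$(pip show torch 2>/dev/null | awk '/^Version:/{print $2}')
if [ "$installed" = "$expected" ]; then
  echo "torch ${installed} OK"
else
  echo "torch version mismatch: got '${installed}', expected ${expected}"
fi
```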
Install flash-attention (version v1.0.5):
```bash
cd ./third_party/flash-attention
python setup.py install
cd ./csrc
cd fused_dense_lib && pip install -v .
cd ../xentropy && pip install -v .
cd ../rotary && pip install -v .
cd ../layer_norm && pip install -v .
cd ../../../../
```
Install Apex (version 23.05):
```bash
cd ./third_party/apex
pip install -v --disable-pip-version-check --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./
cd ../../
```