# InternLM Transformers

[English](./README.md) |
[简体中文](./README-zh-Hans.md)

This folder contains the `InternLM` model in transformers format.

## Weight Conversion

`convert2hf.py` can convert saved training weights into the transformers format with a single command.

```bash
python convert2hf.py --src_folder origin_ckpt/ --tgt_folder hf_ckpt/ --tokenizer ../v7_sft.model
```

Then, you can load it using the `from_pretrained` interface:

```python
from modeling_internlm import InternLMForCausalLM

model = InternLMForCausalLM.from_pretrained("hf_ckpt/")
```
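
Once loaded, the model behaves like any other transformers causal LM. Below is a minimal generation sketch; it assumes that `convert2hf.py` also wrote the tokenizer files into `hf_ckpt/` and that this folder provides an `InternLMTokenizer` class in `tokenization_internlm.py` (both are assumptions, not guaranteed by this README).

```python
import torch

from modeling_internlm import InternLMForCausalLM
from tokenization_internlm import InternLMTokenizer  # assumption: tokenizer module/class name

# Assumption: convert2hf.py saved the tokenizer files alongside the model weights.
tokenizer = InternLMTokenizer.from_pretrained("hf_ckpt/")
model = InternLMForCausalLM.from_pretrained("hf_ckpt/")
model.eval()

inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```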

`intern_moss_example.py` shows how to use LoRA to fine-tune on the `fnlp/moss-moon-002-sft` dataset.
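
For orientation, here is a minimal sketch of the general LoRA pattern using the Hugging Face `peft` library; the rank, scaling, and target module names are illustrative assumptions and may differ from what `intern_moss_example.py` actually does.

```python
# Illustrative LoRA setup with peft; hyperparameters and target module
# names are assumptions, not taken from intern_moss_example.py.
from peft import LoraConfig, TaskType, get_peft_model

from modeling_internlm import InternLMForCausalLM

model = InternLMForCausalLM.from_pretrained("hf_ckpt/")

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=32,                        # scaling factor applied to the updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumed attention projection names
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small adapter matrices are trainable
```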