InternLM/tools/transformers
README-zh-Hans.md
README.md
configuration_internlm.py
convert2hf.py
intern_moss_example.py
internlm_sft_on_moss.py
modeling_internlm.py
tokenization_internlm.py

README.md

InternLM Transformers

English | 简体中文

This folder contains the InternLM model in transformers format.

Weight Conversion

convert2hf.py can convert weights saved during training into the transformers format with a single command:

python convert2hf.py --src_folder origin_ckpt/ --tgt_folder hf_ckpt/ --tokenizer ../v7_sft.model

Then, you can load it using the from_pretrained interface:

from modeling_internlm import InternLMForCausalLM

model = InternLMForCausalLM.from_pretrained("hf_ckpt/")
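
As a minimal usage sketch (assuming the converted hf_ckpt/ folder also contains the tokenizer files and that InternLMTokenizer in tokenization_internlm.py is the matching tokenizer class), text generation could look like this:

import torch
from modeling_internlm import InternLMForCausalLM
from tokenization_internlm import InternLMTokenizer

# "hf_ckpt/" is the output folder produced by convert2hf.py above.
tokenizer = InternLMTokenizer.from_pretrained("hf_ckpt/")
model = InternLMForCausalLM.from_pretrained("hf_ckpt/", torch_dtype=torch.float16).cuda()
model.eval()

# Tokenize a prompt and generate a short continuation.
inputs = tokenizer("Hello, InternLM!", return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))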

intern_moss_example.py shows an example of using LoRA to fine-tune InternLM on the fnlp/moss-moon-002-sft dataset.
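
A minimal sketch of such a LoRA setup with the Hugging Face peft library is shown below; the hyperparameters and target module names are illustrative assumptions, see intern_moss_example.py for the actual training script.

from peft import LoraConfig, get_peft_model
from modeling_internlm import InternLMForCausalLM

# Load the converted base model; "hf_ckpt/" is the folder produced by convert2hf.py.
model = InternLMForCausalLM.from_pretrained("hf_ckpt/")

# Illustrative LoRA configuration; rank, alpha, and target modules are assumptions.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)

# Wrap the base model so that only the low-rank adapter weights are trainable.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()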