# InternLM
👋 join us on Discord and WeChat
## Introduction

InternLM is an open-source, lightweight training framework that aims to support model pre-training without the need for extensive dependencies. With a single codebase, it supports pre-training on large-scale clusters with thousands of GPUs and fine-tuning on a single GPU, while achieving remarkable performance optimizations. InternLM achieves nearly 90% acceleration efficiency when training on 1024 GPUs. Based on the InternLM training framework, we have released two open-source pretrained models: InternLM-7B and InternLM-20B.

## News

- [20231213] The InternLM-7B-Chat and InternLM-20B-Chat checkpoints have been updated. With an improved fine-tuning strategy, the new chat models generate higher-quality responses with greater stylistic diversity.
- [20230920] InternLM-20B is released with base and chat versions.

## Model Zoo

Our models are released on three platforms: Transformers, ModelScope, and OpenXLab. There are two kinds of model weights:

1. Hugging Face format (marked as HF).
2. Original model weights (marked as Original), provided on OpenXLab, which can be loaded by InternLM and fine-tuned directly.

| Model | Transformers(HF) | ModelScope(HF) | OpenXLab(HF) | OpenXLab(Original) | Release Date |
|-------|------------------|----------------|--------------|--------------------|--------------|
| **InternLM Chat 20B** | [🤗internlm/internlm-chat-20b](https://huggingface.co/internlm/internlm-20b-chat) | [
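
The HF-format checkpoints above can be loaded with the Hugging Face `transformers` library. A minimal sketch, assuming `transformers` is installed and using the model ID from the table (the function name `load_internlm` is illustrative, not part of the InternLM API):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer


def load_internlm(model_id: str = "internlm/internlm-20b-chat"):
    # trust_remote_code=True is needed because InternLM ships custom
    # modeling code alongside its weights on the Hub.
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        device_map="auto",  # spread layers across available GPUs
    )
    return tokenizer, model
```

Note that the 20B chat model requires substantial GPU memory; the 7B variants can be loaded the same way by swapping the model ID.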