From 74b60896c479c94ce8f9588437e859aba5735901 Mon Sep 17 00:00:00 2001
From: Kai Chen
Date: Wed, 17 Jan 2024 12:19:28 +0800
Subject: [PATCH] Update README.md

---
 README.md | 10 ----------
 1 file changed, 10 deletions(-)

diff --git a/README.md b/README.md
index d6624b1..267e26b 100644
--- a/README.md
+++ b/README.md
@@ -119,16 +119,6 @@ The release of InternLM2 series contains two model sizes: 7B and 20B. 7B models
 * According to the released performance of 2024-01-17.
 
-### Data Contamination
-
-| Method                      | GSM-8k | English Knowledge | Chinese Knowledge | Coding |
-|-----------------------------|--------|-------------------|-------------------|--------|
-| Average of Open-source LLMs | -0.02  | -0.13             | -0.20             | -0.07  |
-| InternLM2-Base-7B           | 0.09   | -0.13             | -0.16             | 0.03   |
-| InternLM2-7B                | 0.02   | -0.12             | -0.16             | 0.05   |
-| InternLM2-Base-20B          | 0.08   | -0.13             | -0.17             | -0.02  |
-| InternLM2-20B               | 0.04   | -0.13             | -0.19             | -0.02  |
-
 ## Usages
 
 We briefly show the usages with [Transformers](#import-from-transformers), [ModelScope](#import-from-modelscope), and [Web demos](#dialogue).