From 177c79f2d17091e4e74a79dab363cc0f57f0a0eb Mon Sep 17 00:00:00 2001
From: binmakeswell
Date: Tue, 28 Nov 2023 17:44:06 +0800
Subject: [PATCH] [doc] add moe news (#5128)

* [doc] add moe news

* [doc] add moe news

* [doc] add moe news
---
 README.md                           | 13 ++++++++++++-
 docs/README-zh-Hans.md              | 13 ++++++++++++-
 examples/language/openmoe/README.md |  9 +++++++++
 3 files changed, 33 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 1898d255e..98592b85c 100644
--- a/README.md
+++ b/README.md
@@ -25,7 +25,8 @@

 ## Latest News
-* [2023/09] [One Half-Day of Training Using a Few Hundred Dollars Yields Similar Results to Mainstream Large Models, Open-Source and Commercial-Free Domain-Specific Llm Solution](https://www.hpc-ai.tech/blog/one-half-day-of-training-using-a-few-hundred-dollars-yields-similar-results-to-mainstream-large-models-open-source-and-commercial-free-domain-specific-llm-solution)
+* [2023/11] [Enhanced MoE Parallelism, Open-source MoE Model Training Can Be 9 Times More Efficient](https://www.hpc-ai.tech/blog/enhanced-moe-parallelism-open-source-moe-model-training-can-be-9-times-more-efficient)
+* [2023/09] [One Half-Day of Training Using a Few Hundred Dollars Yields Similar Results to Mainstream Large Models, Open-Source and Commercial-Free Domain-Specific LLM Solution](https://www.hpc-ai.tech/blog/one-half-day-of-training-using-a-few-hundred-dollars-yields-similar-results-to-mainstream-large-models-open-source-and-commercial-free-domain-specific-llm-solution)
 * [2023/09] [70 Billion Parameter LLaMA2 Model Training Accelerated by 195%](https://www.hpc-ai.tech/blog/70b-llama2-training)
 * [2023/07] [HPC-AI Tech Raises 22 Million USD in Series A Funding](https://www.hpc-ai.tech/blog/hpc-ai-tech-raises-22-million-usd-in-series-a-funding-to-fuel-team-expansion-and-business-growth)
 * [2023/07] [65B Model Pretraining Accelerated by 38%, Best Practices for Building LLaMA-Like Base Models Open-Source](https://www.hpc-ai.tech/blog/large-model-pretraining)
@@ -52,6 +53,7 @@
       Parallel Training Demo