Hotfix/tutorial readme index (#1922)

* [tutorial] removed tutorial index in readme

* [tutorial] removed tutorial index in readme
pull/1924/head
Frank Lee 2022-11-12 18:24:52 +08:00 committed by GitHub
parent 24cbee0ebe
commit d43a671ad6
5 changed files with 16 additions and 17 deletions

@@ -1,4 +1,4 @@
-# Handson 3: Auto-Parallelism with ResNet
+# Auto-Parallelism with ResNet
 ## Prepare Dataset

@@ -1,4 +1,4 @@
-# Handson 1: Multi-dimensional Parallelism with Colossal-AI
+# Multi-dimensional Parallelism with Colossal-AI
 ## Install Titans Model Zoo

@@ -1,4 +1,4 @@
-# Handson 4: Comparison of Large Batch Training Optimization
+# Comparison of Large Batch Training Optimization
 ## Prepare Dataset

@@ -1 +1 @@
-# Handson 5: Fine-tuning and Serving for OPT from Hugging Face
+# Fine-tuning and Serving for OPT from Hugging Face

@@ -1,4 +1,4 @@
-# Handson 2: Sequence Parallelism with BERT
+# Sequence Parallelism with BERT
 In this example, we implemented BERT with sequence parallelism. Sequence parallelism splits the input tensor and intermediate
 activation along the sequence dimension. This method can achieve better memory efficiency and allows us to train with larger batch size and longer sequence length.
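Note: the README context above describes splitting inputs and intermediate activations along the sequence dimension. As a rough illustration only (not the repository's implementation), the split can be pictured as each rank keeping one sequence chunk of a `(batch, seq_len, hidden)` tensor; the helper name below is invented for this sketch.

```python
# Illustrative sketch of a sequence-dimension split (not ColossalAI's code):
# each of `world_size` ranks keeps only its own chunk along dim=1 (seq_len).
import torch

def split_along_sequence(x: torch.Tensor, world_size: int, rank: int) -> torch.Tensor:
    # x: (batch, seq_len, hidden); assumes seq_len is divisible by world_size
    chunks = torch.chunk(x, world_size, dim=1)
    return chunks[rank].contiguous()

x = torch.randn(8, 512, 1024)                      # batch=8, seq_len=512, hidden=1024
local_x = split_along_sequence(x, world_size=4, rank=0)
print(local_x.shape)                               # torch.Size([8, 128, 1024])
```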
@@ -140,4 +140,3 @@ machine setting.
 launch_from_slurm` or `colossalai.launch_from_openmpi` as it is easier to use SLURM and OpenMPI
 to start multiple processes over multiple nodes. If you have your own launcher, you can fall back
 to the default `colossalai.launch` function.
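For context on the launch functions referenced in this hunk, a minimal sketch of how they are typically invoked follows; the argument names (`config`, `host`, `port`, `backend`) assume the ColossalAI API of that period and should be checked against the installed version.

```python
# Hedged sketch of the launchers mentioned above; verify argument names against
# the ColossalAI version you actually use.
import colossalai

# On a SLURM cluster: rank and world size are read from SLURM environment variables.
colossalai.launch_from_slurm(config={}, host='node001', port=29500)

# With OpenMPI: rank and world size come from the OMPI environment instead.
# colossalai.launch_from_openmpi(config={}, host='node001', port=29500)

# Custom launcher: fall back to the generic entry point and pass everything explicitly.
# colossalai.launch(config={}, rank=rank, world_size=world_size,
#                   host='node001', port=29500, backend='nccl')
```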