# Config file
Here is a config file example showing how to train a ViT model on the CIFAR10 dataset using Colossal-AI:
```python
from colossalai.amp import AMP_TYPE

# optional
# two keys: pipeline, tensor
# data parallel size is inferred
parallel = dict(
    pipeline=dict(size=1),
    tensor=dict(size=4, mode='2d'),
)
# optional
# mixed precision settings; works with both pipeline and non-pipeline schedules
fp16 = dict(
    mode=AMP_TYPE.NAIVE,
    initial_scale=2 ** 8
)
# optional
# configuration for zero
# you can refer to the Zero Redundancy optimizer and zero offload section for details
# https://www.colossalai.org/zero.html
zero = dict(
    level=<int>,
    ...
)
# optional
# only needed if you use complex gradient handling;
# otherwise, leave this out of your config file
# default gradient_handlers = None
gradient_handlers = [dict(type='MyHandler', arg1=1, arg2=2), ...]
# optional
# specify the gradient accumulation size,
# useful if your batch size is not large enough
gradient_accumulation = <int>
# optional
# add gradient clipping to your engine
# this config is not compatible with zero and AMP_TYPE.NAIVE
# but works with AMP_TYPE.TORCH and AMP_TYPE.APEX
# default clip_grad_norm = 0.0
clip_grad_norm = <float>
# optional
# cudnn setting
# default is like below
cudnn_benchmark = False
cudnn_deterministic = True
```
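Config files like the one above are plain Python modules: each top-level assignment becomes a config entry. As a minimal sketch of how such a file can be ingested, the snippet below executes a config `.py` file and collects its top-level names into a dictionary. Note that `load_config` is a hypothetical helper for illustration only, not Colossal-AI's actual config API.

```python
import os
import runpy
import tempfile


def load_config(path):
    """Execute a config .py file and return its top-level names as a dict.

    Hypothetical helper sketching the general pattern of Python-file
    config ingestion; not the actual Colossal-AI implementation.
    """
    namespace = runpy.run_path(path)
    # keep only plain config entries, dropping dunders and module internals
    return {k: v for k, v in namespace.items() if not k.startswith('_')}


# write a tiny config file and load it back
with tempfile.TemporaryDirectory() as tmp:
    cfg_path = os.path.join(tmp, 'config.py')
    with open(cfg_path, 'w') as f:
        f.write("parallel = dict(pipeline=dict(size=1), "
                "tensor=dict(size=4, mode='2d'))\n")
        f.write("clip_grad_norm = 1.0\n")
    config = load_config(cfg_path)

print(config['parallel']['tensor']['mode'])  # prints 2d
```

In actual training scripts, the config file path is typically passed to the framework's launch routine, which parses it the same way and makes the entries available as attributes.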