ColossalAI/examples/resnet_cifar10_data_parallel/config.py

from colossalai.amp import AMP_TYPE

# Training hyperparameters for the ResNet CIFAR-10 data-parallel example.
BATCH_SIZE = 128
NUM_EPOCHS = 200

# ColossalAI config: enable mixed-precision (fp16) training backed by PyTorch AMP.
CONFIG = dict(
    fp16=dict(
        mode=AMP_TYPE.TORCH
    )
)
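For context, a minimal sketch of how a config file like this is typically consumed in the ColossalAI examples: the path is passed to the launcher, after which the values become available through the global context. The entry points used below (colossalai.launch_from_torch, colossalai.core.global_context) reflect the older legacy API that these examples targeted; treat them as an assumption and verify against the ColossalAI version you have installed.

# Sketch only: assumes the legacy ColossalAI launch API (launch_from_torch +
# global_context). Verify these entry points exist in your installed version.
import colossalai
from colossalai.core import global_context as gpc


def main():
    # Parse config.py and set up the distributed environment from torchrun env vars.
    colossalai.launch_from_torch(config='./config.py')

    # Values defined in config.py are now accessible via the global context.
    batch_size = gpc.config.BATCH_SIZE
    num_epochs = gpc.config.NUM_EPOCHS
    print(f"batch size: {batch_size}, epochs: {num_epochs}")


if __name__ == '__main__':
    main()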