ColossalAI/colossalai/nn/parallel/__init__.py


from .data_parallel import ColoDDP, ZeroDDP
from .gemini_parallel import GeminiDDP

__all__ = ['ColoDDP', 'ZeroDDP', 'GeminiDDP']