ColossalAI/colossalai/booster/accelerator.py


import torch
import torch.nn as nn

__all__ = ['Accelerator']


class Accelerator:

    def __init__(self, device: torch.device):
        self.device = device

    def setup_model(self, model: nn.Module) -> nn.Module:
        # TODO: implement this method
        pass
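
The setup_model hook is still a TODO in this revision. Below is a minimal sketch of what it might do, assuming the accelerator's only responsibility at this stage is device placement (the -> nn.Module annotation suggests the prepared model is returned). The standalone function is illustrative only, not the library's actual implementation:

import torch
import torch.nn as nn

def setup_model(model: nn.Module, device: torch.device) -> nn.Module:
    # nn.Module.to() moves the module's parameters and buffers to the
    # target device and returns the module itself, which satisfies the
    # -> nn.Module annotation.
    return model.to(device)

# Hypothetical usage: place a small model on the available device.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = setup_model(nn.Linear(16, 4), device)

Inside the class, the equivalent body would be `return model.to(self.device)`.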