ColossalAI/colossalai/nn/metric/_utils.py


import torch


def calc_acc(logits, targets):
    # Predicted class is the index of the largest logit per sample.
    preds = torch.argmax(logits, dim=-1)
    # Returns the number of correct predictions as a 0-dim tensor,
    # not the accuracy ratio; callers divide by the batch size.
    correct = torch.sum(targets == preds)
    return correct
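
A minimal usage sketch, with illustrative tensor values (not from the source): for a batch of 4 samples over 3 classes, calc_acc yields the raw count of correct predictions, and dividing by the batch size gives the accuracy.

# Hypothetical example inputs: 4 samples, 3 classes.
logits = torch.tensor([[2.0, 0.1, 0.3],
                       [0.2, 1.5, 0.1],
                       [0.1, 0.2, 3.0],
                       [1.0, 0.9, 0.8]])
targets = torch.tensor([0, 1, 2, 1])

correct = calc_acc(logits, targets)           # tensor(3): predictions are [0, 1, 2, 0]
accuracy = correct.item() / targets.size(0)   # 3 / 4 = 0.75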