mirror of https://github.com/hpcaitech/ColossalAI
import torch


def calc_acc(logits, targets):
    # Returns the number of correct predictions; divide by the batch
    # size to obtain the accuracy as a fraction.
    preds = torch.argmax(logits, dim=-1)
    correct = torch.sum(targets == preds)
    return correct
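A quick sanity check of `calc_acc` (an illustrative usage sketch, not part of the original file; the function is repeated so the snippet is self-contained):

```python
import torch


def calc_acc(logits, targets):
    # Count how many argmax predictions match the targets.
    preds = torch.argmax(logits, dim=-1)
    correct = torch.sum(targets == preds)
    return correct


# Three samples, two classes: rows argmax to classes [0, 1, 0].
logits = torch.tensor([[2.0, 0.5],
                       [0.1, 1.0],
                       [3.0, -1.0]])
targets = torch.tensor([0, 1, 1])

correct = calc_acc(logits, targets)
print(correct.item())                      # 2 of 3 predictions are correct
print(correct.item() / targets.numel())   # accuracy as a fraction
```

Note that the function returns the raw count of correct predictions rather than a ratio, so callers are expected to divide by the number of samples themselves.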