An inquiry about the loss function in train.py lines 123-125 #15

Open
JerryHao2001 opened this issue Aug 4, 2023 · 0 comments
@JerryHao2001

Hello! While reading the paper associated with the code, I noticed that in the paper (page 5, formulas 5 & 6), L1 is the "nearest neighbor classification loss", L2 is the "global classification loss", and the total loss is L = lambda * L1 + L2. In the code at lines 123-125, however, the losses seem to be a bit mixed up:

123 loss1 = criterion(ytest, pids.view(-1))
124 loss2 = criterion(cls_scores, labels_test.view(-1))
125 loss = loss1 + 0.5 * loss2

I'm wondering why the nearest neighbor classification loss is computed from ytest and pids, which look like the inputs to the global classification loss. To make the question concrete, I've sketched below how I would have expected the formula to map onto the code.
Thank you for the help!
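
Here is a rough sketch reflecting my own reading of Eq. 5 & 6, not the authors' code: it assumes cls_scores/labels_test carry the nearest-neighbor (episode-level) predictions, ytest/pids carry the global (all-class) predictions, and the 0.5 in the snippet above plays the role of lambda.

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
lam = 0.5  # assumed to be the lambda weight from Eq. 6

def expected_total_loss(cls_scores, labels_test, ytest, pids):
    # L1: nearest neighbor classification loss over the episode's query samples
    loss_nn = criterion(cls_scores, labels_test.view(-1))
    # L2: global classification loss over all training classes
    loss_global = criterion(ytest, pids.view(-1))
    # Paper, Eq. 6: L = lambda * L1 + L2
    return lam * loss_nn + loss_global
```

Under this reading the weighted term would be the nearest-neighbor loss, whereas in train.py the 0.5 weight is applied to loss2 = criterion(cls_scores, labels_test.view(-1)), which is what prompted my question.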
