
Question about the loss #42

Open
aks1207 opened this issue May 17, 2023 · 0 comments
Labels
bug Something isn't working

aks1207 commented May 17, 2023

What does the term lap_loss indicate intuitively? What are we trying to optimise or capture here?

D = torch.diag(torch.sum(self.masked_adj[0], 0))  # degree matrix D built from the masked adjacency
m_adj = self.masked_adj if self.graph_mode else self.masked_adj[self.graph_idx]
L = D - m_adj  # combinatorial graph Laplacian L = D - A
pred_label_t = torch.tensor(pred_label, dtype=torch.float)
if self.args.gpu:
    pred_label_t = pred_label_t.cuda()
    L = L.cuda()
if self.graph_mode:
    lap_loss = 0
else:
    # Laplacian quadratic form f^T L f, scaled by a coefficient and
    # normalised by the number of entries in the adjacency matrix
    lap_loss = (
        self.coeffs["lap"]
        * (pred_label_t @ L @ pred_label_t)
        / self.adj.numel()
    )
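If I understand correctly, for the combinatorial Laplacian L = D - A, the quadratic form f^T L f equals 0.5 * sum_ij A_ij (f_i - f_j)^2, i.e. it penalises large label differences across edges, so minimising it encourages the explanation to keep edges between nodes with similar predicted labels (a smoothness prior). A minimal sketch on a hypothetical toy graph (a 3-node path, plain Python instead of torch) showing the two forms agree:

```python
# Hypothetical toy path graph 0-1-2 with a scalar signal f on the nodes.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
f = [0.0, 1.0, 3.0]
n = len(f)

deg = [sum(row) for row in adj]  # node degrees (diagonal of D)

# Laplacian quadratic form f^T (D - A) f, expanded entry-wise
quad = sum(deg[i] * f[i] * f[i] for i in range(n)) \
     - sum(adj[i][j] * f[i] * f[j] for i in range(n) for j in range(n))

# Equivalent edge form: 0.5 * sum_ij A_ij * (f_i - f_j)^2
edge = 0.5 * sum(adj[i][j] * (f[i] - f[j]) ** 2
                 for i in range(n) for j in range(n))

print(quad, edge)  # both equal 5.0 = (1-0)^2 + (3-1)^2
```

So the loss is small when connected nodes carry similar values of pred_label, which is what the regulariser appears to capture here.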
@aks1207 aks1207 added the bug Something isn't working label May 17, 2023