
Weird behavior / implementation error? #4

Open
nevakrien opened this issue Mar 11, 2024 · 0 comments

Comments

@nevakrien

nevakrien commented Mar 11, 2024

I took the code for BitLinearOptimized and added one small change so I can run it standalone:

super(BitLinearOptimized, self).__init__(in_features, out_features, bias, dtype=torch.bfloat16)  # just added the right dtype

Running the following

import torch  # BitLinearOptimized is assumed to be defined/imported from this repo

w = BitLinearOptimized(1, 1)
x = torch.ones(1, dtype=torch.bfloat16)
y = w(x)
print(list(w.parameters()))

gives

[Parameter containing:
tensor([0.0703], dtype=torch.bfloat16, requires_grad=True)]

meaning that only the weight is registered as a parameter. Is this intended behavior?
I saw that your training code uses model.parameters(), so it seems like that would be an issue.
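For context, here is a minimal sketch (not the repo's actual BitLinearOptimized) of how this can happen: if a module stashes extra tensors with register_buffer instead of nn.Parameter, they show up in the state_dict but not in .parameters(), so an optimizer built from model.parameters() never updates them. BufferedLinear and quantized_weight below are made-up names for illustration.

```python
import torch
import torch.nn as nn

class BufferedLinear(nn.Linear):
    """Hypothetical layer: keeps a quantized copy of the weight as a buffer."""

    def __init__(self, in_features, out_features, bias=True):
        super().__init__(in_features, out_features, bias, dtype=torch.bfloat16)
        # A buffer is saved/loaded with state_dict, but it is NOT a parameter,
        # so it is invisible to optimizers built from model.parameters().
        self.register_buffer("quantized_weight", torch.sign(self.weight.detach()))

m = BufferedLinear(1, 1, bias=False)
param_names = [name for name, _ in m.named_parameters()]
buffer_names = [name for name, _ in m.named_buffers()]
print(param_names)   # only 'weight' — the buffer is absent
print(buffer_names)  # 'quantized_weight' lives here instead
```

Whether that is what BitLinearOptimized does is exactly the question; if so, any tensor meant to be trained would need to be an nn.Parameter.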
