
sequential adaquant updating only one batch with quantized input? #3

Open
hy-chen opened this issue Apr 27, 2021 · 2 comments
hy-chen commented Apr 27, 2021

cached_input_output[layer][0] = (cached_qinput[layer][0], cached_input_output[layer][0][1])

The [0] indexes into the first batch.

Isn't sequential AdaQuant supposed to update the input cache of all batches with the quantized values?

itayhubara (Owner) commented:

In line 465 I set quantize to True, so all values are quantized. In line 481 I just replace the FP32 record I have with the results from the quantized model.


hy-chen commented Apr 27, 2021

Hi itayhubara,

Thanks for replying. Yes, quantize is set to True. But in the record that you will be using for optimization, only the first batch is updated (the replacement you mentioned).

This is because cached_input_output is organized as [layer1, layer2, ...], and each layer's entry is a list [batch1, batch2, ...]. So in line 481, only the first batch's FP32 record is replaced with the first batch of the quantized values; the remaining batches keep their FP32 inputs.
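To make the indexing concrete, here is a minimal sketch of the nesting described above. The container names (cached_input_output, cached_qinput) come from the thread, but the keys, placeholder strings, and the per-batch loop at the end are hypothetical illustrations, not the repository's actual code (which stores activation tensors):

```python
# Hypothetical stand-in for the caches: one (input, output) tuple per batch.
cached_input_output = {
    "layer1": [("fp32_in_b0", "fp32_out_b0"),
               ("fp32_in_b1", "fp32_out_b1")],
}
cached_qinput = {
    "layer1": ["q_in_b0", "q_in_b1"],  # quantized inputs, one per batch
}

layer = "layer1"

# The update quoted from the issue: only index [0], the first batch, is touched.
cached_input_output[layer][0] = (cached_qinput[layer][0],
                                 cached_input_output[layer][0][1])

assert cached_input_output[layer][0][0] == "q_in_b0"     # batch 0 replaced
assert cached_input_output[layer][1][0] == "fp32_in_b1"  # batch 1 still FP32

# If the intent were to update every batch (as the question suggests),
# the replacement would instead loop over the batch dimension:
for b in range(len(cached_input_output[layer])):
    cached_input_output[layer][b] = (cached_qinput[layer][b],
                                     cached_input_output[layer][b][1])
```

After the loop, every batch's cached input is the quantized one while the FP32 outputs are kept as optimization targets.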

Could you let me know if this is true? Thanks!
