Replies: 2 comments
-
@ofirgo Would you be able to take a look?
-
Hi @amundra0, thank you for raising this issue. RNNs are currently not supported for quantization using MCT. Let us know if we can help you with anything else.
-
Issue Type
Bug
Source
pip (model-compression-toolkit)
MCT Version
2.1.0
OS Platform and Distribution
Ubuntu 22.04.4 LTS
Python version
3.9.19
Describe the issue
I'm trying to perform post-training quantization on an RNN. I provide three inputs: one to the model and one initial state to each GRU layer. However, quantization fails with this exception:
Exception: ActivationQuantizationHolder supports a single quantizer but 2 quantizers were found for node GRU:gru
I tried to trace it deeper by stepping through the traceback files and printing intermediate results. I found that the out_stats_container for each GRU layer, printed in the get_activations_qparams function, is a list of two StatsCollector objects, and it is the same object repeated twice. Do you know why the GRU layers have two output statistics, both of which are identical? This is blocking my code from running, and I don't know what to make of the exception.
Expected behaviour
The code should run without errors and create a quantized model.
Code to reproduce the issue
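A minimal sketch of the setup described above; the architecture, layer sizes, input shapes, and names below are assumptions for illustration, not the original model:

```python
import numpy as np
import tensorflow as tf
import model_compression_toolkit as mct

TIMESTEPS, FEATURES, UNITS = 10, 16, 64  # assumed sizes

# Functional model with one data input and one initial-state input per GRU layer.
x_in = tf.keras.Input(shape=(TIMESTEPS, FEATURES), name="x")
s1_in = tf.keras.Input(shape=(UNITS,), name="gru_state")
s2_in = tf.keras.Input(shape=(UNITS,), name="gru_1_state")

h = tf.keras.layers.GRU(UNITS, return_sequences=True, name="gru")(x_in, initial_state=s1_in)
h = tf.keras.layers.GRU(UNITS, name="gru_1")(h, initial_state=s2_in)
out = tf.keras.layers.Dense(4)(h)
model = tf.keras.Model(inputs=[x_in, s1_in, s2_in], outputs=out)

def representative_data_gen():
    # Yield batches matching the three model inputs.
    for _ in range(10):
        yield [np.random.randn(1, TIMESTEPS, FEATURES).astype(np.float32),
               np.zeros((1, UNITS), dtype=np.float32),
               np.zeros((1, UNITS), dtype=np.float32)]

# PTQ call that raises:
#   Exception: ActivationQuantizationHolder supports a single quantizer
#   but 2 quantizers were found for node GRU:gru
quantized_model, quantization_info = mct.ptq.keras_post_training_quantization(
    model, representative_data_gen)
```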
Log output
No response