
No 'vanilla' RNN layer support? #147

Open
nvssynthesis opened this issue Oct 22, 2024 · 2 comments

Comments

@nvssynthesis

Is it correct that there is no support for 'vanilla' RNN layers, e.g. that of torch.nn.RNN? Is the reason for this something like 'GRU or LSTM is better anyway, just use that'?

@jatinchowdhury18
Owner

That's correct, at the moment RTNeural does not have support for that layer type. The reasoning is more just that I haven't yet had a need for it, and haven't received any requests to implement it (up to now). We probably should implement that layer, especially since it's simpler than the GRU or LSTM layers.
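For reference, the update a vanilla (Elman) RNN layer computes, as in `torch.nn.RNN` with its default `tanh` nonlinearity, can be sketched as below. This is an illustrative NumPy sketch, not RTNeural code; the function names `elman_step` and `elman_forward` are hypothetical.

```python
import numpy as np

def elman_step(x, h_prev, W_ih, W_hh, b_ih, b_hh):
    """One time step of an Elman RNN:
    h_t = tanh(W_ih @ x_t + b_ih + W_hh @ h_{t-1} + b_hh)."""
    return np.tanh(W_ih @ x + b_ih + W_hh @ h_prev + b_hh)

def elman_forward(xs, W_ih, W_hh, b_ih, b_hh):
    """Run a sequence through the layer (zero initial state),
    returning the hidden state at every time step."""
    h = np.zeros(W_hh.shape[0])
    outs = []
    for x in xs:
        h = elman_step(x, h, W_ih, W_hh, b_ih, b_hh)
        outs.append(h)
    return np.stack(outs)
```

Compared with a GRU or LSTM there are no gates and no separate cell state, which is why the layer is simpler to implement.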

I'll probably end up naming the layer something like ElmanRNN, since I think that's maybe a more "specific" name than just RNN. Would you happen to know if TensorFlow has an equivalent layer?

I've added this to my to-do list, but it might be a minute before I get around to implementing it.

@nvssynthesis
Author

Excellent, thanks for clarifying.
I just know that there is tf.keras.layers.RNN, but I'm not entirely sure it's equivalent.


2 participants