  • So, I would clarify that this is a matter of terminology. If you define RNN (short for recurrent neural network) as the set of all neural networks with recurrent connections, then LSTMs can be considered a subset of RNNs. However, if you define RNN as the vanilla RNN, then LSTMs are not a subset of RNNs (because vanilla RNNs do not have gates); it would probably be more the other way around (it may be possible to represent vanilla RNNs as LSTMs, though I don't remember the details now, so I am not really sure if this is possible). Commented Sep 13, 2021 at 16:19
  • @nbro: Yes. I think the looser terminology is very common, because for many problems "basic" RNNs and LSTMs/GRUs are interchangeable, in the sense that they can have the same inputs and outputs, so using an LSTM or GRU is a hyperparameter choice. For some reason that is phrased as using one of (RNN, GRU, LSTM) to solve a problem, as opposed to one of (Elman, GRU, LSTM). I am not sure why. Commented Sep 13, 2021 at 16:47
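To illustrate the interchangeability mentioned in the comments, here is a minimal NumPy sketch (function names and dimensions are my own, for illustration only): an Elman-style vanilla RNN cell and an LSTM cell both map an input of shape `(batch, d_in)` and a hidden state of shape `(batch, d_h)` to a new hidden state of shape `(batch, d_h)`, so from the outside the choice between them is indeed a hyperparameter. The LSTM differs internally by carrying an extra cell state and computing input/forget/output gates.

```python
import numpy as np

rng = np.random.default_rng(0)

def elman_step(x, h, params):
    # Vanilla (Elman) RNN: a single tanh update, no gates.
    Wx, Wh, b = params
    return np.tanh(x @ Wx + h @ Wh + b)

def lstm_step(x, state, params):
    # LSTM: input (i), forget (f), output (o) gates plus a candidate update (g),
    # with a separate cell state c alongside the hidden state h.
    h, c = state
    Wx, Wh, b = params  # weights are 4x wider: one slice per gate
    z = x @ Wx + h @ Wh + b
    i, f, o, g = np.split(z, 4, axis=-1)
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h_new = sigmoid(o) * np.tanh(c_new)
    return h_new, c_new

batch, d_in, d_h = 2, 3, 5
x = rng.normal(size=(batch, d_in))

# Elman parameters: (d_in x d_h), (d_h x d_h), bias
p_rnn = (rng.normal(size=(d_in, d_h)),
         rng.normal(size=(d_h, d_h)),
         np.zeros(d_h))
h_rnn = elman_step(x, np.zeros((batch, d_h)), p_rnn)

# LSTM parameters: 4x wider to cover the four gate slices
p_lstm = (rng.normal(size=(d_in, 4 * d_h)),
          rng.normal(size=(d_h, 4 * d_h)),
          np.zeros(4 * d_h))
h_lstm, c_lstm = lstm_step(x, (np.zeros((batch, d_h)),
                               np.zeros((batch, d_h))), p_lstm)

# Both cells produce a hidden state of the same shape,
# which is why they can be swapped inside the same model.
print(h_rnn.shape, h_lstm.shape)
```

This also hints at the first comment's point about representing a vanilla RNN as an LSTM: if the forget gate were held at 0 and the input and output gates at 1, the LSTM update would reduce to a gateless tanh-style update, though the exact correspondence is not shown here.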