  • 12
    $\begingroup$ This formula is very interesting and helpful. Is there any reference for this formula? That would make it even more helpful. $\endgroup$ Commented Feb 16, 2016 at 14:14
  • 2
    $\begingroup$ @prashanth I combined several assertions and formulas in the NN Design text referenced above. But I don't think it's explicitly called out in the form I show. And my version is a very crude approximation with a lot of simplifying assumptions. So YMMV. $\endgroup$ Commented Feb 16, 2016 at 17:30
  • 1
    $\begingroup$ First, I meant to write "training set" instead of "test set" in my previous comment. Maybe this formula makes sense if we read it as "you need at least that many neurons to learn enough features (the DOF you mentioned) from the dataset". Whether the features of the dataset are representative of the population, and how well the model can generalize, is maybe a different question (but an important one). $\endgroup$ Commented Feb 22, 2016 at 22:07
  • 5
    $\begingroup$ Are you sure this is a good estimate for networks with more than one hidden layer? Isn't it the case that for multiple hidden layers the number of parameters is much greater than $N_h \cdot (N_i + N_o)$? $\endgroup$ Commented May 24, 2017 at 22:37
  • 3
    $\begingroup$ @mateus, perhaps a slightly better rule of thumb for multiple layers is to take $N_h$ (the average number of hidden neurons per layer) as the solution of $N_s = (N_i + N_o) \cdot N_h^{N_{\text{hidden layers}}}$ (see the sketch after this list). But I still wouldn't use this formula; it's only for very basic (toy) problems when you don't plan to implement any other regularization approaches. $\endgroup$ Commented May 25, 2017 at 19:40
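
Taken together, the comments describe two rules of thumb: the single-hidden-layer estimate from the answer above (assumed here to have the form $N_h = N_s / (\alpha \cdot (N_i + N_o))$, with $\alpha$ a scaling factor of roughly 2 to 10) and the multi-layer variant in which $N_h$ solves $N_s = (N_i + N_o) \cdot N_h^{N_{\text{hidden layers}}}$. Below is a minimal sketch under those assumptions, not a definitive sizing method; the function name and defaults are illustrative only.

```python
import math

def rule_of_thumb_hidden_neurons(n_samples, n_inputs, n_outputs,
                                 n_hidden_layers=1, alpha=2.0):
    """Crude upper bound on hidden neurons per layer (assumed forms, see above).

    Single hidden layer: N_h = N_s / (alpha * (N_i + N_o)),
    with alpha typically between 2 and 10 (larger alpha -> smaller network).

    Multiple hidden layers (per the last comment): N_h is the solution of
    N_s = (N_i + N_o) * N_h ** n_hidden_layers, i.e.
    N_h = (N_s / (N_i + N_o)) ** (1 / n_hidden_layers); alpha is ignored here.
    """
    if n_hidden_layers == 1:
        n_h = n_samples / (alpha * (n_inputs + n_outputs))
    else:
        n_h = (n_samples / (n_inputs + n_outputs)) ** (1.0 / n_hidden_layers)
    return max(1, math.floor(n_h))

# Example: 10,000 training samples, 20 input features, 1 output, alpha = 5
print(rule_of_thumb_hidden_neurons(10_000, 20, 1, alpha=5))        # -> 95
print(rule_of_thumb_hidden_neurons(10_000, 20, 1, n_hidden_layers=2))  # -> 21
```

As both commenters stress, this only gives a starting point for simple (toy) problems without other regularization; it is not a substitute for validating the architecture on held-out data.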