  • The method is converging, not diverging, so the learning rate is not the issue. The problem is that different learning rates (with different random initializations) all converge to the same local minimum. Commented Jul 13, 2016 at 9:53
  • Since the cost function is non-convex, I expected the method to converge to different local minima. But it keeps converging to the same one, so I am concerned there may be a problem with my code. Commented Jul 13, 2016 at 9:54
  • If the initial points are the same, or you happen to start in the same local valley, that can happen. Can you increase the learning rate by a good amount and check whether you still end up at the same point? A sufficiently large learning rate should push the iterate out of the local valley. Commented Jul 13, 2016 at 10:02
  • The initial points are not the same; I randomized the initialization. Commented Jul 13, 2016 at 14:54
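The behavior discussed above can be reproduced with a minimal sketch (the cost function and settings here are hypothetical, chosen only for illustration): plain gradient descent on a 1-D non-convex function with two local minima, run from several random initializations and learning rates. If the random starting points happen to fall in the same basin of attraction, every run lands at the same local minimum, which is consistent with what the asker observes.

```python
import random

# Hypothetical non-convex cost with two local minima,
# f(x) = x^4 - 3x^2 + x, used only to illustrate the discussion.
def f(x):
    return x**4 - 3 * x**2 + x

def grad(x):
    # Analytic derivative of f.
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr, steps=5000):
    """Plain gradient descent from x0 with a fixed learning rate."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

random.seed(0)
# Different random initializations and learning rates can still
# converge to the same minimum if all starts lie in one basin.
for lr in (0.01, 0.05):
    for _ in range(3):
        x0 = random.uniform(-2.0, 2.0)
        print(f"lr={lr}, x0={x0:+.2f} -> x*={gradient_descent(x0, lr):+.4f}")
```

Which minimum a run reaches depends only on which side of the local maximum the initialization falls, not on the learning rate (as long as the rate is small enough to converge) — which is why, as suggested in the comments, a much larger learning rate is one way to jump out of a basin.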