My question concerns a very common problem faced when training many data science and AI algorithms, and most notably when backpropagating errors in neural networks: gradient descent getting trapped in a local minimum.
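For concreteness, here is a minimal sketch of the phenomenon the question is about (the 1-D function here is my own illustrative choice, not taken from the original question): plain gradient descent on a non-convex function settling into a shallow local minimum instead of the deeper global one.

```python
# Gradient descent on the non-convex function f(x) = x^4 - 3x^2 + x,
# which has a local minimum near x ≈ 1.13 and a deeper global
# minimum near x ≈ -1.30.

def grad(x):
    # f'(x) = 4x^3 - 6x + 1
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=500):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Starting on the right-hand slope, descent gets trapped in the
# shallower local minimum instead of reaching the global one.
x_final = gradient_descent(x0=2.0)
print(x_final)  # ≈ 1.13, not the global minimum at ≈ -1.30
```

The same happens in high-dimensional loss surfaces during backpropagation, which is exactly the problem the question asked about.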
According to the discussion under the question, it was claimed to be off-topic.
However, in defence of my post, I think it is perfectly on-topic on this site, as it asks about a legitimate problem faced while training neural networks and several other AI algorithms.
I look forward to hearing what the community thinks about this.