Let's say that I have an error function, $E_p$, defined as: $$ E_p = \frac{1}{2} \sum_{j}(\delta_{pj})^2$$
If I take the partial derivative like this:
$$\frac{\partial{E_p}}{\partial{\delta_{pj}}} $$
will I get:
$$ \frac{\partial{E_p}}{\partial{\delta_{pj}}} = \delta_{pj}$$?
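(For what it's worth, this first identity is easy to sanity-check numerically with a finite-difference approximation. A minimal sketch, assuming numpy and some made-up values for $\delta_p$ — not values from any particular problem:)

```python
import numpy as np

# Hypothetical example values for the vector delta_p.
d = np.array([0.3, -1.2, 2.5])

def E(d):
    # E_p = 1/2 * sum_j (delta_pj)^2
    return 0.5 * np.sum(d ** 2)

# Central finite-difference estimate of dE_p / d(delta_pj) for each j.
eps = 1e-6
for j in range(len(d)):
    step = np.zeros_like(d)
    step[j] = eps
    fd = (E(d + step) - E(d - step)) / (2 * eps)
    # Each partial derivative should come out equal to delta_pj itself,
    # because the other terms of the sum don't involve delta_pj.
    print(j, fd, d[j])
```

The key point the check illustrates: when differentiating with respect to one component $\delta_{pj}$, every term of the sum with index $\ne j$ is a constant and drops out.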
And if I take the partial derivative like this:
$$\frac{\partial{E_p}}{\partial{\delta_{p}}} $$ (notice that I've dropped the subscript $j$),
will I get:
$$ \frac{\partial{E_p}}{\partial{\delta_{p}}} = \sum_{j}(\delta_{pj}) $$ ?
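(One way I could probe this second case numerically — again a sketch with hypothetical values and numpy assumed — is to collect all of the component-wise partials into one object and compare it against the scalar sum $\sum_j \delta_{pj}$:)

```python
import numpy as np

d = np.array([0.3, -1.2, 2.5])  # hypothetical values for delta_p

def E(d):
    # E_p = 1/2 * sum_j (delta_pj)^2
    return 0.5 * np.sum(d ** 2)

# The derivative of E_p with respect to the whole vector delta_p
# is the collection of partials dE_p/d(delta_pj), one per component j,
# estimated here by central finite differences.
eps = 1e-6
grad = np.empty_like(d)
for j in range(len(d)):
    step = np.zeros_like(d)
    step[j] = eps
    grad[j] = (E(d + step) - E(d - step)) / (2 * eps)

print(grad)       # one entry per j
print(np.sum(d))  # the single scalar sum from the question
```

Comparing the two printed results (a vector of partials versus one scalar) is what I'm trying to reason about.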
I'm trying to understand the effect that taking a partial derivative has on a summation involving a vector, especially when the derivative uses a different index than the sum. I've had trouble finding information about this subject in particular; maybe I just don't know what to search for. If someone has a link to an article that will help me understand this, I will greatly appreciate it.