Consider the negative binomial distribution in the form where $X$ counts the number of trials needed to obtain $r$ successes, each trial succeeding with probability $p$, so that
$$P(X = n) = {n-1\choose r-1} (1-p)^{n-r}p^r, \qquad n \geq r.$$
I have been unable to find a derivation anywhere online of the variance formula $V(X) = \frac{r(1-p)}{p^2}$ that does not make use of the moment generating function.
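(For concreteness, here is a quick numerical check that this mass function is properly normalized; the parameter values and truncation point below are arbitrary choices for illustration.)

```python
from math import comb

# Arbitrary illustrative parameters and a truncation point for the infinite sum
r, p, N = 3, 0.4, 500

# P(X = n) = C(n-1, r-1) * (1-p)^(n-r) * p^r for n >= r
total = sum(comb(n - 1, r - 1) * (1 - p) ** (n - r) * p ** r
            for n in range(r, N + 1))
print(total)  # ~1.0: the mass function is normalized
```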
I have managed to compute the mean without the moment generating function, as follows:
\begin{align*} \mu = \sum_{n\geq r} n{n-1\choose r-1} (1-p)^{n-r}p^r &= \sum_{n\geq r} \frac{n(n-1)!}{(r-1)!(n-r)!}(1-p)^{n-r}p^r \\ &= \frac{r}{p} \sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} \end{align*} Having factored out the claimed mean of $r/p$, it remains to show that $\sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} = 1$, which follows by reindexing (in both $r$ and $n$) and recognizing the result as the sum of the probability mass function of another negative binomial distribution. Indeed, letting $k = r+1$ and then $m = n+1$, we find
\begin{align*} \sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} &= \sum_{n\geq k-1}\frac{n!}{(k-1)!(n-k+1)!}(1-p)^{n-k+1}p^k\\ &= \sum_{m\geq k}\frac{(m-1)!}{(k-1)!(m-k)!}(1-p)^{m-k}p^k\\ &= \sum_{m\geq k}{m-1\choose k-1}(1-p)^{m-k}p^k = 1 \end{align*}
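As a numerical cross-check of this computation (again with arbitrary parameters and a truncated sum), the mean computed directly from the mass function agrees with $r/p$:

```python
from math import comb

r, p, N = 3, 0.4, 500  # arbitrary illustrative parameters, truncated sum

mean = sum(n * comb(n - 1, r - 1) * (1 - p) ** (n - r) * p ** r
           for n in range(r, N + 1))
print(mean, r / p)  # both ~7.5
```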
Does anyone know of a way to demonstrate that $\sigma^2 = V(X) = \frac{r(1-p)}{p^2}$ in the same fashion, i.e., without resorting to the moment generating function?
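For what it is worth, the target formula does check out numerically; the snippet below is only a sanity check with arbitrary parameters, not the algebraic derivation I am after:

```python
from math import comb

r, p, N = 3, 0.4, 500  # arbitrary illustrative parameters, truncated sum

pmf = {n: comb(n - 1, r - 1) * (1 - p) ** (n - r) * p ** r
       for n in range(r, N + 1)}
mean = sum(n * q for n, q in pmf.items())
var = sum((n - mean) ** 2 * q for n, q in pmf.items())
print(var, r * (1 - p) / p ** 2)  # both ~11.25
```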