If you ask someone to evaluate $\left(-\frac{1}{2}\right)!$, they would probably answer $\Gamma\left(\frac{1}{2}\right)=\sqrt{\pi}$. Similarly, $\left(-\frac{3}{4}\right)!$ would be given as $\Gamma\left(\frac{1}{4}\right)=\sqrt[4]{\pi}\sqrt{\sqrt{2}-1}\frac{\zeta\left(\frac{3}{4}\right)}{\zeta\left(\frac{1}{4}\right)}$. Each instance uses the principal root.
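(For reference, that $\zeta$-quotient form can be checked from the standard functional equation of the Riemann zeta function,
$$\zeta(s)=2^{s}\pi^{s-1}\sin\left(\frac{\pi s}{2}\right)\Gamma(1-s)\zeta(1-s).$$
Taking $s=\frac{3}{4}$ and solving for $\Gamma\left(\frac{1}{4}\right)$ gives
$$\Gamma\left(\frac{1}{4}\right)=\frac{\pi^{1/4}}{2^{3/4}\sin\left(\frac{3\pi}{8}\right)}\cdot\frac{\zeta\left(\frac{3}{4}\right)}{\zeta\left(\frac{1}{4}\right)},$$
and since $\sin\left(\frac{3\pi}{8}\right)=\frac{1}{2}\sqrt{2+\sqrt{2}}$, the prefactor simplifies via $\frac{2^{1/4}}{\sqrt{2+\sqrt{2}}}=\sqrt{\sqrt{2}-1}$, which one verifies by squaring both sides and rationalising.)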
It's generally accepted that the gamma function is "the" continuation of the factorial. But why? Analytic continuation isn't unique when the known values sit on a discrete set: the identity theorem requires the set to have a limit point, and $\mathbb{Z}$ has none. For instance, if $f:\mathbb{Z}\to\mathbb{R}$ is given by $f(x)=0$, then two different analytic functions that agree with $f$ on $\mathbb{Z}$ are $g:\mathbb{R}\to\mathbb{R},\;g(x)=0$ and $h:\mathbb{R}\to\mathbb{R},\;h(x)=\sin(\pi x)$.
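A quick numerical sanity check of this non-uniqueness (a minimal sketch; the function names `g` and `h` are just the ones from the example above):

```python
import math

def g(x):
    # the zero function on R
    return 0.0

def h(x):
    # sin(pi * x): vanishes exactly on the integers
    return math.sin(math.pi * x)

# g and h agree on Z (up to floating-point error)...
for n in range(-5, 6):
    assert abs(h(n) - g(n)) < 1e-12

# ...but differ off the integers:
print(h(0.5) - g(0.5))  # 1.0
```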
So if we take $\tilde{\Gamma}(x)=\Gamma(x)\operatorname{cis}(2\pi x)$, where $\operatorname{cis}(t)=\cos(t)+i\sin(t)$, then $\tilde{\Gamma}$ agrees (with the usual offset) with the factorial at all the non-negative integers and is analytic, yet it would never be considered a substitute for the gamma function: nobody would write $\left(-\frac{3}{4}\right)!=\tilde{\Gamma}\left(\frac{1}{4}\right)$.
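To see concretely that $\tilde{\Gamma}$ matches the factorial at the non-negative integers but goes complex in between, here is a small check (a sketch using only the standard library; `tilde_gamma` is just a name for the $\tilde{\Gamma}$ defined above):

```python
import cmath
import math

def cis(t):
    # cis(t) = cos(t) + i*sin(t) = e^{it}
    return cmath.exp(1j * t)

def tilde_gamma(x):
    # Gamma(x) * cis(2*pi*x): equals Gamma(x) at the positive integers,
    # since cis(2*pi*n) = 1 there, but picks up an imaginary part elsewhere.
    return math.gamma(x) * cis(2 * math.pi * x)

# tilde_gamma(n + 1) should equal n! (up to floating-point error)
for n in range(5):
    assert abs(tilde_gamma(n + 1) - math.factorial(n)) < 1e-9

# Off the integers, tilde_gamma is genuinely complex:
print(tilde_gamma(0.25))  # ~ i * Gamma(1/4) ~ 3.6256i
```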
Is it because the gamma function is the only analytic function that agrees with the factorial and is also real-valued along the real line (apart from its poles)? $\tilde{\Gamma}$ fails to be real on all of $\mathbb{R}$, but that's just one example. Is there a way to prove this, or disprove it? Or did the integral definition simply get popularised, so the choice is a byproduct of tradition?