Given two points on the unit sphere in spherical coordinates ($\theta$ is the longitude/azimuth, $\phi$ is the polar angle measured from the north pole, in $[0,\pi]$), I want to express their angular difference in spherical coordinates.
In detail: take two points $p_1, p_2$ on the prime meridian ($\theta_1 = \theta_2 = 0$), the first below the second, i.e. $\phi_1 > \phi_2$. Making $p_1$ the new north pole while keeping the prime meridian at $\theta = 0°$ should give $p_2$ a new polar angle $\phi' = |\phi_1 - \phi_2|$, which seems obvious; but the new longitude needs to become $\theta' = 180°$, although $\theta_1 = \theta_2 = 0$ (both points lie on the meridian).
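To make this concrete, here is a minimal numerical sketch of what I mean, assuming the convention $x = \sin\phi\cos\theta$, $y = \sin\phi\sin\theta$, $z = \cos\phi$ and using a rotation about the $y$-axis to carry $p_1$ to the pole (the values $\phi_1 = 70°$, $\phi_2 = 30°$ are arbitrary):

```python
import numpy as np

def sph_to_cart(theta, phi):
    # theta = longitude (azimuth), phi = polar angle from the north pole
    return np.array([np.sin(phi) * np.cos(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(phi)])

def cart_to_sph(v):
    theta = np.arctan2(v[1], v[0]) % (2 * np.pi)   # longitude in [0, 2*pi)
    phi = np.arccos(np.clip(v[2], -1.0, 1.0))      # polar angle in [0, pi]
    return theta, phi

# p1 below p2 on the theta = 0 meridian (phi1 > phi2)
phi1, phi2 = np.radians(70.0), np.radians(30.0)
p2 = sph_to_cart(0.0, phi2)

# rotation about the y-axis by -phi1 carries p1 = (sin phi1, 0, cos phi1) to (0, 0, 1)
a = -phi1
Ry = np.array([[ np.cos(a), 0.0, np.sin(a)],
               [ 0.0,       1.0, 0.0      ],
               [-np.sin(a), 0.0, np.cos(a)]])

theta_new, phi_new = cart_to_sph(Ry @ p2)
print(np.degrees(theta_new), np.degrees(phi_new))   # -> 180.0 and ~40.0
```

This reproduces exactly the behaviour described above: $\phi' = 40° = |\phi_1 - \phi_2|$, while the longitude jumps from $0°$ to $180°$.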
If the answer is to convert both points to Euclidean coordinates $(x, y, z)$, subtract there, and then convert back to spherical coordinates, I would not be surprised. But I would be very glad if you could explain why such a difference, as in $\mathbb{R}^{3}$, is not possible directly in spherical coordinates. What property do they lack?
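For reference, this is the naive $\mathbb{R}^{3}$ route I had in mind, as a sketch under the same conversion convention as above (the point values are again arbitrary). Already here I notice that the componentwise difference is not a unit vector, so it does not directly name a point on the sphere:

```python
import numpy as np

def sph_to_cart(theta, phi):
    # x = sin(phi)cos(theta), y = sin(phi)sin(theta), z = cos(phi)
    return np.array([np.sin(phi) * np.cos(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(phi)])

p1 = sph_to_cart(0.0, np.radians(70.0))
p2 = sph_to_cart(0.0, np.radians(30.0))

d = p2 - p1                      # componentwise difference in R^3
print(np.linalg.norm(d))         # ~0.684 = 2*sin(20 deg): a chord length, not 1
```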
Edit: these images should clarify my intentions.

The left image shows the angular difference I have in mind. I understand, however, that a coordinate-wise difference is not sufficient; for an example, look at the right image.
$A$ becomes the new north pole. Thereby $B$'s polar angle changes to $|\phi(B) - \phi(A)|$, but additionally the azimuthal angle of the rotated point $B'$ becomes $\theta(B') = 180°$. Now imagine another extreme case where $B$ starts at azimuth $\theta(B) = 90°$ on the equator: it would keep the same $90°$ azimuth after making $A$ the new north pole.
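To check that second extreme numerically, again only a sketch under the same convention as above, with $\phi(A) = 70°$ chosen arbitrarily and $B$ placed on the equator at azimuth $90°$, which happens to be the axis of the rotation that carries $A$ to the pole:

```python
import numpy as np

def cart_to_sph_deg(v):
    theta = np.degrees(np.arctan2(v[1], v[0]) % (2 * np.pi))   # longitude
    phi = np.degrees(np.arccos(np.clip(v[2], -1.0, 1.0)))      # polar angle
    return theta, phi

phiA = np.radians(70.0)            # A on the theta = 0 meridian
a = -phiA                          # rotation about y carrying A to the north pole
Ry = np.array([[ np.cos(a), 0.0, np.sin(a)],
               [ 0.0,       1.0, 0.0      ],
               [-np.sin(a), 0.0, np.cos(a)]])

B = np.array([0.0, 1.0, 0.0])      # theta = 90°, phi = 90°: lies on the rotation axis
print(cart_to_sph_deg(Ry @ B))     # -> (90.0, 90.0): azimuth unchanged
```

So in one configuration the azimuth jumps by $180°$ and in the other it does not change at all, which is why I cannot see a simple coordinate-wise rule for the difference.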
