Yes, of course. This is fine and perfectly acceptable: it is standard to see algorithms whose running time depends on two parameters.
For instance, you will often see the running time of depth-first search expressed as $O(n+m)$, where $n$ is the number of vertices and $m$ is the number of edges in the graph. This is perfectly valid. It means that there exist a constant $c$ and thresholds $n_0, m_0$ such that the running time of the algorithm is at most $c \cdot (n+m)$ for all $n > n_0$ and $m > m_0$. In other words, if the exact running time is $f(n,m)$, we say that $f(n,m) = O(n+m)$ if there exist $c, n_0, m_0$ such that $n > n_0$ and $m > m_0$ imply $f(n,m) \le c \cdot (n+m)$.
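To make this concrete, here is a minimal sketch (my own illustration, not part of the definition above) of an iterative DFS over an adjacency list; the representation and the name `dfs` are assumptions for the example. The initialization does $O(n)$ work and the traversal does constant work per adjacency-list entry, so the total is $O(n+m)$:

```python
# A minimal sketch (illustration only): iterative DFS over an adjacency
# list, where adj[v] lists the neighbors of vertex v.
# Here n = len(adj) is the number of vertices and m is the number of edges.

def dfs(adj, start):
    n = len(adj)
    visited = [False] * n       # O(n): one entry per vertex
    order = []                  # visit order, for illustration
    stack = [start]
    while stack:
        v = stack.pop()
        if visited[v]:
            continue
        visited[v] = True
        order.append(v)         # each vertex is appended at most once: O(n) total
        for w in adj[v]:        # each adjacency-list entry is scanned at most once
            if not visited[w]:  # over the whole run: O(m) total
                stack.append(w)
    return order

# Tiny usage example: a path on 4 vertices (n = 4, m = 3).
print(dfs([[1], [0, 2], [1, 3], [2]], 0))   # -> [0, 1, 2, 3]
```

Each of $n$ and $m$ accounts for the cost of a different part of the procedure, which is exactly why a two-parameter bound is the natural thing to state.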
Yes, it is perfectly appropriate and acceptable to say that the first stage takes $O(n)$ time and the second stage takes $O(m)$ time.
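For instance, here is a hypothetical two-stage routine (the edge-list representation and the name `degree_counts` are my assumptions, just to make the stage-by-stage accounting concrete):

```python
# Minimal sketch: the graph is given as n and a list `edges` of (u, v)
# pairs, with m = len(edges).

def degree_counts(n, edges):
    # Stage 1: O(n) -- one unit of work per vertex.
    deg = [0] * n
    # Stage 2: O(m) -- one unit of work per edge.
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg                  # total: O(n) + O(m) = O(n + m)

print(degree_counts(4, [(0, 1), (1, 2), (2, 3)]))   # -> [1, 2, 2, 1]
```

Stating "stage 1 takes $O(n)$ time and stage 2 takes $O(m)$ time" is fine, and the two bounds sum to the overall $O(n+m)$ bound.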
Important: make sure you define what $n$ and $m$ are. You can't say "this is an $O(n)$-time algorithm" without specifying what $n$ is. If $n$ isn't defined in the problem statement, you need to define it yourself. For instance, in graph algorithms we typically define $n$ = the number of vertices and $m$ = the number of edges.
As for whether you can call them $O(n)$ time: no, of course not, unless you somehow know that $m = O(n)$. That said, if you know that $m = O(n)$, then it follows that $m+n = O(n)$, so an $O(m+n)$-time algorithm is also an $O(n)$-time algorithm. But if there is no guarantee that $m = O(n)$, then you cannot call it an $O(n)$-time algorithm.
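To spell out that step, using the definition above (with $c'$ and $n_1$ as names I introduce for the constants witnessing $m = O(n)$): for all sufficiently large $n$ and $m$ we have $m \le c' n$, hence
$$f(n,m) \;\le\; c \cdot (n + m) \;\le\; c \cdot (n + c' n) \;=\; c\,(1 + c')\,n,$$
which is exactly the statement $f(n,m) = O(n)$. This happens, for example, when the graph is guaranteed to be a tree ($m = n - 1$) or planar ($m \le 3n - 6$); for an arbitrary graph, $m$ can be as large as $\binom{n}{2}$, so no such bound holds in general.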
This is basic stuff; you'll find it in any algorithms textbook.