I assume you are talking about an ADC that has a sampling capacitor (e.g. a successive-approximation ADC, which is the most common type).
If you're talking about an ADC with a built-in multiplexer, the sampling time is very important, because it allows the voltage on the ADC's sampling capacitor to settle after switching from the previous channel. (More about this issue in a blog entry I wrote.)
If you're talking about an ADC with a single channel, the sampling time is still important, even though it's sampling only one signal, because the voltage on the ADC's sampling capacitor needs to catch up to that signal when the capacitor is reconnected to the input and charged from its previous voltage to the new voltage. If your input signal has low bandwidth, this isn't such a big deal, but if it changes relatively fast, you need to make sure the sampling capacitor catches up to it by allowing sufficient sampling time.
A more detailed example for a single-signal ADC:
Compare your signal frequencies to the sampling frequency. Let's say it's a 10kHz sine wave sampled at 100kHz. That's a 36 degree phase shift between samples. Worst-case is when your signal is going through zero (just as the day length changes fastest at the equinoxes rather than at the solstices): sin(+18 degrees) - sin(-18 degrees) = 0.618. So if you have a 1V amplitude sine wave (e.g. -1V to +1V, or 0 to 2V if offset), the difference between consecutive samples could be as high as 0.618V.
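If you want to sanity-check that arithmetic for your own signal and sample rates, here's a minimal Python sketch; the frequencies and amplitude are just the example values above, not anything specific to your ADC:

```python
# Worst-case sample-to-sample voltage step for a sampled sine wave.
# Values match the example above (10 kHz sine, 100 kHz sampling, 1 V amplitude).
import math

f_signal = 10e3      # signal frequency, Hz
f_sample = 100e3     # sampling frequency, Hz
amplitude = 1.0      # sine amplitude, V

phase_step_deg = 360.0 * f_signal / f_sample        # 36 degrees per sample
half_step_rad = math.radians(phase_step_deg / 2.0)

# Largest change is around the zero crossing:
# sin(+18 deg) - sin(-18 deg) = 2*sin(18 deg)
worst_case_delta = 2.0 * amplitude * math.sin(half_step_rad)

print(f"phase step per sample: {phase_step_deg:.1f} deg")
print(f"worst-case step between samples: {worst_case_delta:.3f} V")   # ~0.618 V
```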
There's a nonzero resistance between the input pin and the ADC sampling capacitor -- at a minimum, it's the sampling switch resistance, but it can also include external resistance if you have any; that's why you should almost always place at least some local storage capacitor at the input of any sampling ADC.

Compute that RC time constant and compare it to the sampling time to look at the transient voltage decay after the sampling capacitor is reconnected to the input voltage. Suppose your sampling time is 500nsec and the RC time constant in question is 125nsec, that is, your sampling time is 4 time constants. 0.618V * e^(-T/tau) = 0.618V * e^(-4) ≈ 11mV --> the ADC sampling capacitor voltage is still 11mV off from its final value. In this case I'd say the sampling time is too short.

In general you have to look at the ADC bit count and wait something like 8 or 10 or 12 time constants: you want any transient voltage to decay down to less than 1/2 LSB of the ADC.
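Here's the same settling calculation in Python, a rough sketch rather than a design tool; the 12-bit resolution and 2V full-scale range are assumptions I picked to illustrate the 1/2 LSB criterion, so substitute your own ADC's numbers:

```python
# Settling check: residual error on the sampling capacitor after the
# acquisition time, and how many RC time constants are needed to get
# within 1/2 LSB. Step size and timings are from the example above;
# n_bits and full_scale are assumed values for illustration.
import math

delta_v  = 0.618     # worst-case voltage step to settle, V (from above)
t_sample = 500e-9    # sampling (acquisition) time, s
tau      = 125e-9    # RC time constant: (switch R + source R) * C_sample, s

residual = delta_v * math.exp(-t_sample / tau)
print(f"residual after sampling: {residual * 1e3:.1f} mV")   # ~11 mV

n_bits     = 12      # assumed ADC resolution
full_scale = 2.0     # assumed ADC full-scale range, V
half_lsb   = full_scale / 2**n_bits / 2.0

# Solve delta_v * e^(-n) < half_lsb for the number of time constants n
n_tau_needed = math.log(delta_v / half_lsb)
print(f"1/2 LSB = {half_lsb * 1e3:.3f} mV")
print(f"time constants needed: {n_tau_needed:.1f}")          # ~8 for these numbers
```

For these particular numbers it works out to roughly 8 time constants, which is where the "8 or 10 or 12" rule of thumb comes from as the bit count goes up.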
Hope that helps....