If you look at the data sheet of an MCU, or any chip that provides IO lines, you'll usually find somewhere in that data sheet a limit value for the current into an IO pin. If the IO is configured as an input, that limit might be specified as 1 mA. If it's an output, then something in excess of 10 mA might be specified.
But ask yourself the question: why is the chip supplier telling you that when you use a port as an input you should ensure the current into it is less than (say) 1 mA? After all, you'd expect that when applying a voltage between 0 V and Vcc to the pin, only a few nA of leakage current will flow. That would be true under normal circumstances, of course.
The 1 mA limit (if that is what the specified limit is) tells you how much current you can push into, or pull out of, that pin should the input voltage rise a little above Vcc or fall a little below 0 volts. In that situation the current flows through the pin's internal protection (clamp) diodes to the supply rails, and the limit is what those diodes can tolerate.
So, if you protect an input line with a 10 kohm series resistor and the pin is rated to withstand 1 mA, you can, in effect, push 1 mA through that 10 kohm resistor without sweating about the device failing. Since 1 mA through 10 kohm drops 10 volts, the external voltage could be around 10 volts higher than Vcc and the pin would still be protected. It's a similar story for going below 0 volts.
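That arithmetic is easy to run both ways: given a resistor, find the fault voltage the pin survives; or given an expected fault voltage, find the smallest resistor you need. A minimal sketch, assuming illustrative values (Vcc = 3.3 V, a 1 mA clamp rating, a 10 kohm resistor) rather than figures from any particular datasheet:

```python
# Series-resistor protection arithmetic for an MCU input pin.
# All values below are illustrative assumptions, not datasheet figures.

def max_safe_input_voltage(vcc, r_series_ohms, i_clamp_a):
    """Highest external fault voltage the pin survives: the series
    resistor drops the excess, keeping clamp current at i_clamp_a."""
    return vcc + r_series_ohms * i_clamp_a

def min_series_resistor(v_fault, vcc, i_clamp_a):
    """Smallest series resistor that keeps the clamp-diode current
    below i_clamp_a for a given fault voltage above Vcc."""
    return (v_fault - vcc) / i_clamp_a

vcc = 3.3
print(max_safe_input_voltage(vcc, 10_000, 1e-3))  # -> 13.3 (about 10 V above Vcc)
print(min_series_resistor(24.0, vcc, 1e-3))       # ~20.7 kohm for a 24 V fault
```

The same functions work for the below-0 V case by symmetry: a fault of -10 V through 10 kohm also limits the clamp current to 1 mA.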
As for the case where "the input pin is connected through a short wire to a sensor or a pot": do an analysis of how that short wire can be influenced by external surges and EMI and, if you conclude that you are safe without a resistor, then don't place one. However, my advice is to place a resistor and play safe. You can always fit a zero ohm link and, if you screw up on something and need to bodge a solution, those extra pads could be a lifesaver in terms of modifications.