I am giving a presentation on Data Science, and I want to talk about the idea that data that is not "big" enough is a major barrier for Machine Learning. Looking online, I find concepts like overfitting and underfitting, but I am looking for something different: a term for data that, even if fitted optimally, would still not yield a good model of the underlying system.
Is there a good term to use for this?
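To make the distinction from overfitting concrete, here is a minimal sketch of what I mean (the quadratic "system" and the narrow sampling range are an invented example, not from any particular dataset): the model below is simple and fitted optimally on its sample, so neither overfitting nor underfitting applies, yet it still misdescribes the system because the data never covered the relevant input space.

```python
import numpy as np

rng = np.random.default_rng(0)

# True system: a quadratic relationship (hypothetical example).
def system(x):
    return x ** 2

# A small sample drawn only from a narrow region of the input space.
x_sample = rng.uniform(0.0, 1.0, size=20)
y_sample = system(x_sample)

# Optimal least-squares linear fit to that sample. The model is simple
# and the fit is as good as it can be on this data, so this is not
# overfitting.
slope, intercept = np.polyfit(x_sample, y_sample, deg=1)

# In-sample error is small...
in_sample_err = np.mean((slope * x_sample + intercept - y_sample) ** 2)

# ...but the model is badly wrong over the system's wider input range,
# which the data simply never observed.
x_wide = np.linspace(0.0, 10.0, 100)
out_of_range_err = np.mean((slope * x_wide + intercept - system(x_wide)) ** 2)

print(in_sample_err, out_of_range_err)
```

The point for the talk: no amount of better fitting fixes this, because the limitation lives in the data itself, not in the training procedure.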