Overfitting is when a machine learning model memorizes the training set instead of learning the underlying pattern, so it performs well on data it has seen but fails to generalize to unseen data (the test set). Richard Feynman said there's no need to memorize math or physics; he taught straight from intuition. I think the same holds in life: if you build intuition for the fundamentals, you'll find it transfers across many, many different domains.
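To make the memorization-vs-generalization point concrete, here's a minimal toy sketch (all names and data are hypothetical, invented for illustration): a "model" that learns purely by memorizing training pairs recalls the training set perfectly but has nothing to say about unseen inputs.

```python
def target(x):
    """The true underlying rule the model should ideally learn."""
    return 2 * x + 1

train_x = [0, 1, 2, 3]
# The memorizer "trains" by storing every (input, output) pair verbatim.
train_set = {x: target(x) for x in train_x}

def memorizer(x, default=0):
    # Pure lookup: no intuition for the underlying rule,
    # so any unseen input falls back to a meaningless default.
    return train_set.get(x, default)

# Perfect recall on the training set...
train_error = sum(abs(memorizer(x) - target(x)) for x in train_x)

# ...but large error on unseen data, because nothing was actually learned.
test_x = [10, 20, 30]
test_error = sum(abs(memorizer(x) - target(x)) for x in test_x)

print(train_error)  # 0
print(test_error)   # 123
```

The memorizer is an extreme case, but real overfit models behave the same way in degree: training error near zero, unseen-data error large, because they captured the examples rather than the rule.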