Any model that’s based on data assumes that the underlying process that generated that data will remain the same. In other words: nothing changes.
If you take a snapshot of today’s population weight/height data and compute the correlation coefficient, that estimate only remains valid as long as there’s no sudden gene mutation (one affecting the whole population) that distorts the relationship.
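A toy sketch of that failure mode (all numbers here are made up for illustration): we estimate a correlation from a snapshot, then the generating process changes, and the snapshot estimate stops describing the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot: height (cm) drives weight (kg) with some noise.
height = rng.normal(170, 10, 5000)
weight = 0.9 * height - 90 + rng.normal(0, 8, 5000)
r_before = np.corrcoef(height, weight)[0, 1]

# Hypothetical shift: the process linking the two variables changes
# (the "gene mutation" scenario) -- weight no longer tracks height.
weight_shifted = rng.normal(63, 12, 5000)
r_after = np.corrcoef(height, weight_shifted)[0, 1]

print(round(r_before, 2))  # strong positive correlation in the snapshot
print(round(r_after, 2))   # near zero once the process has changed
```

The correlation coefficient itself was computed correctly both times; it’s the *assumption* that the snapshot still describes the world that broke.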
But what are the chances of that, right? It may be very improbable at the moment - but our world is transforming at breakneck speed. The top-paying job you aim for at 18 may be marginalized by technological advancement by the time you finish your PhD.
But we can top that!
Any model that learns from data that changes over time assumes that the underlying process producing the change will itself stay fixed - and that the timeframe of the observable effect remains the same. That’s another dimension of stationarity.
If we model inflation’s effect on stocks, that effect isn’t instantaneous, and the time horizon we pick matters, a lot. Nothing may happen within a month, while 100 years will wash out all the observable differences we’re interested in. So we look at, say, a year.
But what if the effect appears faster than expected (the COVID market recovery), or lags by years because of completely exogenous factors, like a war? If you relied on a fixed-time-horizon model (hard to avoid!), your forecast is probably wrong in these cases.
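Here’s a minimal sketch of the horizon problem, with an entirely synthetic series (the lag length and coefficients are invented for illustration, not estimated from real markets): the “driver” affects the “response” with a 12-step delay, so a model locked to a 1-step horizon sees almost nothing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy series: a driver (think "inflation shocks") and a response
# ("returns") that reacts with a fixed 12-step lag.
n = 400
lag = 12
driver = rng.normal(0, 1, n)
response = np.zeros(n)
response[lag:] = -0.8 * driver[:-lag] + rng.normal(0, 0.5, n - lag)

def corr_at_horizon(h):
    # Correlate the driver today with the response h steps later.
    return np.corrcoef(driver[:-h], response[h:])[0, 1]

# Measured at the wrong horizon, the effect is invisible;
# near the true lag, it's strong.
print(round(corr_at_horizon(1), 2))   # roughly zero
print(round(corr_at_horizon(12), 2))  # strongly negative
```

If the true lag drifted from 12 to, say, 4 (an effect arriving faster than expected) or to 30 (an exogenous delay), a model hard-wired to `corr_at_horizon(12)` would keep reporting a relationship that no longer exists at that horizon.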