I’ve long been dubious about the hand-wringing around retirement savings in America. There is certainly room for improvement, but I’ve never been convinced it is quite as dire as op-ed columns and media reports make it out to be. In particular, median personal income in the US is $33,706 — by definition, half of Americans make less than that. How much “retirement savings” do they actually need once you factor in Social Security and home equity? (Also, where are the tens of millions of retirees living in extremely dire circumstances caused by this lack of planning?)
A new NBER paper does a deeper analysis and comes to a similar conclusion.
In “Can Low Retirement Savings Be Rationalized?”, Jason S. Scott, John B. Shoven, Sita N. Slavov, and John G. Watson find that
for many, perhaps most, people in the bottom half of the lifetime earnings distribution, it is optimal to spend out their retirement wealth well before death and to live on Social Security alone after that. Very low earners may find it optimal to not engage in retirement saving at all.
How do they reach that conclusion? Their argument is that the nearly universal assumption of constant lifetime consumption is wrong. It isn’t that the lifecycle models are wrong, just that constant consumption is only optimal under “very specific conditions” that they don’t believe hold in the real world. Instead, they argue, declining consumption is optimal.
Constant consumption is only optimal if one’s subjective rate of time preference is perfectly offset by earned interest. But there are many reasons to believe that isn’t true.
Beyond a pure rate of time preference reflecting impatience [ed: that is, people have a discount rate higher than the prevailing interest rate], future utility may be discounted for mortality. Survey evidence further suggests individuals attach a lower value to consumption in bad health states, particularly in states with mental impairments like dementia.
And on the other side of the equation — the asset return side — the collapse of interest rates over the past decades also should result in bringing consumption forward. (That’s the whole point of why central banks cut rates, after all. To spur consumption.)
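To make the intuition concrete, here is an illustrative sketch — not the paper’s model — using the standard CRRA Euler equation, under which optimal consumption grows each year by the factor ((1 + r) / (1 + ρ))^(1/γ). The parameter values (3% rates, γ = 2, a $40,000 starting consumption) are hypothetical, chosen only to show that consumption is flat when r = ρ and declines when ρ > r:

```python
def consumption_path(c0, r, rho, gamma, years):
    """Optimal yearly consumption under CRRA utility (standard Euler equation):
    consumption grows by ((1 + r) / (1 + rho))**(1 / gamma) each year."""
    growth = ((1 + r) / (1 + rho)) ** (1 / gamma)
    return [c0 * growth**t for t in range(years)]

# When the interest rate exactly offsets time preference (r == rho),
# consumption is flat -- the textbook "constant consumption" case.
flat = consumption_path(40_000, r=0.03, rho=0.03, gamma=2, years=30)

# When the saver is "impatient" (rho > r), consumption declines steadily,
# so more of it is shifted toward the early years of retirement.
declining = consumption_path(40_000, r=0.00, rho=0.03, gamma=2, years=30)
```

With rates collapsed toward zero, even a modest subjective discount rate puts you firmly in the declining-consumption regime.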
That’s the high-level view to try to give an intuitive understanding. But what are the actual concrete results?
First let’s look at a “moderate income earner”, someone who makes 100% of the average wage index and see what the optimal portfolio size at retirement is. Remember, standard retirement planning generally suggests a portfolio of around 20x your pre-retirement income. (25x retirement spending; but retirement spending is often stylized as 80% of pre-retirement spending; hence 25 * .8 = 20.)
- If the prevailing real interest rate is 0%, then the optimal portfolio is only 1.45x.
- But if the subjective time preference is higher than the real interest rate (i.e. people are “impatient”) then it drops to 0.97x for a time preference of 3% or 0.52x for a time preference of 5%.
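A quick back-of-the-envelope simulation shows what a portfolio that small actually does in retirement. This is a hedged sketch with hypothetical numbers (an $18,000/year Social Security benefit, a portfolio of roughly 1x a $33,700 income, consumption starting at $25,000 and declining 3% per year), not the paper’s calibration — but it reproduces the qualitative result: wealth is spent out in a handful of years, after which the retiree lives on Social Security alone:

```python
def years_until_spent(wealth, social_security, c0, decline, r=0.0):
    """Years a portfolio lasts when desired consumption starts at c0 and
    declines by `decline` per year; spending above the Social Security
    benefit is drawn from wealth. Returns the year wealth runs out (or the
    year desired spending falls to the Social Security floor, whichever
    comes first)."""
    c = c0
    year = 0
    while wealth > 0:
        draw = max(c - social_security, 0.0)
        if draw == 0:          # desired spending has hit the SS floor
            return year        # remaining wealth is never needed
        wealth = (wealth - draw) * (1 + r)
        c *= (1 - decline)
        year += 1
    return year

# Hypothetical retiree: spends down a ~1x portfolio early in retirement,
# then lives on Social Security alone.
print(years_until_spent(33_700, 18_000, 25_000, 0.03))
```

Under these assumptions the portfolio is exhausted within the first decade of retirement — which, per the paper, is not a planning failure but the optimal path.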
The authors point out that their model does not (yet) incorporate health discounting (i.e. people expecting their health to decline as they age, placing less value on future money), which would further strengthen the case for declining consumption and lower optimal portfolios at retirement.
Things are even more stark for “low income earners” (45% of the average wage index) and “very low income earners” (25% of the average wage index). For a “low income earner” just 0.77x is optimal; for a “very low income earner” just 0.14x is optimal. And those numbers go even lower if you take into account “impatience” and “health discounting”!
Once again, optimal retirement wealth is really quite low. In 2019 dollars, the figure is approximately $6,000 with zero percent interest rates.
Of course, these results only apply to people in the bottom half of the earnings distribution. But still, they suggest that the many Americans who are saving very little for retirement aren’t being nearly as myopic and irrational as commonly reported.