It is well known that wages in the United States have stagnated in recent decades, but just how badly? We know that nominal wages, expressed in current dollars at the time they are paid, have risen dramatically. In 1965, production and nonsupervisory workers averaged just $2.60 an hour. Now they average nearly $22 an hour. But what really matters is real wages, that is, nominal wages adjusted to remove the effect of inflation. Are real wages actually lower now than in the past? Have they increased, but just not very rapidly? As this chart shows, it depends on exactly how you do the inflation adjustment.
The difference is dramatic. According to the CPI, real wages have increased just 8 percent in half a century. According to the PCE index, they have increased 40 percent. Even that is not very impressive over such a long period, but 40 percent is a lot better than 8 percent.
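The deflation arithmetic behind these comparisons is simple: divide the growth factor of nominal wages by the growth factor of the price level. Here is a minimal sketch using the wage figures from the text; the two price-level ratios are illustrative values chosen so the results line up with the 8 percent and 40 percent figures cited above, not official BLS or BEA series.

```python
def real_growth(nominal_start, nominal_end, price_ratio):
    """Real wage growth factor: nominal wage growth deflated by price growth."""
    return (nominal_end / nominal_start) / price_ratio

NOMINAL_1965 = 2.60   # average hourly wage, 1965 (from the text)
NOMINAL_NOW = 22.00   # approximate current figure (from the text)

# Illustrative 1965-to-present price-level ratios (assumed, not official data):
CPI_RATIO = 7.83      # implies roughly the 8 percent real gain cited for the CPI
PCE_RATIO = 6.04      # implies roughly the 40 percent real gain cited for the PCE

cpi_real = real_growth(NOMINAL_1965, NOMINAL_NOW, CPI_RATIO)  # about 1.08
pce_real = real_growth(NOMINAL_1965, NOMINAL_NOW, PCE_RATIO)  # about 1.40
```

The same nominal wage series yields very different real-wage stories depending only on which deflator sits in the denominator.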
If you measure from 1972 instead of 1965, real wages have actually fallen by 4 percent, as measured by the CPI. Even by the PCE, they have increased by just 19 percent.
Which is right? Frustratingly, we can't really say that either measure is right or wrong. The two indexes simply make different choices when it comes to the thorny technical issues that bedevil the measurement of inflation—how to adjust for changes in the basket of goods that consumers purchase, how to adjust for quality, and how to adjust for the substitution of cheaper goods for more expensive ones when relative prices change.
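The substitution issue in particular can be made concrete with a toy calculation. A fixed-basket (Laspeyres) index prices the old basket at new prices, while a Paasche index prices the new basket; a chained measure like the Fisher index splits the difference. The numbers below are invented for illustration: good A doubles in price and consumers shift toward good B.

```python
def laspeyres(p0, p1, q0):
    """Fixed-basket index: old quantities valued at new vs. old prices."""
    return sum(p * q for p, q in zip(p1, q0)) / sum(p * q for p, q in zip(p0, q0))

def paasche(p0, p1, q1):
    """New-basket index: new quantities valued at new vs. old prices."""
    return sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1))

def fisher(p0, p1, q0, q1):
    """Geometric mean of Laspeyres and Paasche."""
    return (laspeyres(p0, p1, q0) * paasche(p0, p1, q1)) ** 0.5

# Hypothetical two-good economy: good A's price doubles, so consumers
# substitute away from A toward B.
p0, q0 = [1.0, 1.0], [10, 10]   # base-period prices and quantities
p1, q1 = [2.0, 1.0], [4, 16]    # later-period prices and shifted quantities

l = laspeyres(p0, p1, q0)       # 1.50: ignores the substitution
p = paasche(p0, p1, q1)         # 1.20: fully reflects the new basket
f = fisher(p0, p1, q0, q1)      # about 1.34: in between
```

The fixed-basket measure reports 50 percent inflation while the chained measure reports about 34 percent, which is the flavor of gap that separates a CPI-style index from a PCE-style one.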
For more on the problems of measuring inflation, see these earlier posts:
What Does the Consumer Price Index Measure? Inflation or the Cost of Living? What's the Difference? (Also available in a classroom-ready slideshow version).
Deconstructing Shadowstats: Why is it So Loved by its Followers but Scorned by Economists?