It’s no secret that the years since the Great Recession have been hard on American workers. Though unemployment has finally dipped below six per cent, real wages for most have barely budged since 2007. Indeed, the whole century so far has been tough: wages haven’t grown much since 2000. So it was big news when, last month, Aetna’s C.E.O., Mark Bertolini, announced that the company’s lowest-paid workers would get a substantial raise—from twelve to sixteen dollars an hour, in some cases—as well as improved medical coverage. Bertolini didn’t stop there. He said that it was not “fair” for employees of a Fortune 50 company to be struggling to make ends meet. He explicitly linked the decision to the broader debate about inequality, mentioning that he had given copies of Thomas Piketty’s “Capital in the Twenty-first Century” to all his top executives. “Companies are not just money-making machines,” he told me last week. “For the good of the social order, these are the kinds of investments we should be willing to make.” Such rhetoric harks back to an earlier era in U.S. labor relations. These days, most of the benefits of economic growth go to people at the top of the income distribution.
A substantial body of research suggests that it can make sense to pay above-market wages—economists call them “efficiency wages.” If you pay people better, they are more likely to stay, which saves money; job turnover was costing Aetna a hundred and twenty million dollars a year. Better-paid employees tend to work harder, too. The most famous example in business history is Henry Ford’s decision, in 1914, to start paying his workers the then handsome sum of five dollars a day. Working on the Model T assembly line was an unpleasant job. Workers had been quitting in huge numbers or simply not showing up for work. Once Ford started paying better, job turnover and absenteeism plummeted, and productivity and profits rose.