In his wonderful new study The Benefit and the Burden: Tax Reform—Why We Need It and What It Will Take, Bruce Bartlett offers a useful thumbnail history of the federal government's seemingly haphazard role in the post-WWII evolution of the private health insurance industry.
In 1940 only about 12 million people in a population of 132 million had health insurance. During the war the federal government imposed wage and price controls to keep inflation in check. But because so many men had been called up for military service, there was a severe labor shortage. Businesses looked for ways to increase de facto wages to attract workers. One way was to offer free health insurance, permitted by the Stabilization Act of 1942.
This circumstantial solution led to today's crazy, outdated policy of subsidizing employer-sponsored health plans—"far and away," Bartlett notes, "the largest of all tax expenditures"—which has led to artificially high demand for healthcare and, thus, higher costs.
In the 2008 campaign, Sen. John McCain proposed eliminating this subsidy altogether and using the money to pay for refundable tax credits to buy individual healthcare plans. Obamacare tries to correct this market distortion by imposing (beginning in 2018) a 40 percent excise tax on so-called "Cadillac plans."
The goal of both policies is to create market incentives for people to consume less healthcare—not just any healthcare, but unnecessary healthcare. This is the impetus, too, behind Obamacare's notorious Independent Payment Advisory Board, which is charged with streamlining Medicare program rules and identifying wasteful provider payments.
The latter program, fairly or not, raises the specter of government rationing.
It seems to me, though, that diminishing or getting rid of subsidies for employment-linked healthcare constitutes rationing, too, if of a less philosophically offensive sort: self-rationing.
Both conservatives and liberals essentially believe that smarter federal tax policy and a freer insurance market would persuade people to see fewer specialists, undergo fewer procedures, and generally be more cost-conscious when they "shop" in the healthcare marketplace.
It sounds great in theory, but what would self-rationing look like in practice?
It turns out that the weak economy of recent years may have given us a glimpse. Growth in healthcare spending actually slowed in 2010, but not because we managed to control inflation. It was because of self-rationing.
[T]he Employee Benefit Research Institute published their Issue Brief of Health Savings Accounts for 2006-2011. The study found that U.S. health citizens continue to self-ration health care: 1 in 5 older Americans has cut back on health care to save money, in the form of postponing visits to doctors, cutting medication dosages (against physician prescriptions), and stopping pills altogether.
This trend is especially pronounced among women and African-Americans.
It's not quite the picture of harmonious market efficiency we'd like to see.
Of course, the kind of self-rationing evinced above took place in the context of the current system, with all its waste and subsidized inefficiencies. If everyone, not just low-income people and seniors, had to cut back on unnecessary care, we'd probably see systemwide cost savings and curve-bending.
Despite the fact that I'm generally racked by ideological doubt, I agree this needs to happen. A muted version of this reality will arrive in 2018 with the Cadillac tax.
I suspect Americans are not going to like self-rationing any more than they like the idea of "death panels."