With the 50th anniversary of the War on Poverty approaching, there are plenty of stories out there about the basic facts of poverty in the U.S.: the official rate is 15 percent, with more than 46 million Americans living below the poverty line. That share of Americans in poverty is slightly lower than it was in 1964 and has been flat in the post-recession years.
However, it may be that none of that is correct. For more than a decade, academics and even government researchers have argued over whether the official poverty rate is simply wrong. By some estimates, the count could be off by more than a full percentage point. And while that may sound like a small discrepancy, it translates into a poverty rate that misses millions of Americans.
"I think there is broad consensus that the official poverty measure is flawed, particularly when you look back over a long period of time," says Sharon Parrott, vice president for budget policy and economic opportunity at the Center on Budget and Policy Priorities, a left-leaning think tank based in Washington, D.C.
Across town at a right-leaning organization, another expert agrees.
"I don't have an impression that it's too high or too low; I just have an impression that it's wrong," says Michael Tanner, senior fellow at the Cato Institute, a libertarian-leaning think tank. "There's almost a universal acknowledgment that the number we use now doesn't make a whole lot of sense."
There is broad consensus among economists of many ideological schools that the way the U.S. measures poverty is broken. The official poverty thresholds are based on what Americans spent on food in 1963: according to the Census Bureau, a household earning less than three times the cost of a "minimum food diet" in 1963 falls below the poverty threshold.
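The threshold rule described above is simple enough to sketch in a few lines. The numbers below are purely illustrative, not actual Census or CPI figures; the real thresholds also vary by family size and are updated annually for inflation.

```python
# Sketch of the official poverty-threshold logic: three times the cost of
# the 1963 "minimum food diet," scaled to current prices by a price index.
# All numbers here are hypothetical, for illustration only.

def poverty_threshold(min_food_diet_cost_1963, cpi_1963, cpi_now):
    """Three times the 1963 minimum food budget, adjusted for inflation."""
    base_threshold_1963 = 3 * min_food_diet_cost_1963
    return base_threshold_1963 * (cpi_now / cpi_1963)

def is_below_threshold(household_income, threshold):
    """The official measure: pre-tax cash income vs. the threshold."""
    return household_income < threshold

# Hypothetical inputs: a $1,000 annual food budget in 1963, with prices
# rising roughly sevenfold since then.
threshold = poverty_threshold(1_000, cpi_1963=30, cpi_now=230)
```

Note that nothing in this rule reflects how spending patterns have shifted since 1963, which is exactly the critique raised next.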
But the way Americans spend has changed markedly since then. People spent around one-quarter of their budgets on food in the 1960s. By 2003 it was closer to 13 percent, according to the Labor Department. The Gates Foundation calculated in 2012 that Americans now spend only 6 percent of their money on food. Spending on child care and health care, meanwhile, has grown.
In addition, the official poverty rate doesn't take into account some safety net programs designed to help the poorest Americans. Cash transfers like Social Security are included when determining poverty, but non-cash programs like Medicaid and the Supplemental Nutrition Assistance Program (commonly known as food stamps) are not included. Taxes are also not taken into account, says Tanner.
"Someone whose income is above the poverty line but their take-home pay is below the poverty line because of their tax liability is not poor, by our definition," he says. Likewise, the Earned Income Tax Credit is also not taken into account in the official estimate.
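Tanner's point can be made concrete with a small example. The dollar amounts below are invented solely to illustrate the mechanics; they are not actual thresholds or tax figures.

```python
# Hypothetical illustration of the tax-liability gap Tanner describes.
# All dollar amounts are invented for this example.

threshold = 23_000        # assumed poverty threshold for this family
gross_income = 23_500     # pre-tax income, just above the line
tax_liability = 1_500
take_home = gross_income - tax_liability  # what the family actually keeps

# The official measure looks only at pre-tax cash income...
officially_poor = gross_income < threshold
# ...even though the family's take-home pay is below the line.
effectively_poor = take_home < threshold
```

By the same logic, a refundable credit like the EITC can lift a family's actual resources above the line without the official measure ever noticing.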
In addition, the federal poverty line applies across a nation in which families have widely varying costs of living; a family earning $20,000 in rural Kansas will likely find it easier to make ends meet than a family in New York City.
Taken together, all of those factors could affect the rate substantially. With this in mind, the Census Bureau in 2011 started publishing a "supplemental poverty measure" that takes into account all of these factors, among others. The new rate doesn't garner the kind of headlines that the official rate does, but it does tell a different story.
In a December paper, scholars found that by this measure, poverty has improved drastically since the 1960s, but it is also worse now than the official rate says it is. The 2012 supplemental poverty rate, 16.1 percent, is higher than the official measure and suggests that poverty is on the rise.
That's not the only alternative measure out there. The National Academy of Sciences' experimental measures, which use different spending and geographic adjustments, put the 2012 poverty rate anywhere between 14 percent and 16.9 percent. If the poverty rate is truly off by 1.9 percentage points, that could mean 5.9 million people are not counted.
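The arithmetic behind that final figure is straightforward. Assuming a 2012 U.S. population of roughly 314 million (an approximation, not a figure from the article), a 1.9-percentage-point spread works out to about 5.9 million people:

```python
# Rough arithmetic behind the "5.9 million people" figure.
# The population value is approximate.

US_POPULATION_2012 = 314_000_000

gap_points = 16.9 - 14.0 - 1.0  # the article's 1.9-point spread, spelled out
gap_points = 1.9                # i.e., 16.9 percent minus 15.0 percent
people_miscounted = US_POPULATION_2012 * gap_points / 100
```

A one-point swing in the rate therefore moves the count by about three million people, which is why the measurement debate is more than a statistical quibble.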