April 20, 2011 — The House of Representatives’ approval of a deficit-slashing plan for 2012 and beyond may not have been everyone’s idea of “our generation’s defining moment,” as budget committee chairman Paul Ryan proclaimed it. Beyond any question, however, that vote was a breakthrough for the “skin in the game” school of health care policy.
For decades, advocates and experts affiliated with such free-market think tanks as the Heritage Foundation, the Cato Institute, and the Hoover Institution have criticized the system of “third party payment” that, in their view, over-insulates Americans (those fortunate enough to have insurance, at any rate) from the cost of their health care decisions. With more of our personal dollars at stake — more “skin in the game” — we will learn to make smarter health care choices, the argument goes, and providers will respond by competing and innovating harder. That kind of approach, according to the theory, will lead to a system characterized by better care — or at least equally good care — at lower net cost.
Although it was Republicans who voted last week to turn Medicare into a privatized, insurance-shopping voucher arrangement, the skin-in-the-game theory also commands wide support among centrist and conservative Democrats. Ironically, the theory’s ascent has occurred in the face of groundbreaking research on how Americans actually make health care decisions — research that appears to punch big holes in the skin-in-the-game hypothesis.
Patients cutting back on preventive care
A Rand Corporation study, described in the March 2011 issue of the American Journal of Managed Care, compared enrollees in traditional health insurance plans with enrollees in so-called “consumer directed” health plans. The latter plans were characterized by high deductibles and tax-free or employer-supported health savings accounts — a combination of features championed by skin-in-the-game theorists as a way of turning Americans into more discerning and engaged “consumers” of health care.
Rand found that people in the consumer-directed plans were more likely to economize on doctor visits and medicines that would cost them money out of pocket. But in a result that one of the lead researchers, Neeraj Sood, characterized as “not ideal,” patients also cut back on preventive care, even though such care was fully covered under all the plans in the study.
In a phone interview with Remapping Debate, Sood posited two likely explanations. Some people, he said, may simply not have realized that preventive care was exempt from their plans’ deductibles. But even for those who did, “a lot of preventive care is initiated when you actually see a doctor,” Sood pointed out, “and if you have high cost-sharing you’re probably seeing your doctor less often and therefore there is less opportunity to start conversations about preventive care.”
Advocates of vouchers, health savings accounts, and other market-oriented policies often point to another Rand study, dating from the 1970s and early ’80s, as proof that more cost-sharing leads to less care at no sacrifice of health. But even that study raised questions about the skin-in-the-game model, according to Shannon Brownlee, an instructor at the Dartmouth Institute for Health Policy and Clinical Practice and the author of “Overtreated: Why Too Much Medicine Is Making Us Sicker and Poorer.”
“Yes, when people had more skin in the game they were more judicious about spending money on health care,” Brownlee said in a phone interview. “But they didn’t do it in a rational way. They were just as likely to forego care that they needed as care that they didn’t need. So they weren’t really very prudent consumers of health care — they were simply more worried about spending money on health care.”
Higher co-pays, more hospital time
Because the latest Rand study tracked its subjects for only one year, it is impossible to know whether short-term neglect of worthwhile care led to long-term health problems. That, however, is the pattern strongly suggested by another recent study, involving enrollees in private Medicare Advantage plans that sharply increased co-pays and deductibles. Among this group, a research team led by Amal Trivedi, an assistant professor of community health at Brown University, found fewer doctor visits but, at the same time, more and longer hospital stays. The study — “Increased Ambulatory Care Copayments and Hospitalizations among the Elderly” — was originally reported in the New England Journal of Medicine last year.
“For every 100 people exposed to a doubling of co-pays, there were 20 fewer outpatient visits, two additional hospital admissions, and 13 more inpatient days,” Trivedi said. The increased hospital spending, he added, was sufficient to “dwarf” the savings on ambulatory care.
One reason for looking at the effects of cost-sharing on Medicare patients, Trivedi said, was that the elderly had been left out of the original Rand study. By excluding those over 65, he speculated, the Rand research might not have fully captured the effects of cost-sharing on people with low incomes and chronic conditions. His study’s finding of more hospitalizations among elderly and sick people, Trivedi added, could be interpreted as an argument, at least for this population, for lower copayments and deductibles rather than higher ones.
Are health care decisions really like other consumer decisions?
One lesson to be taken from relevant research, old and new, is that health care tends to resist efforts to bend it into the shape of a normal consumer market. There may be a host of consumer products for which Americans commonly yearn, but, as Brownlee pointed out, “I don’t think anyone says, ‘Oh goody, I get to be in the hospital; oh goody, I get to be in the ICU; oh yippee, I get to have open-heart surgery!’”
The same basic point — apparently lost in much of the debate over controlling costs — was made by Jessie Gruman, a patient advocate who is president of the Center for Advancing Health, a Washington-based policy center. “I think most people don’t want to go to the doctor,” Gruman said. “Most people don’t want to be sick.”
Skin-in-the-game theorists, Gruman added, tend to talk as though they have never been really sick themselves. “They don’t understand that most of us go out of our way not to make decisions about health care unless we have to,” she said. “And most of the time, when we have to, it’s because something anxiety-provoking or dangerous or sad or frightening has happened. Either we’re sick or our Mom is sick or our kid is sick,” she added, and those are not moments when people are apt “to be making wise, rational, market-based decisions.”
People don’t lust after medical care
Alain Enthoven, an economist known as one of the godfathers of “managed care,” himself champions a skin-in-the-game approach to insurance-purchasing, especially when patients are allowed to choose between fee-for-service plans and HMOs in which the physicians work collaboratively to reduce waste and deliver high-quality care. (Many observers regard that kind of care — exemplified by the Mayo Clinic — more as an aspirational model than as an available option for the great majority of Americans.) “The place for people to have skin in the game is in choosing their delivery system in a cost-conscious manner,” Enthoven said in a phone interview.
Skin in the game is a far less useful concept, Enthoven added, when it comes to decisions about particular procedures and medications. To begin with, he said, it is foolish to imagine people lusting after medical care the way many of us may lust after the latest electronic hardware.
“I think the proponents of high deductibles and all that stuff, they think of medical care as being mostly cosmetic,” Enthoven said. “You’re going in for a facelift or a breast reduction or a joint replacement that you don’t really need.”
“Well, I tell you what,” he said. “Hip replacement surgery, it’s such a hugely invasive thing and it puts you out of commission for a few months, that it’s not fun. I think people who undergo that are in real pain and need it.”
Just like computers?
Scott Atlas, a senior fellow at the Hoover Institution and chief of neuroradiology at Stanford University Medical Center, remains committed to the view that health care decisions can and should be made like other consumer decisions.
Remapping Debate asked Atlas whether it was realistic to expect people to be intelligent consumers in a context of such difficult choices and circumstances. His answer was a flat-out yes. “People say, ‘Oh, it’s too complicated,’” he said. “Well, I’m not sure I know anyone who can explain how a computer works, yet we make value-based decisions on computers all the time.”
Others fear that a skin-in-the-game approach could tilt the balance of medical decision-making — at least for decisions that are truly being made by patients, not by doctors or hospitals — from vigilance toward neglect. People with chronic conditions are sometimes encouraged to see their primary care doctors every few months, even when there’s nothing obviously wrong. That kind of proactive monitoring has a good record of success, Jessie Gruman pointed out, but it “really flies in the face” of some of today’s consumer-driven health care policies.
“People don’t want to go to the doctor generally, and they especially don’t want to go to the doctor if it’s really expensive and they need to watch their money,” Gruman said.
Suppose “I decide to cut costs by not going to have regular checkups for my chronic conditions,” she continued. “Maybe there’s nothing going on at the moment that I think is big enough to warrant paying the doctor to look at. Well, I could be fine. But chances are, by not taking preventive measures, sooner or later I will have made a decision that threatens the length and quality of my life. This is a really different decision from deciding that I can’t afford car payments on a Lincoln Continental.”