Ignoring a solution to chronic drug shortages

Original Reporting | By T.J. Lewan

May 7, 2014 — Since shortages of critical drugs became a fixture of the American medical landscape a decade ago, pundits have proposed an array of incentives to encourage more production from pharmaceutical companies. But an obvious alternative or supplement — having the government manufacture the drugs — appears not to have made it to anyone’s list.

“Unfortunately, we’re at that stage again where we’re saying, yeah, we got together, we came up with a lot of ideas, the FDA got more authority, the FDA increased its staff, the FDA did this and that. But guess what? It’s not working. What are the next steps?” — Allen Vaida

A factory designed, built and run by Uncle Sam could manufacture drugs in short supply — not least the older, low-profit but still highly effective generic medications that drug makers cannot, or simply will not, produce. And yet, when Remapping Debate raised the possibility, the initial reactions of economists, physicians, industry analysts, and others ranged from stammering, chuckling and long, sometimes awkward, pauses, to bewilderment and shock.

Several people we interviewed struggled to explain their reservations, pointing out only that the government has no experience in making medicines. Some flatly rejected the idea because, in their view, it was a long shot to gain political traction. Others said the concept gave them pause because it ran afoul of their beliefs about how a free market system ought to operate.

There were those who said a government entry into the generics business might destabilize the markets and send prices for all drugs soaring. (No evidence of this was forthcoming, and several people who said this later altered their opinion, saying the idea had potential.)

No one we reached out to in a position of authority — including at the FDA, the Centers for Disease Control and Prevention, or the National Institutes of Health — gave any hint that they were considering these questions or giving any thought to how a federal drug factory could work.

But there were also those who — after first inquiring if we were serious — wholeheartedly endorsed the merits of a state drug-manufacturing capability.

 

Reversing government “de-industrialization”

The idea that governments would manufacture drugs is not novel. In fact, governments churn out medications for citizens in 80 countries around the globe, including Brazil, India, Denmark, Thailand, and Indonesia, according to the World Health Organization; in many places public and private enterprises coexist without incident.

Stephen S. Morse, a professor of epidemiology at the Mailman School of Public Health at Columbia University, says it’s remarkable that, more than a decade into what are indisputably the worst and most widespread drug shortages in American history, no one has yet thought to suggest or seriously consider a government option — or, at the very least, to study its benefits and drawbacks.

“It’s an interesting question I find rather puzzling,” says Morse, who directs a program at Columbia that specializes in risk assessment of emerging infectious diseases, including influenza. “Even the most conservative economists, the University of Chicago people and so on, have always believed that the one legitimate role for government — they do not admit to any others — is to take up the slack when there are market failures. And there seems to be a market failure here.”

“And yet,” he adds, “no one thinks to have government take an active role in this case, which is interesting because, if there is a market failure going on, as there clearly is, who is supposed to step in — foreign manufacturers?”

Since the 1980s, the U.S. government has increasingly moved away from manufacturing anything — what some call the “de-industrialization” of government. Not long ago, however, the federal government was an active and effective player in the production of things needed to keep citizens healthy.

For decades after World War II, public agencies and institutes produced vaccines for biological threats, substances “which are harder to make because you have to do a lot of work on clinical trials as well as safety,” Morse says. The Salk Institute-Government Services Division once made specialty bio-defense vaccines for threats like Rift Valley fever, a mosquito-borne illness, and states like Michigan and New York made serums for smallpox, anthrax, and whooping cough before leaving production to industry in the 1990s.

The notion that government is somehow incapable of making generic medications because it hasn’t done so before is myopic, says Aaron Kesselheim, a professor at Harvard Medical School and a researcher in its department of health policy and management. The state, he says, is perfectly capable of quickly assembling people with the know-how to make pharmaceuticals.

It’s worth remembering, he says, how the government successfully managed the Manhattan Project, which developed the atomic bomb; NASA’s Apollo missions, space shuttles and stations, and telescopes; and the Defense Advanced Research Projects Agency’s development of the silicon chip and the Internet. Today, there’s the “Green Electricity Network Integration” project at the Department of Energy, which promises to modernize our electric grid, and the National Nanotechnology Initiative, administered by various government agencies such as the Department of Defense, the Small Business Innovation Research Program, and the National Institutes of Health.

Although the NIH devotes just 11 percent of its budget to internal biomedical research, the accomplishments of staff investigators are extensive: in the 1950s, the first use of chemotherapy to cure a solid tumor; in the ’80s, the first drugs for treating AIDS; in the ’90s, the first successful gene therapy. Today, NIH scientists are creating a four-dimensional atlas of brain development in simple organisms, tracking the origin and evolution of every neuron, the path of every axon, the creation of every synapse; they’re pioneering a new way to repair holes in the human heart — the most common form of congenital heart disease — without the need for open-chest surgery; they’re inventing probes that capture images of receptors, cells, and tissues at the molecular level — cutting-edge procedures and technologies not available commercially.
