This is the second day of our coverage of the 2010 BIO International Convention, a massive biotechnology conference being held this week at McCormick Place in Chicago. Come back all day for reports from panels, lectures, and the exhibit floor on how scientists, government leaders, and industry hope to use the combined forces of science and technology to tackle some of the world's biggest problems. For day one of our coverage, click here.
The boogeyman of the BIO conference has been those faceless regulators, the bureaucracy that speakers have often blamed for the bulk of the slowdown that occurs between scientific innovation and its debut on the marketplace and in the clinic. Though never named directly, the big boss of those regulators in the United States is the Food and Drug Administration, whose stamp of approval is required for each and every new treatment or device. And as much as the biotechnology and patient advocacy choruses cry "more and better drugs, faster!" one must remember the importance of government oversight in monitoring new products for safety and efficacy – illustrated again today in the FDA-directed recall of 200,000 medical infusion pumps. Late Tuesday afternoon, the industry finally got to hear from the woman at the head of that regulatory boogeyman: Margaret Hamburg, commissioner of the FDA.
Appearing at a session sponsored by the pharmaceutical company Merck, Hamburg received a reception that was, of course, polite and gracious – everyone knows not to bite the hand holding that magical FDA stamp. But Hamburg's speech appeared to be quite reassuring as well, a promise to the assembled drug companies and scientists that President Obama's FDA was committed to science and to new, better, faster ways of approving medical technology. In fact, Hamburg argued for no less than a new field – regulatory science – that would study how best to judge drugs in an era where older models of clinical trials may be obsolete.
“Just as biomedical research evolved in the past decade, regulatory science must also evolve,” Hamburg said. “We cannot use 20th century science to review products using 21st century science.”
If regulatory science sounds like the dullest thing ever, Hamburg doesn't blame you. In the Q&A session, she said the FDA looked for a different, punchier name, saying that regulatory science "sounds like such a snooze" and conjures images of bureaucracy and intrusive government. But in the end, Hamburg said, they decided "you can only put so much perfume and fancy clothes" on the concept of improving the country's regulatory processes. The trick now is to attract promising scientists to a field that is considered to lack the creativity of other scientific realms, and to expand the FDA workforce to keep up with the active pipeline of biotechnology.
Such regulatory scientists will have some big problems to tackle. The new world of biomarkers and potentially personalized medicine has complicated the clinical trial process, which has previously relied on lumping large numbers of patients together despite individual differences in disease. New kinds of trials, such as the "adaptive" format pioneered in the BATTLE trial at MD Anderson Cancer Center, will be needed to match the best drug to the best patient in a way that can be regulated, Hamburg said. New devices, such as software that will connect glucose monitors to insulin pumps to produce automated control in diabetes patients, will also need inventive study design to quickly and properly weigh the benefits and safety risks for patients.
“We want the FDA to serve as a gateway, not a barrier, for products people need every day,” Hamburg said.
(Apologies to Warren G – ed.)
There's a lot of idealism, hope and optimism at a conference such as this one. That's a necessary byproduct of who's doing most of the speaking: scientists who believe their research holds important answers and ramifications for the real world, and biotechnology companies that hope their next product will prove profitable or attractive enough for purchase by a larger corporation. But in health care, nothing is ever easy, and the afternoon session on the promise of epigenetics offered a short lesson in the disconnect between research hope and health care system frustration.
Epigenetics is the study of factors above and beyond genes themselves that control gene expression – the switches and scaffolding that determine when and how much protein will be made from a particular gene. The field has grown rapidly in recent years, as basic science laboratories find new ways that a person's diet, drug intake, and stress can influence the expression of their own genes and even their offspring's genes. Epigenetics has been billed as evidence that nurture plays a role alongside nature, that a person's environment and choices can affect genetic processes.
But what about the translation of epigenetic research to our medical lives? The most direct path appears to be creating epigenetic tests that can inform people about their risk for certain diseases, and that was the bulk of the data presented at Tuesday afternoon's session. Nathan Lakey, from Orion Genomics, described a test under development that will look for the methylation state of a gene called IGF-2 – methylation being an epigenetic method of silencing genes. In early studies, patients with colorectal cancer were found to have 22 times the chance of having demethylated IGF-2 genes, indicating a potential marker for predicting patients at high risk for the cancer.
But Karen Kaul, the head of molecular diagnostics at NorthShore University Health System in the suburbs of Chicago (and a clinical professor in the department of pathology at the University of Chicago), offered a sobering reality check. As the director of a molecular pathology lab, which runs the tests that diagnose disease in hospital patients, Kaul knows the myriad challenges that face a new test, from ensuring reliability across different laboratories in different hospitals to convincing insurance companies that a new test is worth paying for. Kaul pointed out that even something as critical as the billing code system for pathology tests is many years behind the science, making reimbursement for newer tests tricky and frustrating. In one recent example, she said that tests performed last year for swine flu were deemed "experimental" after the fact by at least one insurance company, preventing full coverage.
The answer, as it was for many sessions at the conference, was improved education – for doctors, patients and insurance companies alike. Kaul said that companies looking to release new tests on the market need to demonstrate to third-party payers how that test could save them money in the long run. And doctors must consider how to deal with the new information about disease risk such tests could provide – early screening is only a benefit to the system if it leads to more efficient care, instead of just more panicked care. Despite all that, Kaul said she was “enthused” about the potential of new tests, epigenetic or otherwise, but her comments were a much-needed gray lining of reality amid the conference’s silver cloud of enthusiasm and speculation.
I've been to a lot of scientific conferences, and am used to seeing vast fields of lavish displays on the exhibit floor. But the kiosks at BIO are a whole extra degree of impressive, with giant sculptures of corn and DNA drawing people in alongside the usual promises of free coffee, pens, iPad giveaways, and other assorted tchotchkes. All the usual pharmaceutical and laboratory supply companies are present, but a significant portion of a floor the size of several football fields is given over to booths from countries and U.S. states looking to promote their region as a hub for biotechnology.
The Illinois booth contains some familiar faces from the University of Chicago representing the Chicago Innovation Pipeline – a new collaboration between six local academic institutions to help bridge the gap between academia and industry for new biotechnology development. Alan Thomas, the director of UChicagoTech, explained to me the purpose of the multi-institutional pipeline.
“University language doesn’t always correlate with industry,” Thomas said as people drifted through the booth speaking with representatives. “The pipeline creates a taxonomy that is more attuned to how industry works.”
The Chicago Innovation Pipeline is a sort of library of technologies from the University of Chicago, Northwestern University, Argonne National Laboratory, Loyola University, Children’s Memorial Research Center, and the University of Illinois at Chicago available for development by private companies. Technologies are sorted by therapeutic area, such as oncology or diabetes, and development stage, describing where along the path from laboratory to clinic the drug or treatment currently stands. Companies interested in opportunities for development can quickly scan the pipeline and work with each research center’s technology transfer office to pursue further research.
“It’s similar to how you might organize your music on iTunes,” Thomas said.
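To make the iTunes analogy concrete, here is a minimal sketch of how a catalog like the Chicago Innovation Pipeline might be organized and filtered. The entries, field names, and stage labels below are hypothetical illustrations, not actual pipeline data or software.

```python
from dataclasses import dataclass

@dataclass
class Technology:
    name: str               # hypothetical project name
    institution: str        # originating research center
    therapeutic_area: str   # e.g. "oncology", "diabetes"
    stage: str              # e.g. "discovery", "preclinical", "clinical"

# Hypothetical catalog entries for illustration only
catalog = [
    Technology("KinaseInhibitor-01", "University of Chicago", "oncology", "preclinical"),
    Technology("GlucoseSensor-02", "Northwestern University", "diabetes", "discovery"),
    Technology("Antibody-03", "University of Illinois at Chicago", "oncology", "discovery"),
]

def search(catalog, area=None, stage=None):
    """Return entries matching the requested therapeutic area and/or stage."""
    return [t for t in catalog
            if (area is None or t.therapeutic_area == area)
            and (stage is None or t.stage == stage)]

# A company scanning for early-stage oncology opportunities:
for tech in search(catalog, area="oncology", stage="discovery"):
    print(tech.name, "-", tech.institution)
```

The point of the "taxonomy" Thomas describes is exactly this kind of shared tagging scheme: once every institution labels its technologies with the same therapeutic areas and development stages, a company can filter across all six research centers at once rather than querying each technology transfer office separately.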
The current environment is ripe for collaboration, he said, with pharmaceutical companies cutting back on research and development funding during the economic recession and universities looking for ways to bridge the “valley of death” between lab discovery and the marketplace.
“The field is becoming much more collaborative,” Thomas said. “Now people care. It’s a much more open, innovative environment now.”
It’s hard to imagine a scientific innovation over the past decade that attracted more excitement and controversy than stem cells. It’s been almost 10 years since George W. Bush (who’s giving the keynote address at the conference today with fellow Presidential alum Bill Clinton) placed federal limitations on the study of embryonic stem cell lines, at a time when the potential of those cells seemed limitless. But in 2010, stem cells have yet to cross over to the clinic in any meaningful way, due to a variety of obstacles more scientific than political, frustrating a field that had high hopes.
But over the last couple of years, a stem cell savior has appeared in the form of induced pluripotent stem cells. Scientists at the University of Wisconsin discovered in the late 2000s that skin, blood or fat cells could be "reprogrammed" back to their undifferentiated state, then teased into forming various cell types by the application of growth factors and gene transduction. These iPS cells have taken the lead from their embryonic counterparts in the race to the clinic, according to a morning session which featured several representatives from biotechnology companies. And while the session focused more than most at BIO on obstacles instead of potential, the news finally suggested that stem cells may soon have an impact on everyday medical care.
However, that impact may not be what was originally promised. Early reports on stem cells touted their promise for regenerative treatments: growing a new liver or kidney for organ transplant, or replacing the neurons lost in Parkinson's disease. Speakers on the iPS panel said that such uses are still likely to be years, if not decades, away. But in the meantime, iPS cells have the immediate potential for use in the discovery and testing of new drugs, provided that obstacles of manufacture and reliability can be overcome.
Both Christopher Kendrick-Parker of Cellular Dynamics and Dushyant Pathak of iPierian talked about the use of iPS cells to create laboratory tests in a dish that better simulate clinical trials in humans. Currently, drugs are tested on cell cultures that are either fragile or derived from a single source, not reflective of different people’s physiology and genetic backgrounds. But if iPS cells can be derived from the skin or blood of many, many patients with a particular disease and differentiated into heart, kidney or brain cells, a new drug treatment for that disease can be tested across a broad population without leaving the laboratory – an in vitro clinical trial.
Cell lines can also be created to reflect group differences – such as ethnicity, gender, or age – or even individual differences. One extreme possibility for the future, suggested by Kendrick-Parker, was that doctors could test the effectiveness of a drug on a specific patient using skin-derived iPS cells, differentiated in a dish, before giving the actual patient the drug. That concept suggested that one day soon, we may all be able to supply the material for our own personal clinical trial, individualized medicine of the highest power.
At a business-focused meeting like BIO, you might not expect a lot of time and attention to be paid to the relatively vague notion of ethics. But for many biotechnology businesses, ethics is part of the bottom line, and the experts at a morning session entitled "Ethics and Biotechnology: Genetically Engineered Animals" said that ethics should not be ignored. As the genetically-modified plant and crop industry learned, scientific advances that seem like slam dunks in improving the quality of life around the globe can meet unanticipated resistance from a public that fears methods it perceives as unnatural. If changing the genes of a strain of wheat produces unwelcome ethical blowback, imagine the public response to a cow breed genetically altered to be resistant to disease, or to produce more meat.
Paul Thompson, an agriculture and food ethicist from Michigan State University, spoke about why many people are averse to the genetic manipulation of animals, focusing on polled opinions about animal welfare. While a slight majority of Americans are concerned only that livestock animals are kept as pain-free and content as possible during the process of food production, 46 percent believe that such animals should live as "natural" a life as possible, Thompson said. For these naturalists, any genetic manipulation that leads to a perception of a less natural life – even if it makes the animals less likely to die – would be met with resistance. That's a "genuine ethical conundrum," Thompson said.
But scientists and companies also do themselves harm in the public sphere by not considering the ethics up front, even when the conclusions may be less than palatable to a significant portion of the country. Margaret Foster-Riley, from the University of Virginia School of Law, seemed to say that good ethics equals good PR, and that trying to hunker down and hope the public won't pay attention leads to "media firestorms." It would do well, Foster-Riley said, for scientists to consider the ethics of their discoveries before they are too far along in the process, something that was not done for high-profile cases such as Dolly the cloned sheep and Ruppy the transgenic puppy. It's also important to emphasize the ethics of not capitalizing upon a particular biotechnology, weighing concerns about animal welfare against concerns about world hunger, for example.
As such, it’s important to remember that biotechnology does not exist only in the United States. Ethical decisions that may derail the use of genetic engineering here may be embraced without qualms in other countries, where the U.S. luxury of abundant food is not present.
“The assumption that you can stop biotechnology by stopping it in the United States is nuts,” Foster-Riley said. “These discussions should be global…any attempt to limit these technologies by law moves it offshore very quickly.”