Clinical Trials: Gold Standard In Need of Improvement

There is an emerging crisis in the development of drugs, biologics and complex new medical devices. Clinical trials take too long, cost too much and often produce imperfect knowledge. Many promising medical products are not developed because of the difficulty and expense of proving safety and efficacy—a loss that is costly to society.

FDA Matters believes that the key lies in developing new approaches to generating rigorous data and analysis. Ultimately, this will require the re-invention of the clinical trial.

Clinical trials produce the knowledge that makes FDA approvals possible. Without them, we would all become test subjects in a dangerous game of medical trial and error. FDA (and patients) want a reasonable level of certainty about safety, efficacy and risk-benefit before medical products are marketed. Except in extraordinary cases, FDA should never be put in the position of accepting less.

The clinical trial is, and must remain, the gold standard. To understand why, it is useful to look at another type of medical knowledge that is increasingly in vogue: analysis of real-world data. The Medicare Claims database would be an example. Another would be patient data compiled by large health plans. Analysis of real-world data sets is becoming a cornerstone of reimbursement policy and plays a significant role in comparative effectiveness determinations.

The supposed advantage is the ability to look at hundreds of thousands of patients and discern patterns that might not be seen in clinical trials. However, the association of data points tells us nothing about causality. It only signals where additional analysis is needed. Real-world data sets also lack rigor:

Real-world data sets → post-hoc analysis using uncontrolled variables + inconsistent definitions + incomplete data collection + questionable data accuracy

By comparison, clinical trials produce a wealth of reliable knowledge (albeit far from infallible). This can be expressed as:

Clinical trial data sets → prospectively-defined analysis using controlled variables + randomization of patients + double-blind protocol + placebo controlled + pre-defined standard for data collection and data integrity

"Prospectively planned" means a drug or device sponsor must declare in advance the precise findings that will determine whether the treatment caused a beneficial outcome. Sponsors are limited in their ability to go back afterward to "data dredge" for positive correlations that might be spurious. To some extent, all analysis of real-world data sets is data dredging.

"Controlled variables" means that the outcomes of patients in the clinical trial can be compared with some degree of reliability. In real-world data sets, you can never be sure.

"Randomization" and "double blind" work together to assure there is no bias in patient selection (e.g. putting healthier patients in one arm of the trial) and that neither patients nor medical staff knows who is getting the study drug.

"Placebo controlled" allows a reliable determination of the impact of treatment. Since some patients will improve regardless of whether they are getting treatment or placebo, treatment effectiveness is the differential between those who improve in one study arm over the ones who improve in the other.

"Pre-defined protocols for data collection and data integrity" assures that definitions stay constant and results from different trial sites and different investigators can be combined. In real-world data sets, no one has yet figured out why medicine is practiced differently in Boston compared to Hartford.

Taken together, these features of the clinical trial serve to produce reliable data that support a conclusion (or not) that the treatment caused the benefit. The challenge is to improve upon this gold standard while maintaining confidence in the results.

Future columns will explore how this might be done. Meantime, readers are encouraged to post their thoughts or send me their ideas.

Steven

Here are two earlier columns that partially address this topic:

Long-term Challenges Need Short-term Attention

December 13th, 2009

We are less than 7 months into the new Commissioner's tenure. Three or four years from now, she will be judged by whether she moved the agency forward in these areas. I think she has gotten off to a very good start, but there is an immense amount of work still required.

Turning Data into Knowledge

June 2nd, 2009

Through statute and directive, FDA has been asked to collect, analyze, interpret and utilize massive amounts of data. This includes biological, clinical, adverse event, production and distribution data, medical and food product tracking, and the Sentinel system for early discovery of potential drug safety problems. The systems are not in place to do any of this, at least not at the required level of sophistication. Even if they were, sifting valuable information from background noise is extraordinarily hard.
