Statistics can be a dry, impenetrable discipline for the uninitiated, but Michelle Wiest, PhD, an associate professor of statistics at the University of Idaho, had the attendees’ full attention even though she was scheduled on the afternoon of the last day (when attendees might be expected to be nodding off or checking airline schedules on their smartphones). Wiest’s talk, in which she laid out how a good meta-analysis ought to be constructed, was part of the biannual meeting of the Global Organization of EPA and DHA Omega-3s, which took place last week on Tenerife.
Meta-analyses gaining weight in media
The issue is seen as a critical one, since meta-analysis as a research tool has seen tremendous growth in recent years, and reporters in the mainstream media have been looking to the conclusions of these studies for a quick, relatively easy way to get a grip on a broad swath of research. Thus, the headlines these studies generate in the press have taken on ever greater importance in terms of their potential effect on policy. But just like a dietary supplement, a meta-analysis is only as good as the ingredients used to make it and the methods used for putting those together, Wiest said.
“I don’t know if meta-analyses are being given greater weight within the research community. I think when you have a large, randomized trial come out, that is still seen as the best standard of evidence,” Wiest said.
Nevertheless, she did display several slides that showed that meta-analyses, as part of systematic reviews, occupy the highest rung in the research-weighting pyramids of several organizations, such as the American Diabetes Association. And the use of this tool is exploding, Wiest said. The number of publications about ‘meta-analysis’ pushed through the 15,000 mark sometime in 2015, Wiest said.
While the media often takes these studies at face value, researchers ought to be more discriminating. But even for those in the know, it can sometimes be difficult and time-consuming to determine whether a meta-analysis was done in a fully valid way, she said.
“It’s difficult to know when to put stock in some things, and what to take away from them, and how to differentiate a good meta-analysis from a bad one,” she said.
Done well, a meta-analysis can provide additional information about the research area, giving a weighted average of RCTs that could, for example, elucidate a subsidiary effect of a given treatment across a population. In trying to determine whether that was done successfully, Wiest said there are some hurdles that any good meta-analysis ought to clear easily if it is to be taken seriously.
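The “weighted average of RCTs” Wiest describes is commonly computed as an inverse-variance weighted mean: trials with smaller standard errors (typically larger trials) get more weight, and the pooled estimate ends up more precise than any single study. The sketch below uses invented effect sizes and standard errors purely for illustration; it is a minimal fixed-effect pooling, not the specific method of any study discussed in the talk.

```python
# Minimal fixed-effect (inverse-variance) pooling sketch.
# The three effect sizes and standard errors below are invented
# illustrative numbers, not data from any real trial.

def pooled_estimate(effects, ses):
    """Inverse-variance weighted average of per-trial effects.

    Each trial is weighted by 1/SE^2, so more precise trials
    dominate the pooled result.
    """
    weights = [1.0 / se ** 2 for se in ses]
    total = sum(weights)
    est = sum(w * e for w, e in zip(weights, effects)) / total
    se_pooled = (1.0 / total) ** 0.5  # SE of the pooled estimate
    return est, se_pooled

effects = [-0.10, 0.05, -0.20]   # e.g. log risk ratios (invented)
ses = [0.08, 0.12, 0.15]         # per-trial standard errors (invented)

est, se = pooled_estimate(effects, ses)
print(est, se)
```

Note that the pooled standard error is smaller than any individual trial’s, which is the statistical payoff of combining studies; a random-effects model would additionally account for between-trial heterogeneity.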
Large research team is best
First, a meta-analysis cannot be seen as an easy way out. Conducting all of the trials on which the meta-analysis is based is certainly more expensive than pooling them together with this statistical tool, but a good meta-analysis is a significant, and potentially expensive, undertaking of its own.
“They are all over the map. Some of them are really high quality and represent great work. Those tend to have a ton of authors and a broad team. Then there are others that are not so good, some of which might be the work of a lone grad student,” Wiest said.
“It might be cheaper than doing original research but it’s not going to be cheap. Or easy. And it really shouldn’t be easy,” she said.
Wiest said a properly conducted meta-analysis will have a clearly defined protocol for which studies are included and which are excluded. It should include a statistical graphic called a funnel plot, which is used to detect publication bias. And the data extraction from the studies themselves should be conducted in parallel, and the results of those extractions evaluated in a blinded fashion by a third party to correct as much as possible for any potential bias or “data cherry picking” on the part of the researchers doing the extractions.
Choice of what data to use is biggest bone of contention
The choice of what data to use is often the point of greatest contention. Wiest focused in particular on the Rizos meta-analysis of 2012 in the Journal of the American Medical Association, which concluded that omega-3 supplementation was not associated with lower risk of all-cause mortality, cardiac death, sudden death, myocardial infarction, or stroke.
“In the Rizos paper a lot of the criticism was about the studies they included. The critics said they were using studies where the omega-3s dose was too low, or that they should have been looking for studies in which the researchers were looking at the composition of fatty acids in the blood to see if the test subjects were being dosed correctly,” Wiest said.
Wiest said that in general, meta-analyses that closely adhere to the Cochrane guidelines will be more reliable than those that do not. But she said that meta-analyses that come to unpopular or unfavorable conclusions will always be open to debate.
“News headlines and policy decisions tend to deal in absolutes, and science is rarely making absolute conclusions. I don’t think that you are ever fully protected from criticism when you put your work out there. As a scientist you are constantly getting critiqued,” she said.