Forty years of meta-analysis: We need evidence-based answers more than ever

Forty years ago, the introduction of modern meta-analysis brought the scientific method to reviews and syntheses of research results from multiple studies. The consequences of its widespread adoption have fundamentally changed the way scientists view scientific information, and ushered in an era of evidence-based decisions and the resolution of fundamental questions in medicine, ecology, psychology and many other fields.

The Review is published in Nature.

Science is based on the accumulation of knowledge. New findings may advance our understanding, or lead to puzzling contradictions. How can we make sense of the avalanche of information when thousands of new studies are published every year in some research fields? How can scientists keep up with this explosion of information, much less practitioners such as physicians, and even less, the confused citizen? How do we resolve seemingly contradictory and inconclusive findings?

The solution to these problems began 40 years ago with a revolutionary approach to research synthesis in the 1970s. In the past, summarizing scientific findings depended on expert “narrative reviews”, which presented the (subjective) views and synthesis of a field by an expert in the discipline. The problems with this approach were numerous: experts often disagreed on which studies were relevant and how to interpret them, reviews were subject to bias, and worst of all, it became increasingly difficult to “tell a story” when there might be dozens to hundreds of papers addressing the same questions. 

Forty years of meta-analysis
The term ‘meta-analysis’ was the contribution of an American psychologist, Gene V. Glass, some 40 years ago. Put simply, meta-analysis is the statistical analysis of results from separate studies on the same scientific question. It is the foundation for evidence-based medical practice, and for evidence-based solutions in conservation and environmental management. 

Although I am an ecologist, I first encountered the term meta-analysis in the social sciences, in a newspaper article in the Boston Globe in 1989 discussing a paper in educational psychology that asked whether boys were really better than girls in math (they were not) by synthesizing the results of many studies on that question. I was stunned at the power of this approach (and I was pleased to see the conclusion). At that time, there was a big debate in the ecological literature on the occurrence and importance of competition in natural communities, and I thought that meta-analysis would be an amazing approach for synthesizing the studies on that question. Meta-analysis had not yet been used in ecology. After a steep learning curve, I was able to carry out a large meta-analysis on competition in natural communities with a group of students and publish it (we found that competition was ubiquitous in nature), and that paper changed my entire career trajectory.

In November 2015, I was at a meeting at the Center for Open Science on transparency and open science practices in ecology and evolution, where my colleagues and friends Julia Koricheva, Shinichi Nakagawa, and Gavin Stewart were also participants. With encouragement from Patrick Goymer, who was at that meeting as well, the four of us started talking about writing a paper reviewing the impact that meta-analysis had had across a wide range of disciplines. That initiated the present paper.

Julia and I first met years ago by email, when she was a postdoctoral researcher in Sweden working on her first meta-analysis, and not long after that, she and I organized a series of workshops at the National Center for Ecological Analysis and Synthesis with other colleagues (including Gavin) that resulted in a co-edited volume, the Handbook of Meta-analysis in Ecology and Evolution. Julia, Gavin, Shinichi and I are all elected members of the Society for Research Synthesis Methodology, a remarkable interdisciplinary group of meta-analysis experts and fanatics. I first met Shinichi a few years ago at an annual SRSM meeting. He helped me write a book chapter on meta-analysis that required facility in R and new meta-analysis methods when he was a new faculty member in New Zealand and I was just learning to use R to do statistical analyses. For the current paper, we all contributed equally, and interacted intensively to write it.

In a meta-analysis, the results from each study in a collection of studies are first converted into a ‘common currency’ – that is, a measure of effect size. Then, these effect sizes—the outcomes of the studies—are combined statistically to estimate the overall effect across all of the studies, as well as the factors (moderators) responsible for differences among the study outcomes. 
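To make the 'common currency' idea concrete, here is a minimal sketch of that pipeline in Python, using invented study summaries. The effect size chosen here is the log response ratio (one of several common options), and the pooling step is standard inverse-variance weighting; the numbers are purely illustrative, not from any real dataset.

```python
import math

# Hypothetical per-study summaries: mean, SD, and sample size for a
# treatment group and a control group (invented for illustration).
studies = [
    # (mean_t, sd_t, n_t, mean_c, sd_c, n_c)
    (5.1, 1.2, 30, 4.2, 1.1, 30),
    (6.0, 1.5, 25, 5.5, 1.4, 28),
    (4.8, 1.0, 40, 4.6, 1.3, 35),
]

def log_response_ratio(mt, sdt, nt, mc, sdc, nc):
    """Effect size: log response ratio ln(mean_t / mean_c), with its
    sampling variance from the standard delta-method approximation."""
    lrr = math.log(mt / mc)
    var = sdt**2 / (nt * mt**2) + sdc**2 / (nc * mc**2)
    return lrr, var

effects = [log_response_ratio(*s) for s in studies]

# Pool the effect sizes: weight each study by the inverse of its
# sampling variance, so more precise studies count for more.
weights = [1.0 / v for _, v in effects]
pooled = sum(w * e for (e, _), w in zip(effects, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"pooled log response ratio = {pooled:.3f} +/- {1.96 * pooled_se:.3f}")
```

In practice researchers would use a dedicated package (e.g. metafor in R), but the two steps shown here — convert each study to a comparable effect size, then combine them with precision-based weights — are the core of every meta-analysis.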

Above figure: Depiction of meta-analysis models. Each curve represents the distribution of outcomes among the individuals in a single study. a) Common effect model; all studies share the same true effect. b) Random effects model; each study has a different true effect. c), d) common- and random-effects models, respectively, that include moderators in meta-regressions. Beige curves illustrate sampling error variance, blue the between-study variance.
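The practical difference between the common-effect and random-effects models in the figure can be sketched in a few lines. This toy example uses the DerSimonian–Laird estimator (one standard choice) for the between-study variance tau^2; the effect sizes and variances are invented for illustration.

```python
import math

# Illustrative effect sizes and sampling variances for five hypothetical studies.
effects = [0.19, 0.09, 0.04, 0.31, -0.05]
variances = [0.004, 0.005, 0.003, 0.010, 0.006]
k = len(effects)

# Common-effect model: weights are inverse sampling variances only.
w = [1.0 / v for v in variances]
mu_fe = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)

# Cochran's Q: weighted squared deviations from the common-effect mean,
# a measure of heterogeneity among study outcomes.
Q = sum(wi * (ei - mu_fe) ** 2 for wi, ei in zip(w, effects))

# DerSimonian-Laird estimate of the between-study variance tau^2
# (truncated at zero when Q is smaller than its expectation k - 1).
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)

# Random-effects model: tau^2 is added to each study's sampling variance,
# which flattens the weights across studies.
w_re = [1.0 / (v + tau2) for v in variances]
mu_re = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
se_re = math.sqrt(1.0 / sum(w_re))
print(f"tau^2 = {tau2:.4f}, random-effects mean = {mu_re:.3f} +/- {1.96 * se_re:.3f}")
```

When tau^2 is zero the two models coincide; when it is large, as when true effects genuinely differ among studies, the random-effects weights become more equal and the pooled confidence interval widens accordingly.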

Meta-analysis provides a more objective and far more powerful way of combining scientific evidence than narrative reviews or other approaches. The recent paper in Nature summarizes the influence that meta-analysis has had in different scientific fields, ranging from medicine to social sciences and ecology, evolution and conservation. The introduction of meta-analysis in these disciplines has had a transformative influence, resolving seemingly contradictory research outcomes and identifying the factors that can change the results of a study (and rejecting those that do not influence study outcomes). It also shines a laser-light on where information is missing and more research is needed, as well as making it apparent where no new studies are necessary to resolve a research question.

Although the popularity and the number of published meta-analyses have grown exponentially in recent decades, and meta-analyses are cited far more than any other type of review or individual research paper, the field is not without critics.

Questions of quality standards, publication bias (summarizing published results when certain findings are systematically excluded from publication), combining “apples and oranges” and “parasitizing” the efforts of primary researchers have been raised in multiple fields in which meta-analysis has become prominent.

Each of these issues has been considered and addressed by meta-analysis practitioners; solutions include establishing stronger standards and introducing better education for authors and reviewers. Looking for advances in the accumulation and rigorous evaluation of a body of evidence, rather than in dazzling single experiments, requires a different perspective on how we make progress in science. Other issues (such as publication bias) are actually not unique to meta-analysis, but are more effectively addressed by taking a scientific approach to research synthesis. Real challenges in meta-analysis are being addressed by rapid methodological and conceptual advances, and improvements in practices.

The advantages of applying a robust scientific approach to research synthesis and meta-analysis are now too great to be ignored. Overlooking the enormous potential of meta-analysis to facilitate scientific progress is unaffordable, and a vast range of scientific problems are now being addressed by evidence-based approaches and meta-analysis. As Gavin has so eloquently said for all of us, we are chuffed about the potential for meta-analysis to expedite scientific advances, and about this paper.

By: Jessica Gurevitch, Stony Brook University, Stony Brook, NY
