Santa Clara University biology professor Michelle Marvier and her colleagues recently published a meta-analysis of field studies concluding that Bt crops are generally more benign to non-target invertebrates than chemical insecticides. A second meta-analysis, of laboratory studies, found no harmful effects of Bt proteins on honeybees. Although these reports will probably fail to convince skeptics, they raise an important question: can meta-analysis tease meaningful results out of a series of studies that, taken individually, are inconclusive? Given the cost and methodological complexity of ecological studies, the answer matters.
The answer is affirmative only if the studies analyzed are sufficiently well designed and conducted to yield useful data. The need to discard or discount flawed studies presents a constant challenge to meta-analysts. Even when carefully selected criteria for inclusion or exclusion of a data set are stated a priori, researchers are accused of bias when they exclude a study that seems to favor one point of view over another.
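To make the pooling logic concrete, here is a minimal fixed-effect, inverse-variance sketch in Python. The effect estimates and standard errors are hypothetical, chosen only to illustrate how several studies that are each individually inconclusive (every single-study confidence interval crosses zero) can combine into a pooled estimate whose interval does not.

```python
import math

# Hypothetical effect estimates and standard errors from five field studies.
# Each study's 95% CI crosses zero, so no study is conclusive on its own.
studies = [(-0.15, 0.12), (-0.12, 0.15), (-0.16, 0.14), (-0.10, 0.16), (-0.14, 0.13)]

# Fixed-effect inverse-variance pooling: weight each study by 1 / SE^2.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# 95% confidence interval for the pooled effect.
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The pooled standard error shrinks as studies accumulate, which is why the combined interval can exclude zero even when no single study's does. Of course, this only works if every pooled study is sound; the same weighting faithfully propagates garbage from a flawed study into the summary estimate.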
If we are to avoid the GIGO (garbage in, garbage out) effect in meta-analysis, incomplete or flawed studies must be excluded. A case in point is the study by Rosi-Marshall et al. (2007), published in PNAS, which claimed to show that pollen from Bt-maize was injurious to caddisflies in a laboratory aquatic ecosystem but was flawed in numerous ways. For example, pollen produced by currently available varieties of Bt-maize contains very low concentrations of Bt toxin. In addition, the authors extrapolated from a single laboratory experiment to a field system, a problematic leap, especially given that they used pollen in doses higher than the maximum encountered under field conditions. Most damning of all, they reported elsewhere that they had failed to find these effects in the field (http://www.benthos.org/database/allnabstracts.cfm/db/Columbia2007abstracts/id/370), an important fact that should have been disclosed in the PNAS paper. The omission of those contrary findings arguably amounts to investigator misconduct.
Despite all the sensational publicity they may attract, such flawed studies add no value to meta-analysis.
An additional challenge for researchers is the well-known "file drawer effect." Papers that show no differences, and especially no new risks or harms, are seldom of interest to scientists or journalists and are difficult to publish. Thus there is little incentive for scientists to spend their careers proving that safe crops are safe, which skews the literature toward papers that show effects, usually adverse ones. This phenomenon, combined with the willingness of some scientists to skew their studies toward negative results for a variety of reasons (funding, publications, notoriety, and politics among them), makes meaningful meta-analysis exceedingly difficult.
None of this takes away from the careful analysis performed by Marvier and her colleagues, but two important points emerge. First, as Marvier notes, "If meta-analyses and large databases of completed studies were to become a routine part of risk assessment, then there would not be the distraction of single experiments capturing media attention and inappropriately alarming or comforting the public and policy-makers." It would be better still if regulators were allowed to make science-based decisions free from pressure and politics from their overseers in Congress and far from the maddening din of the media.
Second, perhaps we are asking all the wrong questions of all the wrong crops. By that I mean simply that GM crops are more precisely engineered, carry far fewer and less random changes than conventionally bred crops, and are far more thoroughly studied; indeed, pre-market safety reviews are not required for non-GM crops even though they harbor more numerous, random, and potentially disruptive mutations in their DNA. We also have more than 12 years' experience, from more than 13 million farmers on more than 2 billion acres in 25 countries (Brookes and Barfoot, 2008), of profitably growing and harvesting GM crops, with no evidence of adverse environmental or human-health impact and mounting evidence of substantial environmental benefit.
The greatest contribution Marvier and colleagues have made is to provide yet one more independent objective analysis that points to the conclusion that GM crops are as safe as any other crop produced by any other breeding technology; all crops should be evaluated for safety in a risk-based fashion, independent of the technology used to develop them.
Michelle Marvier (Santa Clara University, USA). Meta-studies for biosafety research. http://www.gmo-safety.eu/en/news/712.docu.html
E.J. Rosi-Marshall et al. 2007. "Toxins in transgenic crop byproducts may affect headwater stream ecosystems." PNAS 104:16204-16208.
Graham Brookes and Peter Barfoot. 2008. "Global Impact of Biotech Crops: Socio-Economic and Environmental Effects, 1996-2006." AgBioForum 11(1):21-38. http://www.pgeconomics.co.uk/pdf/agbioforumpaper2008final.pdf