A New Look at Genetically Engineered Crops

Few scientific subjects have generated as much public uncertainty and controversy as genetically engineered (GE) crops and foods. An extensive study by the National Academies examined more than two decades of accumulated evidence on the health and environmental effects of currently available commercial GE crops.

The report that resulted, Genetically Engineered Crops: Experiences and Prospects, finds no substantiated evidence of a difference in risks to human health between existing GE crops and their conventionally bred counterparts. The report also finds no conclusive evidence of cause-and-effect relationships between GE crops and environmental problems, though the complex nature of assessing long-term environmental changes often made it difficult to reach definitive conclusions.

Available evidence indicates that GE soybean, cotton, and maize have generally had favorable economic outcomes for the producers who adopted them, but outcomes have varied depending on pest abundance, farming practices, and agricultural infrastructure, the report says. In locations where insect-resistant crops were planted but resistance-management practices were not followed, damaging levels of resistance evolved in some target insects. In many locations, some weeds have evolved resistance to glyphosate, the herbicide to which most herbicide-resistant GE crops were engineered to be tolerant. The evolution of such resistance could be delayed through integrated weed-management approaches, the report says. If GE crops with resistance to insects and tolerance of herbicides are to be used sustainably, regulations and incentives are needed so that more integrated and sustainable pest-management approaches become economically feasible.

Because all technologies — whether GE or conventional — can change foods in ways that could raise safety issues, it is the product and not the process that should be regulated, the report says, a point also made in previous Academies reports. In determining whether a new plant variety should be subject to safety testing, regulators should focus on the extent to which the novel characteristics of the plant variety are likely to pose a risk to human health or the environment, the extent of uncertainty about the severity of potential harm, and the potential for human exposure — regardless of whether the plant was developed using genetic engineering or conventional breeding.

The case for evaluating the product rather than the process is even stronger given that new technologies are blurring the lines between GE and conventional approaches, the report notes. Some emerging genetic-engineering technologies have the potential to create novel plant varieties that are hard to distinguish genetically from plants produced through conventional breeding or processes that occur in nature.

The Academies’ study was funded by the Burroughs Wellcome Fund, the Gordon and Betty Moore Foundation, the New Venture Fund, and the U.S. Department of Agriculture, with additional support from the National Academy of Sciences.

Climate Change’s Influence on Extreme Weather

Often after an extreme weather event such as a heat wave or hurricane, questions arise about whether climate change caused the event or made it more severe. While such questions remain difficult to answer, given all of the factors that affect an individual weather event, it is now possible to estimate the influence of climate change on certain types of extreme events, such as heat waves, drought, and heavy precipitation, says a report from the Academies.

Extreme event attribution is a fairly new area of climate science that explores the influence of human-caused climate change on extreme events compared with other factors such as natural sources of climate and weather variability. The science typically estimates how the intensity or frequency of a type of event has been altered by climate change. Some studies use observational records to compare a recent event with similar events that occurred in the past, when the influence of human-caused climate change was much less. Other studies use climate and weather models to compare the meteorological conditions associated with an extreme event in scenarios with and without human-caused climate change.
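The model-comparison approach described above can be sketched with a toy calculation. The sketch below compares the probability of exceeding a temperature threshold under two hypothetical distributions, one representing a "natural" climate and one shifted warmer by human influence; all numbers are illustrative assumptions, not values from any attribution study.

```python
import random

random.seed(42)

def exceedance_prob(mean, sd, threshold, n=100_000):
    """Monte Carlo estimate of P(value > threshold) for a normal distribution."""
    return sum(random.gauss(mean, sd) > threshold for _ in range(n)) / n

# Hypothetical summer-temperature distributions (degrees C): a "natural"
# climate and one shifted 1.0 C warmer by human influence (assumed values).
THRESHOLD = 35.0
p_natural = exceedance_prob(mean=30.0, sd=2.0, threshold=THRESHOLD)
p_actual = exceedance_prob(mean=31.0, sd=2.0, threshold=THRESHOLD)

# Probability ratio: how much more likely the extreme became.
pr = p_actual / p_natural
# Fraction of attributable risk: share of such events attributable to the shift.
far = 1 - p_natural / p_actual
print(f"probability ratio: {pr:.1f}, fraction of attributable risk: {far:.2f}")
```

Even a modest 1-degree shift in the mean multiplies the probability of exceeding a fixed extreme threshold several-fold, which is why temperature-related extremes yield the clearest attribution statements.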

The results are most reliable when multiple different methods are used that incorporate both a long-term historical record of observations and models to estimate human influences on a given event, the report says. The most dependable attribution findings are for those events related to an aspect of temperature, for which there is little doubt that human activity has caused an observed change in the long-term trend. For example, long-term warming is linked to more evaporation that can both exacerbate droughts and increase atmospheric moisture available to storms, leading to more severe heavy rainfall and snowfall events.

However, temperature alone does not fully determine the probabilities of extreme events, and attributing specific events to climate change may be complicated by factors such as natural long-term fluctuations in ocean surface temperatures. The report urges continued advancement in weather and climate modeling, noting that focused research on weather and climate extremes would improve the ability to attribute events. The report also calls for the development of predictive weather-to-climate forecasts of future extreme events that account for both natural variability and human influences.

The Academies’ study was funded by the U.S. Department of Energy, Heising-Simons Foundation, Litterman Family Foundation, David and Lucile Packard Foundation, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, and the Arthur L. Day Fund of the National Academy of Sciences.

Protecting Against Shortages of Molybdenum-99

Molybdenum-99 and the technetium-99m derived from it are the isotopes most commonly used in medical diagnostic imaging. The U.S. currently consumes about half of the molybdenum-99 produced for medical use worldwide, but none has been produced domestically for medical use since the late 1980s.

Nearly 95 percent of the world’s supply of molybdenum-99 for medical use is produced by irradiating targets in research reactors located in Australia, Europe, and South Africa. A Canadian reactor was also producing molybdenum-99 for medical use until late October 2016. Molybdenum-99 and technetium-99m are distributed through an international supply chain on a weekly or more frequent basis because the isotopes have short half-lives and therefore cannot be stockpiled.
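The arithmetic behind that supply-chain constraint is straightforward: molybdenum-99 has a published half-life of about 66 hours, so most of any batch decays within a single week. A minimal sketch of the decay calculation:

```python
# Why Mo-99 cannot be stockpiled: exponential decay with a ~66-hour half-life.

MO99_HALF_LIFE_HOURS = 66.0  # published approximate half-life of Mo-99

def remaining_fraction(hours_elapsed, half_life_hours):
    """Fraction of a radioisotope remaining after hours_elapsed."""
    return 0.5 ** (hours_elapsed / half_life_hours)

# After one week (168 hours), only about a sixth of a Mo-99 batch is left,
# so supply must be replenished continuously rather than stored.
week = remaining_fraction(168, MO99_HALF_LIFE_HOURS)
print(f"Mo-99 remaining after one week: {week:.1%}")
```

Technetium-99m decays even faster (a half-life of about six hours), which is why it is generated at hospitals and radiopharmacies from freshly delivered molybdenum-99 rather than shipped on its own.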

The report found that while global supplies of molybdenum-99 for medical use were adequate to meet U.S. needs at the time of the report’s release (September 2016), supply capacity would be reduced substantially once Canada’s reactor stopped routine production. Canada would then become a supplier of last resort — producing molybdenum-99 only in cases of severe global shortages — and would remain so until its reactor shuts down permanently at the end of March 2018.

The report recommends that the U.S. government continue to work with the Canadian government to ensure that there is an executable and well-communicated plan in place to restart the supply of molybdenum-99 from Canada until March 2018, if needed.

The American Medical Isotopes Production Act of 2012 and financial support from the U.S. Department of Energy’s National Nuclear Security Administration have stimulated private-sector efforts to establish U.S. domestic production of molybdenum-99 for medical use. However, it is unlikely that substantial domestic supplies will be available before 2018. Moreover, potential domestic suppliers face technical, financial, regulatory, and market penetration challenges.

About 75 percent of the global supply of molybdenum-99 for medical use is produced using targets that contain highly enriched uranium (HEU). The report recommends that U.S. and other nations promote the wider utilization of molybdenum-99/technetium-99m produced without HEU targets. The U.S. should also work with global molybdenum-99 suppliers and regulators to reduce the proliferation hazards of medical isotope production wastes containing HEU.

The Academies’ study was funded by the U.S. Department of Energy’s National Nuclear Security Administration.

Highly Enriched Uranium in Research Reactors

U.S. policy and reactor conversion programs have worked for decades to minimize and phase out the use of highly enriched uranium (HEU) fuel for civilian research reactors, which are used primarily for research and training, materials testing, or the production of radioisotopes for medicine and industry. Worldwide, over 90 civilian research reactors have been converted to low-enriched uranium (LEU) fuel or shut down. However, more than 70 civilian research reactors, including eight in the United States, continue to use HEU fuel. These reactors use weapon-grade HEU to produce neutrons vital to scientific research and other civilian applications. Eliminating HEU use in these reactors by converting them to fuel containing LEU would reduce the risks that this material could be diverted for illicit use, for example in nuclear explosive devices.

Obstacles to converting the remaining civilian research reactors from HEU to LEU fuel are both technical and nontechnical, the report says. Some reactors using HEU fuel require the successful development of new, higher-density LEU fuel to maintain performance after conversion; the United States is developing high-density LEU fuel, but manufacturing it will be challenging. For other research reactors, progress toward conversion is hindered by nontechnical obstacles such as economic and political considerations.

The Department of Energy now estimates that it will take another 20 years to convert the remaining HEU-fueled civilian research reactors throughout the world to LEU fuel, and over 15 years to convert those in the United States. The report recommends an interim solution to accelerate the removal of weapon-grade HEU from civilian applications and reduce the risk of its illicit use until new high-density LEU fuel is available: convert the high-performance research reactors in the United States and Europe to use an existing, qualified HEU fuel with uranium-235 enrichments of 45 percent or less. Nearly all of the high-performance research reactors currently operating with HEU fuel could use this intermediate fuel without significant impact to their missions, the report estimates.

Since the report was released, the National Nuclear Security Administration has been directed by Congress to develop a long-term roadmap that describes the timeline, milestones, costs, and technology to develop LEU fuels for high-performance research reactors. The roadmap will incorporate regular independent technical and programmatic evaluations, as recommended in the report.

The Academies’ study was funded by the U.S. Department of Energy’s National Nuclear Security Administration.

Guidance for Gene-Drive Research

Gene drives are systems of inheritance that increase the likelihood that particular DNA sequences are passed from parent to offspring. New, more efficient gene-editing tools such as CRISPR/Cas9 make it possible to modify a gene and spread that modification through a population of organisms intentionally and quickly.
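The inheritance bias that defines a gene drive can be illustrated with a minimal population-genetics sketch. In ordinary Mendelian inheritance a heterozygote transmits the modified allele to half its offspring, so a rare allele stays rare; a drive biases that transmission toward the modified allele. The starting frequency and drive efficiency below are illustrative assumptions, not values from the report.

```python
def next_freq(p, drive_efficiency):
    """Drive-allele frequency after one generation of random mating.

    A heterozygote transmits the drive allele with probability (1 + e) / 2
    instead of the Mendelian 1/2; e = 0 recovers ordinary inheritance.
    """
    e = drive_efficiency
    return p**2 + 2 * p * (1 - p) * (1 + e) / 2

def spread(p0, e, generations):
    """Iterate the recursion for a number of generations."""
    p = p0
    for _ in range(generations):
        p = next_freq(p, e)
    return p

# Starting from 1% of the population, over 10 generations:
mendelian = spread(0.01, 0.0, 10)  # e = 0: frequency stays at 1%
driven = spread(0.01, 0.9, 10)     # e = 0.9: frequency climbs toward fixation
print(f"Mendelian: {mendelian:.3f}, with drive: {driven:.3f}")
```

Under this toy model the driven allele sweeps through most of the population within about ten generations, which is the property that makes both the intentional spread and the potential irreversibility of gene drives so consequential.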

This breakthrough has the potential to control the spread of infectious diseases, eliminate invasive species, and address conservation-related issues and other challenges. Despite such promise, the two distinguishing characteristics of gene drives — the intentional spread of a genetic trait through a population and the possibility that their effects on ecosystems are irreversible — raise a number of concerns.

A cautionary approach to research on and governance of gene-drive technologies is needed to navigate the uncertainty posed by this fast-moving field of study and to make informed decisions about potential applications of gene-drive modified organisms, the report says. There is insufficient evidence available at this time to support the release of these organisms into the environment. However, the possible benefits for basic and applied research justify continued research in laboratories and highly controlled field trials.

Much more research is required to understand the scientific, ethical, regulatory, and social consequences of releasing gene-drive modified organisms into the environment. A phased approach should guide research from the laboratory to the field, the report says. Each proposed field test or environmental release should be subject to a robust ecological risk assessment, to determine a gene drive’s potential impact, before being approved.

The report also calls for flexible and rapidly adaptable governance policies to facilitate international coordination and collaboration. Public engagement should be built into risk assessment and practical decision-making to help frame and define the potential harms and benefits of using a gene-drive organism.

The Academies’ study was funded by the National Institutes of Health (NIH), the Foundation for the National Institutes of Health (FNIH), and the National Academy of Sciences Biology and Biotechnology Fund. The Defense Advanced Research Projects Agency and the Bill & Melinda Gates Foundation provided funding to NIH and FNIH, respectively, in support of this study.