Project TENDR: A call to action to protect children from harmful neurotoxins [The Pump Handle]

Just 10 years ago, it wouldn’t have been possible to bring leading physicians, scientists and advocates together in a consensus on toxic chemicals and neurological disorders in children, says Maureen Swanson. But with the science increasing “exponentially,” she said the time was ripe for a concerted call to action.

Swanson is co-director of Project TENDR (Targeting Environmental Neuro-Developmental Risks), a coalition of doctors, public health scientists and environmental health advocates who joined forces in 2015 to call for reducing chemical exposures that interfere with fetal and child brain development. This past July, after more than a year of work, the group published its TENDR Consensus Statement in Environmental Health Perspectives, laying down a foundation for developing future recommendations to monitor, assess and reduce neurotoxic chemical exposures. The consensus concludes that a new framework for assessing such chemicals is desperately needed, as the “current system in the United States for evaluating scientific evidence and making health-based decisions about environmental chemicals is fundamentally broken.”

Swanson said the consensus statement is a first of its kind, adding that it’s “unprecedented” to have such a breadth of scientists come together and agree that the “science is clear” on toxic chemicals and neurodevelopmental disorders.

“Part of the urgency is because these toxic chemicals are in such widespread use and exposures for children and pregnant women are so widespread — they’re just ubiquitous,” Swanson, who also directs the Healthy Children Project at the Learning Disabilities Association of America, told me. “The urgency is also in seeing the trends in learning and developmental disorders and cognitive and behavioral difficulties — they’re problems that only seem to be increasing.”

According to the consensus statement, which Swanson said involved hundreds of studies and “countless hours” of reviewing and assessing the evidence, the U.S. is home to an alarming increase in childhood learning and behavioral problems, with parents reporting that one in six American children is living with some form of developmental disability, such as autism or attention deficit hyperactivity disorder. That figure is up 17 percent from a decade ago. The statement offers examples of toxic chemicals that can contribute to such disorders and lays out the argument for a new approach to chemical safety.

For example, the statement notes that many studies offer evidence that “clearly demonstrates or strongly suggests” adverse neurodevelopmental toxicity for lead, mercury, organophosphate pesticides, combustion-related air pollution, PBDE flame retardants and PCBs. Lead, as Swanson noted, is a perfect example of a widely used chemical that contributes to cognitive problems and intellectual impairment — “and yet it’s still everywhere, in water pipes, in cosmetics. We thought we’d done a good job of eliminating lead problems, but we haven’t done enough,” she said. Another prime example is chemical flame retardants, among the most common household toxic exposures associated with neurodevelopmental delays in children.

“Of course these disorders are complex and multifactorial, so genetics plays a role, nutrition does and social stressors do,” Swanson told me. “But the contribution of toxic chemicals is a piece that we can prevent. We can do something about this part to decrease the risk to children.”

On taking action, the consensus argues that the current system for evaluating the human health effects of chemicals is “broken,” noting that of the thousands of chemicals now on the market, only a fraction have been tested for health impacts. The consensus reads:

Our failures to protect children from harm underscore the urgent need for a better approach to developing and assessing scientific evidence and using it to make decisions. We as a society should be able to take protective action when scientific evidence indicates a chemical is of concern, and not wait for unequivocal proof that a chemical is causing harm to our children.

Evidence of neurodevelopmental toxicity of any type — epidemiological or toxicological or mechanistic — by itself should constitute a signal sufficient to trigger prioritization and some level of action. Such an approach would enable policy makers and regulators to proactively test and identify chemicals that are emerging concerns for brain development and prevent widespread human exposures.

As many of you know, President Obama signed the Frank R. Lautenberg Chemical Safety for the 21st Century Act into law in June, reforming the woefully outdated federal Toxic Substances Control Act (TSCA), which hadn’t been updated since 1976. And while TSCA reform is certainly a “step in the right direction,” Swanson said the sheer backlog of chemical safety testing as well as the pace of testing set forth in the new law means the process of reducing or removing toxic exposures will likely be incredibly slow — and even that’s still dependent on whether the U.S. Environmental Protection Agency is fully funded to implement TSCA reform.

“TSCA reform by itself is insufficient to address the magnitude of these problems,” she said, noting that pesticides are outside of TSCA’s and the new law’s jurisdiction.

In turn, consensus authors called on regulators to follow scientific guidance when assessing a chemical’s impact on brain development, with a particular emphasis on fetuses and children; called on businesses to eliminate neurotoxic chemicals from their products; called on health providers to integrate knowledge about neurotoxicants into patient care and public health practice; and called on policymakers to be more aggressive in reducing childhood lead exposures.

The problem of harmful chemical exposures can seem like an overwhelming one — “nobody can shop their way out of this problem,” Swanson said — but there are steps that can be taken right away to reduce exposures. For example, Swanson noted that when the U.S. phased out the use of lead in gasoline, children’s blood lead levels plummeted. Similarly, after Sweden banned PBDEs in the 1990s, levels of the chemical found in breast milk dropped sharply.

In terms of next steps, Swanson said Project TENDR will continue reaching out to policymakers, health professionals and businesses on how to work together toward safer chemical use and healthier children.

“There’s a lot we can do that can make a substantial difference in a relatively short time frame,” Swanson told me. “Our key message is that this is a problem we can do something about. There’s reason for alarm, but also reason to get working and take care of it collectively so that our children are not at greater risk for neurodevelopmental disorders.”

To download a full copy of the Consensus Statement, as well as find tips on reducing harmful exposures on an individual level, visit Project TENDR.

Kim Krisberg is a freelance public health writer living in Austin, Texas, and has been writing about public health for nearly 15 years.



from ScienceBlogs http://ift.tt/2dh1yEc

Friday Cephalopod: Net traps and chiller [Pharyngula]

If you ever wondered how to breed nautiluses



from ScienceBlogs http://ift.tt/2durJ7b


More Wadhams [Stoat]

Browsing Twitter after a break I was unsurprised to see the usual suspects dissing that fine chap, Peter Wadhams. Heaven forfend that I should ever stoop so low. It is tempting to describe the “lame article” they were dissing as the usual stuff, but alas it isn’t. It lards extra Yellow Peril guff onto the pre-existing guff. Incidentally the author, Paul Brown, was once a respectable chap – my great-aunt Proctor knew him somewhat. But that was many years ago. Bizarrely, the first “related posts” link in the article is to a far better article by Ed Hawkins pointing out how bad the previous article about Wadhams was.

[You may be wondering “why the image?” At least, if you’re new here you might. The answer is that web-indexers tend to throw up the first image on a page that they find; and I didn’t want that nice PH to be the “image” for this post.]

Bordering on dishonest

“What is needed is something that has not been invented yet − a way of stopping elderly scientists from talking nonsense” (I may have fabricated part of that quote). And my section header is a sub-headline in the article, so they can hardly complain if I reproduce it to my own ends. The rest of this post is just character assassination (or, to dignify it somewhat, trying to work out what his current status is); look away if you like Wadhams.

I was intrigued by the article calling him “former head of the Polar Ocean Physics Group at the University of Cambridge” (my bold). It appears to be wrong; as far as I can tell, he is still head of this little known group. The article may have got confused by him also being (accurately) the former head of SPRI. But the POPG is an odd little thing. Just look at its web page: “Professor Peter Wadhams has run the Group since January 1976, which until December 2002 was based in the Scott Polar Research Institute (SPRI), University of Cambridge. In January 2003 the Group moved within the university from SPRI to the Department of Applied Mathematics and Theoretical Physics. It was previously called the Sea Ice and Polar Oceanography Group. Text by Prof Peter Wadhams, 2002. Updated by Oliver Merrington, POPG Webmaster, October 2005.” This is certainly not an active web page. [I’ve just re-read that. He’s run the group for forty years!?! Can that be healthy?]

Poking further, I find his Clare Hall bio, where he self-describes as “became Emeritus Professor in October 2015” (and there’s also this little letter which may or may not be deliberately public). So his DAMPT page is clearly out of date (it still describes him as “Professor”, which he isn’t, any more than Murry Salby is). I think the best explanation is that the DAMPT pages are just out of date and unloved; they don’t give the impression of vibrancy.

Within DAMPT, the POPG is a touch anomalous, to my eye. It fits within the highly-respected Geophysics group, and might be compared (in the sense that it appears to sit on the same organisational level as), say, to the Atmosphere-Ocean Dynamics Group. This is a highly active research group featuring hard man (just look at those eyes; you wouldn’t want to run across him on a dark river) Peter Haynes (who, I might perhaps hasten to add, has nothing at all to do with the story (story? This post has a plot? Well no. OK then, ramble) I’m telling) and mysterious old wizard Michael McIntyre (famous for telling you to repeat words). Compare that to the POPG page and it looks a touch moribund; poke further into the list of projects and the impression re-surfaces. Can an active research group be led by an emeritus professor? It seems odd to me, but what do I know?

There’s also this bio which, perhaps fittingly, ends on the ludicrous Vast costs of Arctic change.



from ScienceBlogs http://ift.tt/2dxMPSP


Sensitivity training

Climate scientists are certain that human-caused emissions have increased carbon dioxide in the atmosphere by 44 per cent since the Industrial Revolution. Very few of them dispute that this has already caused average global temperatures to rise roughly 1 degree. Accompanying the warming is disruption to weather patterns, rising sea levels and increased ocean acidity. There is no doubt that further emissions will only make matters worse, possibly much worse. In a nutshell, that is the settled science on human-caused climate change.

What scientists cannot yet pin down is exactly how much warming we will get in the future. They do not know with precision how much a given quantity of emissions will lead to increased concentrations of greenhouse gases in the atmosphere. For climate impact it is the concentrations that matter, not the emissions. Up until now, 29 per cent of human emissions of carbon dioxide has been taken up by the oceans, 28 per cent has been absorbed by plant growth on land, and the remaining 43 per cent has accumulated in the atmosphere. Humans have increased carbon dioxide concentrations in the atmosphere from a pre-industrial level of 280 parts per million to over 400 today, a level not seen for millions of years.
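To make the arithmetic above concrete, here is a minimal sketch of the percentage rise in atmospheric CO2 and the partitioning of emissions quoted in this article; the exact present-day concentration used (about 403 ppm, simply “over 400” in the text) is an assumption for illustration.

```python
# Sketch of the figures quoted above. The precise present-day concentration is an
# assumed illustrative value ("over 400 ppm" in the text).
preindustrial_ppm = 280.0
current_ppm = 403.0

increase_pct = (current_ppm - preindustrial_ppm) / preindustrial_ppm * 100
print(f"CO2 increase since pre-industrial times: {increase_pct:.0f}%")  # roughly 44%

# Where cumulative human CO2 emissions have gone, per the article
fractions = {"oceans": 0.29, "land plants": 0.28, "atmosphere": 0.43}
for sink, share in fractions.items():
    print(f"{sink}: {share:.0%} of cumulative emissions")
```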

There’s a possibility that the 43 per cent atmospheric fraction may increase as ocean and terrestrial carbon sinks start to become saturated. This means that a given amount of emissions will lead to a bigger increase in concentrations than we saw before. In addition, the warming climate may well provoke increased emissions from non-fossil fuel sources. For example, as permafrost thaws, the long-frozen organic matter contained within it rots and oxidizes, giving off greenhouse gases. Nature has given us a major helping hand, so far, by the oceans and plants taking up more than half of our added fossil carbon, but there’s no guarantee that it will continue to be so supportive forever. These so-called carbon-cycle feedbacks will play a big role in determining how our climate future will unfold, but they are not the largest unknown. 

Feedbacks

Atmospheric physicists have long tried to pin down a number to express what they refer to as climate sensitivity, the amount of warming we will get from a certain increase in concentration of greenhouse gases. Usually, this is expressed as the average global warming, measured in degrees Celsius, that results from a doubling of carbon dioxide concentrations. The problem is not so much being able to calculate how much warming the doubling of the carbon dioxide alone will cause – that is relatively easy to estimate and is about 1 degree C. The big challenge is in figuring out the size of the feedbacks. These are the phenomena that arise from warming temperatures and that amplify or dampen the direct effects of the greenhouse gases that humans have added to the atmosphere.
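The roughly 1 degree C figure for carbon dioxide alone can be reproduced with a standard back-of-the-envelope calculation; the sketch below assumes the textbook forcing approximation of about 5.35 ln(C/C0) W/m2 and a Planck response of about 3.2 W/m2 per degree, neither of which comes from this article.

```python
import math

# Back-of-the-envelope warming from a doubling of CO2 with no feedbacks.
# Both constants are standard textbook approximations, not values from the article.
def co2_forcing(concentration_ratio):
    """Approximate radiative forcing in W/m^2 for a given CO2 concentration ratio."""
    return 5.35 * math.log(concentration_ratio)

planck_response = 3.2  # extra outgoing radiation (W/m^2) per degree C of surface warming

forcing = co2_forcing(2.0)                     # ~3.7 W/m^2 for doubled CO2
no_feedback_warming = forcing / planck_response
print(f"Forcing for 2x CO2: {forcing:.1f} W/m^2")
print(f"No-feedback warming: {no_feedback_warming:.1f} C")  # roughly 1 degree C
```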

The biggest feedback is water vapour, which is actually the most important single greenhouse gas in the atmosphere. Warm air holds more water vapour. As carbon dioxide increases and the air warms, there is plenty of water on land and in the sea available to evaporate. The increased amount of vapour in the air, in turn, provokes more warming and increased evaporation. If temperatures go down, the water vapour condenses and precipitates out of the atmosphere as rain and snow. Water vapour goes quickly into and out of the air as temperatures rise and fall, but the level of carbon dioxide stays around for centuries, which is why water vapour is considered a feedback and not a forcing agent. Roughly speaking, the water vapour feedback increases the sensitivity of carbon dioxide alone from 1 to 2 degrees C.
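The amplification described here follows the standard feedback relation, warming divided by (1 - f), where f is the fraction of any warming returned as further warming; the f = 0.5 used in the sketch below is an illustrative value chosen to match the “from 1 to 2 degrees C” statement, not a number taken from the article.

```python
# A feedback that converts a fraction f of any warming into additional warming
# amplifies the no-feedback response by 1 / (1 - f).
# f = 0.5 is an illustrative value consistent with the doubling described above.
def amplified_warming(no_feedback_warming_c, feedback_fraction):
    return no_feedback_warming_c / (1.0 - feedback_fraction)

print(amplified_warming(1.0, 0.5))  # 2.0 degrees C once a water-vapour-like feedback is included
```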

Another feedback results from the melting of sea ice in the Arctic. Ice reflects the sun’s energy back out into space, whereas oceans that are free of ice absorb more of the sun’s radiated heat. As warming temperatures melt the sea ice, the Earth absorbs more solar energy and the surface warms faster. The loss of sea ice is a major reason that Arctic temperatures are increasing about twice as fast as the rest of the globe. The Antarctic has gained rather than lost sea ice over recent decades due to the effects of ocean currents and other factors. But this gain is much smaller than the Arctic ice loss, so the overall effect of all polar sea ice on the climate is to amplify the global response to increased carbon dioxide concentrations.

The least well-defined feedback is the effect of clouds. The quantity and distribution of clouds is expected to change in a warming climate, but exactly how is not yet fully known and is debated. High clouds tend to keep more heat in, while low clouds tend to reflect more sunlight back into space, providing a cooling effect. Most experts estimate that clouds, on balance, will have anywhere from a slight cooling feedback to a significant warming feedback.

On top of the variance in the estimates of the feedbacks, the size of the human and natural factors that drive climate change, apart from carbon dioxide, also have a wide range. Greenhouse gases like methane play a big role in warming, while sulphate-particle pollution from coal-burning plants actually cools the planet by blocking the sun. Land-use changes – clearing forests, for example – also affect climate by either reflecting or absorbing more of the sun’s energy. Natural ejections of reflective particles from volcanoes can also influence the climate in significant, but unpredictable ways.

This year’s model

An early estimate of climate sensitivity was made in 1979 by the American scientist Jule Charney. He based his estimate on just two sets of climate calculations – or models – that were available at that time. One set of models predicted a sensitivity of 2 degrees, the other, 4 degrees, which he averaged to get a mean value of 3 degrees. Rather arbitrarily, he subtracted or added half a degree from the two model estimates to produce a minimum-to-maximum range of 1.5 to 4.5 degrees. Despite the shakiness of this approach, Charney’s estimate has proved remarkably durable.

The five Intergovernmental Panel on Climate Change (IPCC) reports produced between 1990 and 2013 drew upon the results of many more climate models that were also much more sophisticated. Nevertheless, all of the reports came up with estimates of the minimum, maximum and most likely sensitivities that were within half a degree of Charney’s rough estimate. In 2007, the fourth assessment report (AR4) provided a climate sensitivity range of 2 to 4.5 degrees C, with a most likely value of 3 degrees C. The latest report, AR5 in 2013, estimates the likely range of sensitivity at 1.5 to 4.5 degrees C, exactly the range Charney provided 34 years earlier with his educated guesswork.

It is worth noting that climate sensitivity is not an input factor into the climate models but a calculated result.

In the past few years, some scientists have made calculations based on recent temperature measurements and simple energy-balance climate models. This approach has tended to produce an estimate of a most-likely climate sensitivity number around 2 degrees, which is significantly lower than the previous best estimate of around 3 degrees from more complex climate models. Taking account of this work, the IPCC adjusted its lower estimates downward in the 2013 AR5 report and, because of the newly increased range, opted not to settle upon a most-likely central value. These new, lower values suggest that the average, complex climate models may be predicting too much warming.

However, a recent publication in the journal Nature Climate Change by NASA scientist Mark Richardson and his colleagues has exposed flaws in those simple, low-sensitivity models. One problem is that the simple calculations took ocean temperatures measured just below the surface (which is the common measurement made by climate scientists) and compared them to the calculated air temperatures near the Earth’s surface that are output by climate models. Since air above the ocean warms more than the water, the comparison is not valid over the oceans. Richardson and his colleagues also factored in the effect of retreating Arctic sea ice on temperature measurements, and the lack of measured historical data in some regions. They then checked the calculations again, and as Richardson explained to Corporate Knights:

“Once you do a fair test then you get the same result from both the simple calculation using real-world data and from complex climate models. We took model water temperatures when the measurements are of water temperatures, and didn’t use model output when and where there were no measurements. This matters because fast-warming areas like the Arctic, where there is now less summer sea ice than in at least 1,450 years, have not historically been measured by thermometers. All of the effects combined in the same way; they hid warming. This is the main reason that climate models looked like they were warming a bit too much since 1861.”

Additional recent research from a NASA team led by scientist Kate Marvel took a hard look at some other simplifying assumptions made in the low-sensitivity calculations. Marvel and her colleagues modified the inputs to more complex climate models to explore how much certain factors, like sulphate pollution or land-use changes, affected the climate when modelled in isolation. They found that these agents are more effective in causing temperature changes because they tend to be located in the northern hemisphere and on land where they carry a bigger punch than if it is simply assumed that their effect is distributed evenly across the planet, as some of the simpler, low-sensitivity studies have done.

Combining the Richardson and the Marvel results brings estimates of climate sensitivity back to, or even a little above Jule Charney’s estimates. To the non-specialist, all of this may seem like a rather pointless process where we end up where we started from, still stuck with a stubbornly wide range of a factor of 3 or so from minimum (1.5 degrees) to maximum (4.5 degrees). But as Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies, told Scientific American last year: “We may be just as unsure as before, but we are unsure on a much more solid footing.”

Uncertainty provides no comfort

Climate sensitivity is not estimated only from climate models using modern data. Scientists also have observations of how the Earth behaved in periods of past climatic change. From the ice-age cycles that occurred over the past 800,000 years there are samples of past atmospheres trapped in gas bubbles in ice cores that reveal the chemical mix of the air and the temperatures at the time.

Scientists can look back much further in time, many millions of years ago, when the Earth was in a hot-house state. In those times there was little ice even at the poles and sea levels were several tens of metres higher than they are today.

These observations of the geological past have their own considerable ranges of uncertainty, but, taken together, they produce estimates of climate sensitivity that are broadly consistent with the range calculated by climate models of the modern era. This consilience, which is to say, different approaches pointing to the same general result, explains why climate scientists are so confident that increasing concentrations of greenhouse gases lead to increased warming, even if nobody can yet be sure how much the human-induced warming will be over this century and beyond.

One thing we do know with great confidence is that if we continue to emit greenhouse gases at the current rate, then sometime in the second half of this century we will have doubled the concentration of carbon dioxide in the atmosphere. The last time concentrations were that high, 30 million years ago, there was no ice on Greenland and little on Antarctica.
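The timing claim can be checked with simple arithmetic; in the sketch below, the current concentration and the assumed growth rate of about 2.5 ppm per year are illustrative values, not figures given in the article.

```python
# Rough check of "doubled ... sometime in the second half of this century".
# The growth rate and current concentration are illustrative assumptions.
preindustrial_ppm = 280.0
current_ppm = 403.0
growth_ppm_per_year = 2.5
current_year = 2016

doubled_ppm = 2 * preindustrial_ppm  # 560 ppm
years_to_go = (doubled_ppm - current_ppm) / growth_ppm_per_year
print(f"Doubling reached around {current_year + years_to_go:.0f}")  # late 2070s
```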

Click here to read the rest



from Skeptical Science http://ift.tt/2cRl5dK


New MIT app: check if your car meets climate targets

In a new study published in the journal Environmental Science & Technology, with an accompanying app for the public, scientists at MIT compare the carbon pollution from today’s cars to the international 2°C climate target. In order to meet that target, overall emissions need to decline dramatically over the coming decades.

The MIT team compared emissions from 125 electric, hybrid, and gasoline cars to the levels we need to achieve from the transportation sector in 2030, 2040, and 2050 to stay below 2°C global warming. They also looked at the cost efficiency of each car, including vehicle, fuel, and maintenance costs. The bottom line:

Although the average carbon intensity of vehicles sold in 2014 exceeds the climate target for 2030 by more than 50%, we find that most hybrid and battery electric vehicles available today meet this target. By 2050, only electric vehicles supplied with almost completely carbon-free electric power are expected to meet climate-policy targets.

Figure: Cost-carbon space for light-duty vehicles, assuming a 14-year lifetime, 12,100 miles driven annually, and an 8% discount rate. Data points show the most popular internal-combustion-engine vehicles (black), hybrid electric vehicles (pink), plug-in hybrid electric vehicles (red), and battery electric vehicles (yellow) in 2014, as well as one of the first fully commercial fuel-cell vehicles (blue). Illustration: Miotti et al. (2016), Environmental Science & Technology.
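As a rough illustration of how a vehicle's carbon intensity gets compared against a declining target, the sketch below computes per-mile emissions for a gasoline car and an electric car; the fuel economy, grid carbon intensity and target value are all placeholder assumptions, not the numbers used in the MIT study.

```python
# Illustrative per-mile carbon intensity comparison. All parameters below are
# placeholder assumptions, not values from Miotti et al. (2016).
KG_CO2_PER_GALLON = 8.9  # approximate CO2 from burning one gallon of gasoline

def gasoline_g_per_mile(mpg):
    return KG_CO2_PER_GALLON * 1000.0 / mpg

def ev_g_per_mile(kwh_per_mile, grid_g_per_kwh):
    return kwh_per_mile * grid_g_per_kwh

hypothetical_2030_target = 200.0  # g CO2 per mile, stand-in for the study's 2030 line

cars = {
    "gasoline car (30 mpg)": gasoline_g_per_mile(30),
    "EV on today's mixed grid": ev_g_per_mile(0.30, 450),
    "EV on near-carbon-free power": ev_g_per_mile(0.30, 50),
}
for name, grams in cars.items():
    verdict = "meets" if grams <= hypothetical_2030_target else "exceeds"
    print(f"{name}: {grams:.0f} g/mile ({verdict} the hypothetical 2030 target)")
```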

The MIT app allows consumers to check how their own vehicles – or cars they’re considering purchasing – stack up on the carbon emissions and cost curves. As co-author Jessika Trancik noted,

One goal of the work is to translate climate mitigation scenarios to the level of individual decision-makers who will ultimately be the ones to decide whether or not a clean energy transition occurs (in a market economy, at least). In the case of transportation, private citizens are key decision-makers.

How can electric cars already be the cheapest?

The study used average US fuel and electricity prices over the decade of 2004–2013 (e.g. $3.14 per gallon of gasoline and 12 cents per kilowatt-hour), and the app allows consumers to test different fuel costs. The lifetime of a car is estimated at 14 years and about 170,000 miles. As co-author Marco Miotti explained,

We use parameters that reflect the U.S. consumer experience when going to buy a new car.

As the chart in the lower right corner of the above figure shows, when accounting only for the vehicle purchase cost, gasoline-powered cars are the cheapest. However, hybrid and electric vehicles have lower fuel and regular maintenance costs. In the US, there are also federal rebates that bring consumer costs down still further, so the cheapest electric cars (in cost per distance driven) cost less than the cheapest gasoline cars. When state tax rebates such as California’s are included, electric cars are by far the best deal available.
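To show how purchase price, fuel and maintenance combine under the study's stated assumptions (14-year lifetime, 12,100 miles per year, 8% discount rate, $3.14 per gallon, 12 cents per kilowatt-hour), here is a minimal discounted cost-of-ownership sketch; the purchase prices, efficiencies, maintenance costs and rebate amount are illustrative placeholders rather than the paper's figures.

```python
# Discounted lifetime cost-of-ownership sketch. Lifetime, mileage, discount rate and
# energy prices follow the article; everything else is an illustrative placeholder.
def lifetime_cost(purchase, annual_fuel, annual_maintenance, years=14, discount=0.08):
    total = purchase
    for year in range(1, years + 1):
        total += (annual_fuel + annual_maintenance) / (1 + discount) ** year
    return total

miles_per_year = 12100

gas_fuel_cost = miles_per_year / 30 * 3.14    # assumed 30 mpg at $3.14/gallon
ev_fuel_cost = miles_per_year * 0.30 * 0.12   # assumed 0.30 kWh/mile at $0.12/kWh

print(f"Gasoline car: ${lifetime_cost(25000, gas_fuel_cost, 1200):,.0f}")
print(f"Electric car: ${lifetime_cost(33000 - 7500, ev_fuel_cost, 700):,.0f}")  # minus assumed federal rebate
```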

Some might object that it’s unfair to include tax rebates, and the MIT app allows for a comparison with or without those rebates. However, as Trancik noted, the app is aimed at consumers, and for them the rebates are a reality. Moreover, the cost of gasoline in the USA does not reflect the costs inflicted by its carbon pollution via climate change. Over the lifetime of the car, the electric car tax rebates roughly offset the gasoline carbon pollution subsidy. Gasoline combustion releases other pollutants that result in additional societal costs as well.

To meet climate targets we need EVs and clean energy

As the study notes, most of today’s hybrids and plug-in hybrids meet the 2030 climate targets, and electric cars beat them. However, if we’re going to stay below the internationally-accepted ‘danger limit’ of 2°C global warming above pre-industrial temperatures, the US needs to achieve about an 80% cut in emissions by 2050. As co-author Geoffrey Supran noted,

The bottom line is that meeting long-term targets requires simultaneous and comprehensive vehicle electrification and grid decarbonization.

The good news is that we already have the necessary technology available today. Another recent study from MIT’s Trancik Lab found that today’s lowest-cost electric vehicles meet the daily range requirements of 87% of cars on the road, even if they can only recharge once a day. And the technology is quickly advancing; Chevrolet and Tesla will soon release electric cars with 200 mile-per-charge range and before-rebate prices under $40,000. Solar and wind energy prices continue to fall rapidly as well, as we progress toward a “renewable energy revolution.”

Click here to read the rest



from Skeptical Science http://ift.tt/2cRlsoI


IPCC special report to scrutinise ‘feasibility’ of 1.5C climate goal

This is a re-post from Carbon Brief by Roz Pidcock

The head of the United Nation’s climate body has called for a thorough assessment of the feasibility of the international goal to limit warming to 1.5C.

Dr Hoesung Lee, chair of the Intergovernmental Panel on Climate Change (IPCC), told delegates at a meeting in Geneva, which is designed to flesh out the contents of a special report on 1.5C, that they bore a “great responsibility” in making sure it meets the expectations of the international climate community.

To be policy-relevant, the report will need to spell out what’s to be gained by limiting warming to 1.5C, as well as the practical steps needed to get there within sustainability and poverty eradication goals.

More than ever, urged Lee, the report must be easily understandable for a non-scientific audience. The IPCC has come under fire in the past over what some have called its “increasingly unreadable” reports.

Feasibility

In between the main “assessment reports” every five or six years, the IPCC publishes shorter “special reports” on specific topics. Past ones have included extreme weather and renewable energy.

The IPCC was “invited” by the United Nations Framework Convention on Climate Change (UNFCCC) to do a special report on 1.5C after the Paris Agreement codified a goal to limit global temperature rise to “well below 2C” and to “pursue efforts towards 1.5C”.

The aim for this week’s meeting in Geneva is, in theory, simple: to decide on a title for the report; come up with chapter headings; and write a few bullet points summarising what the report will cover.

On day two of three, Carbon Brief understands six “themes” have emerged as contenders. Judging by proceedings so far, it seems likely that the feasibility of the 1.5C goal will feature highly on that list.

Referring to a questionnaire sent out to scientists, policymakers and other “interested parties” ahead of the scoping meeting to ask what they thought the 1.5C report should cover, Lee told the conference:

“One notion that runs through all this, is feasibility. How feasible is it to limit warming to 1.5C? How feasible is it to develop the technologies that will get us there?…We must analyse policy measures in terms of feasibility.”

The explicit mention of 1.5C in the Paris Agreement caught the scientific community somewhat off-guard, said Elena Manaenkova, incoming deputy secretary-general of the World Meteorological Organization.

Speaking in Geneva yesterday, she told delegates she felt “proud, but also somewhat concerned” about the outcome of the Paris talks. She said:

“I was there. I know the reason why it was done…[P]arties were keen to do even better, to go faster, to go even further…The word ‘feasibility’ is not in the Paris Agreement, is not in the decision. But that’s really what it is [about].”

Overshoot

Dr Andrew King, a researcher in climate extremes at the University of Melbourne, echoes the call for a rational discussion about the way ahead, now that the dust has settled after Paris. The question of what it would take to achieve the 1.5C goal has been largely sidestepped so far, he tells Carbon Brief:

“I think one unintended outcome of the Paris Agreement was that it made the public think limiting warming to 1.5C is possible with only marginally stronger policy from government on reducing emissions and this is simply not the case.”

Carbon Countdown: How many years of current emissions would use up the IPCC’s carbon budgets for different levels of warming? Infographic by Rosamund Pearce for Carbon Brief.
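
For readers who want the gist of the arithmetic behind a “carbon countdown”, here is a minimal sketch; the budget and emissions figures are round illustrative numbers, not the values used in the infographic.

# Back-of-envelope "carbon countdown": years of current emissions left
# before a carbon budget is used up. The budget and emission figures below
# are round illustrative assumptions, not the infographic's values.

def years_remaining(budget_gt_co2, annual_emissions_gt_co2):
    """Years until the budget is exhausted at constant emissions."""
    return budget_gt_co2 / annual_emissions_gt_co2

annual_emissions = 40.0   # assumed GtCO2 per year, roughly current levels
budget_1p5c = 200.0       # assumed remaining GtCO2 for a 1.5C-consistent budget
budget_2c = 800.0         # assumed remaining GtCO2 for a 2C-consistent budget

print(f"1.5C budget: ~{years_remaining(budget_1p5c, annual_emissions):.0f} years")
print(f"2C budget:   ~{years_remaining(budget_2c, annual_emissions):.0f} years")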

The reality is that staying under the 1.5C threshold is now nigh-on impossible, says King. Meeting the 1.5C target now means overshooting and coming back down using negative emissions technologies that “suck” carbon dioxide out of the air. The report will need to be explicit about this, he says.

King is cautious about overstating the world’s ability to meet the 1.5C goal, given that no single technology yet exists approaching the scale that would be required. He tells Carbon Brief:

“We will need negative emissions on a large-scale and for a long period of time to bring global temperatures back down to 1.5C. This isn’t possible with current technologies.”

Earlier this year, Carbon Brief published a series of articles on negative emissions, including a close up on the most talked-about option – Bioenergy with Carbon Capture and Storage (BECCS) – and a survey of which technologies climate experts think hold the most potential.
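
To see why the negative-emissions requirement King describes is so large, here is a toy overshoot calculation; the budget and the emissions pathway are invented assumptions, not results from any study cited here.

# Toy illustration of "overshoot and return": if cumulative emissions
# exceed a budget, the excess must eventually be removed by negative
# emissions. The pathway and budget here are invented for illustration.

def overshoot(budget_gt, annual_emissions_path):
    """GtCO2 that would need to be removed to get back within the budget."""
    cumulative = sum(annual_emissions_path)
    return max(0.0, cumulative - budget_gt)

# Assumed: a 200 GtCO2 budget and emissions declining linearly
# from 40 GtCO2/yr to zero over 30 years.
pathway = [40.0 * (1 - year / 30) for year in range(30)]
excess = overshoot(200.0, pathway)
print(f"Cumulative emissions: {sum(pathway):.0f} GtCO2")
print(f"Negative emissions needed: {excess:.0f} GtCO2")

Even this rapid, idealised phase-out overshoots the assumed budget by hundreds of gigatonnes, which is the scale problem King is pointing to.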

‘A great responsibility’

Another point on which the special report must be very clear is the difference between impacts at 1.5C compared to 2C, noted Thelma Krug, chair of the scientific steering committee for the special report.

The first study to compare the consequences at both temperatures found that an extra 0.5C could see global sea levels rise 10cm more by 2100, and would also be “likely to be decisive for the future of coral reefs”.

King tells Carbon Brief:

“We need to know more about the benefits of limiting warming to 1.5C. If scientists can demonstrate to policymakers that we would see significantly fewer and less intense extreme weather events by putting the brakes on our emissions then it might lead to the necessary action to protect society and the environment from the worst outcomes of climate change.”

Infographic: How do the impacts of 1.5C of warming compare to 2C of warming? By Rosamund Pearce for Carbon Brief.

The timing of the 1.5C special report is critical, said Lee yesterday. Due for delivery in September 2018, the IPCC’s aim is that the report should be “in time for” the UNFCCC’s “facilitative dialogue” scheduled that year.

This will be the first informal review under the global stocktake – a process that will enable countries to assess progress towards meeting the long-term goals set out under the Paris Agreement.

Expectations will be high, Lee told delegates yesterday:

“You can be sure that the report, when it is available in two years’ time…will attract enormous attention. So you have a great responsibility.”

Any scientist wishing their research to be included in the special report on 1.5C will need to submit it to a peer-reviewed journal by October 2017, and have it accepted for publication by April 2018, according to the IPCC’s timeline.

The scientific community is already mobilising behind this tight deadline. An international conference at Oxford University in September will see scientists, policymakers, businesses and civil society gather to discuss the challenges of meeting the 1.5C goal, which the organisers say “caught the world by surprise”.

Clearer communication

More than ever, the IPCC should strive to communicate the special report on 1.5C as clearly and accessibly as possible, Lee told the conference yesterday.

Given the primary audience will be non-specialists, the authors should think from the outset about how FAQs (Frequently Asked Questions) and graphics could be used to best effect, he said.

“The special report on 1.5C is not intended to replicate a comprehensive IPCC regular assessment report. It should be focused on the matter at hand.”

The importance of the 1.5C topic calls for a different approach to previous IPCC reports, says King. He tells Carbon Brief:

“The report will fail to have much effect if the findings aren’t communicated well to policymakers and the public. This could be seen as a failing of the climate science community in the past. It has led to much weaker action on reducing climate change than is needed; this report needs to change this.”

A couple of recently published papers might give the authors some food for thought on this point.

The first study looks at how the process by which governments approve the IPCC’s Summaries for Policymakers (SPMs) affects their “readability”. Of the eight examples the study considers, all got longer during the government review stage. On average, they expanded by 30% or 1,500-2,000 words. The review process improved “readability” in half of cases, though all eight scored low for “storytelling”.
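
For a sense of how “readability” can be quantified at all, here is a rough sketch using the standard Flesch Reading Ease formula with a crude syllable estimate; the sample sentences are invented, and this is not necessarily the scoring method used in the study.

# Rough readability sketch: Flesch Reading Ease with a crude syllable
# counter. Higher scores mean easier reading. Sample text is invented.
import re

def count_syllables(word):
    """Very rough syllable count: runs of vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

plain = "Warming will continue. Cutting emissions quickly reduces the risk."
dense = ("Anthropogenic radiative forcing trajectories, conditional on "
         "representative concentration pathways, modulate multidecadal "
         "temperature response uncertainty.")

print(f"Plain text score: {flesch_reading_ease(plain):.0f}")
print(f"Dense text score: {flesch_reading_ease(dense):.0f}")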

A second paper explores the power of visuals for communicating climate science to non-specialists, and highlights where the IPCC may be falling short. Giving the examples below from the IPCC’s third and fourth assessment reports, the paper notes:

“A feeling of confusion among non-climate students is certainly not congruent with positive engagement yet this emotional state was frequently reported for SPM visuals.”

Images and infographics can be powerful, but only if the trade-off between scientific credibility and ease of understanding is carefully handled, the paper concludes.

Four examples of visuals used in the IPCC’s third and fourth assessment reports. Source: McMahon et al. (2016)

With all this in mind, the scientists will leave the Geneva conference on Wednesday and prepare an outline for the 1.5C report based on their discussions over the previous three days.

They will submit the proposed outline to the IPCC panel at its next meeting in Bangkok in October. If it meets the panel’s expectations, the panel will accept it and the process moves forward. If it falls short, the panel can request changes. The discussions in Geneva are, therefore, unlikely to be the last word.



from Skeptical Science http://ift.tt/2cRl1uv


The Madhouse Effect of climate denial

A new book by Michael Mann and Tom Toles takes a fresh look at the effects humans are having on our climate and the additional impacts on our politics. While there have been countless books about climate change over the past two decades, this one, entitled The Madhouse Effect, distinguishes itself by its clear and straightforward science mixed with clever and sometimes comedic presentation.

In approximately 150 pages, this book deals with the basic science and with the denial industry, which has lost the battle in the scientific arena and is working feverishly to confuse the public. The authors also cover potential solutions to halt or slow our changing climate. Perhaps most importantly, the book gives individual guidance: what can we do, as individuals, to help the Earth heal from the real and present harm of climate change?

To start the book, the authors discuss how the scientific method works, the importance of the precautionary principle, and how delaying action has caused us to lose precious time in this global race to halt climate change. And all of this is done in only 13 pages!

Next, the book dives briefly into the basic science of the greenhouse effect. Readers of this column know that the science of global warming is very well established with decades of research. But some people don’t realize that this research originated in the early 1800s with scientists such as Joseph Fourier. The book takes us on a short tour of history. Moving beyond these early works that focused exclusively on global temperatures, the authors come to expected impacts. They explain that a warming world, for instance, can be both drier and wetter!

This seeming paradox is a result of our expectation that areas which are currently wet will become wetter, with an increase in the heaviest downpours. The reason is simple: a warmer atmosphere holds more water vapor. A great example is this year’s flooding in the Southeast United States. Other areas, especially those that are already dry, will become drier because evaporation speeds up; if there is little moisture available to enter the atmosphere (i.e., in an arid region), the result is more drying. Using their candid, humorous, and clear language, the authors also discuss impacts on storms, hurricanes, rising oceans, and so on.
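
The underlying physics can be made quantitative with the Clausius-Clapeyron relation: saturation vapor pressure, and hence the atmosphere’s capacity to hold water vapor, rises by roughly 7% per degree Celsius of warming. The sketch below uses the common Magnus approximation purely for illustration; it is not drawn from the book.

# Clausius-Clapeyron in practice: saturation vapor pressure rises roughly
# 7% per degree C near typical surface temperatures. This uses the Magnus
# approximation with commonly quoted coefficients, for illustration only.
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) via the Magnus formula."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

for t in (15.0, 16.0, 17.0):
    print(f"{t:.0f} C: {saturation_vapor_pressure(t):.1f} hPa")

increase = saturation_vapor_pressure(16.0) / saturation_vapor_pressure(15.0) - 1
print(f"Increase per extra degree near 15 C: {increase:.1%}")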

With the basic science covered, the authors quickly move into reasons why readers should care. Many of the big risks climate change presents are covered, including security, water availability, food production, land use and availability, human health, and risks to the world’s economies and ecosystems. Simply put, these are not future, abstract risks. They are current risks that are becoming more severe and will affect all of us.

The chapter I found most interesting covers the stages of climate denial. The first stage is “it’s not happening”: outright denial of the reality of climate change. The authors illustrate this stage with an example from the climate contrarians Roy Spencer and John Christy, who claimed that the Earth wasn’t warming.

Technically, Christy and Spencer claimed that a part of the Earth’s atmosphere was cooling – in contrast to mountains of contrary evidence from around the globe. It was discovered that these contrarians had made some elementary errors in their calculations. When those calculations were corrected, their results fell into line with other research. I’ve written many times about the many errors from contrarian scientists – certainly not limited to Christy and Spencer. Within the scientific community, the contrarians have made so many technical errors that their work is no longer taken seriously by many scientists. But this hasn’t stopped the denial industry from showcasing their work, even though it is discredited. 

The next stages of denial involve admissions that climate change is happening, but that it is natural, that it will self-correct, or that it will benefit us. The tale of mistakes made in the science underlying these arguments is told in the book, and it would be humorous if it weren’t so serious. The tales involve cherry-picked data, resigning editors, and other missteps. For readers who may be concerned about getting lost in the weeds, don’t worry: the authors do an excellent job hitting the high points in an intelligible manner, so readers don’t need a PhD in climate science to see the patterns.

Next, the book covers the denial industry, including some of the key groups and persons responsible for a systematic rejection of the prevailing scientific view, or at least for insidious questioning designed to convince the public that there is no consensus. This effort, which denied not only the causes and effects of climate change but also the health risks of tobacco, was coupled with funding to scientists, “petitions” designed to appear as if they originated from the National Academy of Sciences, and an interconnected network of funded think tanks.

One result of all of this is that actual scientists who spend their lives studying the Earth’s climate have been attacked. One of the authors of this book (Michael Mann) is perhaps in the world’s best position to tell this tale because he personally knows many of the scientists who have been attacked professionally and personally, including the late Stephen Schneider, Ben Santer, Naomi Oreskes, and himself. 

I have had the fortune of knowing these scientists and I can attest that they (and others) go into this field because they have an intense curiosity.

Click here to read the rest



from Skeptical Science http://ift.tt/2cRkXL6
