Snowball Earth due to plate tectonics?

Artist’s concept of a Snowball Earth, via MIT/Space.com.

About 700 million years ago, the Earth is known to have undergone a period of climate change that sent the planet into a deep freeze. The freeze, lasting millions of years, is what geologists now refer to as the Snowball Earth period. Several theories have been proposed to explain what might have triggered this dramatic cool-down, which occurred during a geological era called the Neoproterozoic. A new study suggests that those major climate changes can be linked to one thing: the advent of plate tectonics.

The theory of plate tectonics was formulated in the 1960s. It states that the Earth’s crust and upper mantle – a layer called the lithosphere – is broken into moving pieces, or plates. Today’s scientists measure the movement of these plates and know they move very slowly, about as fast as your fingernails and hair grow. Plate tectonics on Earth – the grinding, slipping and sliding of land plates with respect to each other – produces earthquakes and volcanoes and, over time, great geologic features like mountain ranges.

The study researchers – Robert Stern at the University of Texas at Dallas and Nathaniel Miller of the University of Texas at Austin – said they expect their hypothesis to generate controversy. That’s because geologists usually place the start of plate tectonics at least 3 billion years ago, while their new hypothesis puts the process during the Neoproterozoic, about 542 million to 1 billion years ago.

According to Miller and Stern, there are a variety of traces in the geologic record that could be consistent with plate tectonics not getting started until the Neoproterozoic – around the same time as the era of the Snowball Earth. The researchers examined a suite of published scientific data on the geological activity during the Neoproterozoic and found many links between plate tectonics and a cooling world.

Their study, published in the April 2018 edition of the journal Terra Nova, lists 22 proposed ways plate tectonic activity could have brought about the global cooling that caused the Earth to be covered pole-to-pole with ice. The researchers mention explosive volcanoes cooling the planet by releasing sulfur into the atmosphere; the shifting of the plates changing the planet’s rotational axis; and increased rock weathering pulling CO2 — a greenhouse gas — out of the atmosphere and back into the Earth. Stern said:

This climate crisis could have been caused by a number of proximal causes, but the overall great cause was this revolution in Earth’s tectonic style.

The authors note that the geologic interval preceding the Neoproterozoic shows a lack of geological activity — a feature that has earned it the nickname the Boring Billion.

In the study, they considered possible non-tectonic reasons for cooling that came from space — in other words, causes that don’t agree with their theory. These range from asteroid impacts to the collapse of ice rings that could have formed around Earth. If these were the cause of Snowball Earth, Stern and Miller’s theory falls apart — but the researchers say there’s not much evidence to support these triggers.

As usual, the authors say, more research is needed to test the link between the dawn of plate tectonics and Snowball Earth. They hope that this research will lead other geoscientists to consider the evidence and test the hypothesis. Stern said:

Revolutions in our field don’t happen often or very easily. You plant ideas and then you go and look for more data that either supports them or refutes them.

During the Snowball Earth phase, our world might have resembled this one, Enceladus, a snow-and-ice-covered moon of Saturn. Image via NASA/JPL-Caltech/Space Science Institute.

Bottom line: A new study suggests that the dawn of plate tectonics could have been responsible for Snowball Earth.

Source: Did the transition to plate tectonics cause Neoproterozoic Snowball Earth?

Via University of Texas at Austin



from EarthSky https://ift.tt/2K4zz7F

Earth and Jupiter are closest May 10

Astronomer Damian Peach captured this view of Jupiter on February 25, 2017, using a 1 meter-diameter Cassegrain telescope in Chile.

On May 10, 2018, the giant planet Jupiter will be closest to Earth for all of 2018.

Yet the night of May 8-9 was Jupiter’s opposition, when Earth flew between Jupiter and the sun, placing Jupiter opposite the sun in our sky. You’d think Jupiter would be closest to Earth on the day of opposition. But it isn’t. Why not?

Opposition happens when Earth flies between an outer planet, like Jupiter, and the sun. Why aren’t Earth and Jupiter closest on the day of opposition? Illustration via Heavens Above.

In 2018, Jupiter’s opposition comes about a day-and-a-half before its closest point to Earth:

Jupiter’s opposition May 9 at 1 UTC (May 8 at 8 p.m. CDT).

Jupiter closest May 10 at 12 UTC (7 a.m. CDT).

At its closest, Jupiter comes to within 409 million miles (658 million km).

Why isn’t Jupiter closest on the day of opposition? It would be, if the orbits of Earth and Jupiter were perfect circles and if our two worlds orbited in exactly the same plane. Both Earth and Jupiter have orbits that are very nearly circular, and they go around the sun on almost the same plane – but not quite.

Consider that, because Jupiter’s orbit is elliptical, not circular, its distance from the sun varies. Likewise, Earth’s orbit is elliptical, not circular. Our distance from the sun varies, too.

This image shows an orbit that’s vastly more elliptical than either Earth’s or Jupiter’s. Still, you get the idea. Perihelion = closest to sun. Aphelion = farthest from sun. Image via MadManTalks.

Jupiter’s orbit takes 11.9 Earth-years. Earth’s orbit takes one year.

Right now, we’re headed toward a perihelion of Jupiter. In other words, every single day, Jupiter is closer to the sun than it was the day before. Are you beginning to see how it can be closer to Earth after we go between it and the sun?

Not yet? Keep reading …

View larger. | Jupiter at its April 7, 2017 opposition with the Great Red Spot and moons Io, Europa, and Ganymede (L to R). Photo by Rob Pettengill in Austin, Texas.

Jupiter passed aphelion – its farthest point from the sun in its orbit – on February 16, 2017. Jupiter will reach perihelion – its closest point – on January 20, 2023. So Jupiter is getting closer to the sun each day. And what is Earth doing?

Earth’s perihelion happened in early January. So Earth is now getting a bit farther from the sun each day.

Jupiter is now getting closer to the sun – bit by bit, closer and closer – every earthly day. And Earth is getting farther from the sun – bit by bit, farther and farther – every day.

And that’s how Jupiter and Earth can be closest for 2018 about one-and-a-half days after our planet’s pass between Jupiter and the sun.
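If you like to tinker, you can see this numerically with a back-of-the-envelope model. The Python sketch below (my illustration, definitely not a real ephemeris) treats Earth and Jupiter as coplanar ellipses with rounded orbital elements and made-up starting positions, finds the day their heliocentric longitudes line up (opposition), then scans nearby days for the minimum Earth–Jupiter distance. Because the elements and starting anomalies are assumptions, the printed day numbers won’t match the real 2018 calendar; the point is only that the distance minimum generally falls a day or so away from the alignment rather than exactly on it.

```python
# Toy two-planet model of opposition vs. closest approach.
# NOT an ephemeris: orbital elements are rounded and the starting mean
# anomalies (M0) are made-up assumptions, chosen only for illustration.
import math

def kepler_E(M, e, tol=1e-10):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E."""
    E = M
    while True:
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            return E

def position(a, e, period_days, M0, t_days):
    """Heliocentric x, y (in AU), with each orbit's perihelion on the +x axis."""
    M = (M0 + 2.0 * math.pi * t_days / period_days) % (2.0 * math.pi)
    E = kepler_E(M, e)
    return a * (math.cos(E) - e), a * math.sqrt(1.0 - e * e) * math.sin(E)

# (semi-major axis in AU, eccentricity, period in days, assumed mean anomaly at t=0)
EARTH   = (1.000, 0.0167,  365.25, 2.2)
JUPITER = (5.204, 0.0489, 4332.6,  3.8)

def separation(t):
    ex, ey = position(*EARTH, t)
    jx, jy = position(*JUPITER, t)
    return math.hypot(jx - ex, jy - ey)

def alignment(t):
    """How far apart the two heliocentric longitudes are (0 = opposition)."""
    ex, ey = position(*EARTH, t)
    jx, jy = position(*JUPITER, t)
    d = (math.atan2(jy, jx) - math.atan2(ey, ex)) % (2.0 * math.pi)
    return min(d, 2.0 * math.pi - d)

t_opp = min(range(0, 500), key=alignment)                   # day of opposition
t_min = min(range(t_opp - 15, t_opp + 16), key=separation)  # day of closest approach

print(f"opposition on day {t_opp}: {separation(t_opp):.4f} AU")
print(f"closest approach on day {t_min}: {separation(t_min):.4f} AU")
print(f"offset: {t_min - t_opp:+d} days")
```

With day-level resolution and invented starting positions, the offset this toy prints can be anywhere from zero to a couple of days, and can even be negative; in the real May 2018 geometry it works out to about a day and a half, as described above.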

Understand? If not, check out the two links below … or let’s talk in the comments below …

Geocentric ephemeris for Jupiter: 2018

Geocentric ephemeris for Sun: 2018

Another artist’s concept of Jupiter and Earth at opposition, when Earth passes between the sun and Jupiter.

Bottom line: You’d think Jupiter would be closest to Earth on the day we pass between it and the sun. We did that during the night of May 8, 2018 for clocks in North America. Yet our 2 worlds are closest on May 10. Why?



from EarthSky https://ift.tt/2K7r7V8

Prostate cancer diagnosis: how scientists are working to get it right

Doctor discussing a prostate cancer scan with patient

Almost 48,000 men are diagnosed with prostate cancer in the UK every year. But questions are being asked of the tests used to diagnose these men, and how they might be improved.

The tests used today can be painful, invasive and, unfortunately, not that good at telling doctors for sure which cases need urgent attention, or which can be watched over time. This challenge is most apparent when looking at the results of screening studies in men without prostate cancer symptoms using the PSA blood test.

But it’s an issue that runs right through the process of diagnosing prostate cancer. And one that needs fixing.

“With prostate cancer, we’ve got the problem that some aggressive cancers are being missed, while lots of harmless cancers are being treated unnecessarily,” says Professor Malcolm Mason, a Cancer Research UK prostate cancer expert.

There’s clearly room for improvement, but what would a good system look like?

“It’s not about picking up everything,” says Professor Mark Emberton, a prostate cancer specialist from University College London. “It’s about picking up the right cancers – the ones that will cause harm and need treating. And avoiding the cancers that won’t.”

That’s what researchers are working towards. And they’re starting by improving existing tests that look for prostate cancer.

Specialist MRI – getting up close and personal with prostate cancer

An important step has been getting eyes on the tumour. “For a long time, we diagnosed and treated prostate cancer without ever properly seeing it,” explains Emberton. “This all changed with MRI.”

This is not just any old MRI. The big interest has been in a special type of imaging called multiparametric (or mp) MRI. It combines three or four different scans, which can help radiologists build a clearer picture of what’s going on in the prostate.

And results suggest it can steer diagnosis in the right direction – by ruling out the need for, or helping guide, follow-up biopsies.

In two studies that involved over 1000 men, scientists found that mpMRI can prevent unnecessary prostate biopsies. The latest results showed that 1 in 4 men with an abnormal PSA test or rectal exam didn’t need a biopsy, as the scan showed no abnormalities.

And for men who did need a biopsy, the scan results helped guide doctors taking these tissue samples. This made it less invasive and more likely to pick up abnormal cells than a standard biopsy.

Specialist MRI scans aren’t a standard part of prostate cancer diagnosis yet. They’re being reviewed by the National Institute for Health and Care Excellence (NICE), which will decide whether or not to recommend the scans as part of standard NHS prostate cancer diagnosis.

And as with any new technique, there’s work to be done to ensure that the way the scans are run and analysed is consistent across the UK. Prostate Cancer UK is working with NHS England and hospitals to address issues around access to MRI scanners and specialist training. And we’re campaigning so there are enough NHS staff in place to diagnose cancer, including the radiologists who interpret scans.

Putting Gleason grade ‘to the test’

Specialist MRI to guide biopsies looks like a big step forward. But what if it could replace biopsies altogether? That’s what Emberton and his team are aiming to find out, in a new study funded by the Medical Research Council and Cancer Research UK.

They will combine mpMRI with potential new diagnostic tests – such as looking at DNA shed by cancer cells into the blood – to see if they can predict prostate cancer progression better than the current system: Gleason grade.

“Gleason grade has been the mainstay in prostate cancer diagnosis for many decades,” says Emberton. “But the time may have come to challenge it by combining imaging with an understanding of the genetic basis of prostate cancer.”

To do this, they’ll recruit 1000 men with abnormal results following a specialised prostate MRI scan. As part of the study, the men will have an MRI-guided biopsy, as well as blood and urine analysis. The team will then monitor the men using electronic NHS records until they die.

“We’ll be able to track what treatment men are having, how successful it is and what happens to their cancer over time,” says Emberton. “And link that back to the information we got during diagnosis.”

At the end of the study, they hope to have a new set of tests that not only diagnose prostate cancer, but also help to guide treatment. This would mean that in the future, men could be diagnosed without the need for an invasive biopsy.

It’s an ambitious study, and it will be a while before we have results. But, according to Professor Mason, the length of the study is what sets it apart.

“Most studies stop when they get a diagnosis of ‘clinically significant prostate cancer’, but the issue is we don’t know what that actually means. The fact that this project will follow people up and look at survival is a huge strength,” he says.

“We’ll have to wait a while to get answers, but it will be worth the investment.”

What else is happening?

MRI isn’t the only focus for prostate cancer diagnosis: scientists are also testing an ultrasound process called shear wave elastography. This sci-fi-sounding technique measures how elastic tissue is. And as tumours are stiffer (or less elastic) than normal prostate tissue, it could provide a way to get information on prostate cancer.

Scientists have tested the technique in 200 men who were about to have surgery for prostate cancer. They found that the test could detect prostate tumours, and the results broadly matched Gleason scores. They’ll now need to put shear wave elastography to the test in men who haven’t already been diagnosed.

As well as improving diagnosis, scientists are also working to identify men who might be at a higher risk of developing prostate cancer. And then work out what to do with this information. Research shows that black men and men with faults in genes called BRCA are more likely to develop prostate cancer. But Mason thinks there’s more to learn.

“We have some clues as to who might be more likely to develop prostate cancer, but we need to refine it more. We should look for more detailed genetic signatures and markers that could help us detect risk.”

For Professor Rosalind Eeles, based at The Institute of Cancer Research, London, that means focusing on faulty genes. Eeles has spearheaded an international collaboration to give researchers around the world access to genetic samples from men with prostate cancer. It could help scientists identify faulty genes more quickly, which doesn’t just help predict who might be at risk, it could also open the door to new treatments.

Beyond diagnosis

Getting diagnosis right is a major hurdle in boosting prostate cancer survival. But without effective treatments, it would all be for nothing.

Thankfully, progress is being made here too.

We’re supporting a trial testing combinations of drugs for men with advanced prostate cancer. STAMPEDE, led by Professor Nick James at the University of Birmingham, has been running for 13 years, and has already changed how advanced prostate cancer is treated. It continues to test new drug combinations, with the most recent results showing that adding the targeted drug abiraterone (Zytiga) to standard hormone treatment improves survival by 40%.

And for men with prostate cancer that hasn’t spread, incremental improvements in how radiotherapy is given are helping reduce side effects and the number of hospital visits.

Experimental treatments like high-intensity focused ultrasound (HIFU) could also make waves for prostate cancer treatment. The ultrasound technique aims to kill cancer cells using high-intensity sound waves, and initial trial results suggest it may work as well as surgery or radiotherapy. Scientists are measuring the long-term benefits of the treatment.

Scientists are also testing if an experimental laser treatment can help. The futuristic approach, called vascular-targeted photodynamic therapy, was found to be safe in early trials. But more research is needed before we’ll know if the treatment can help save lives.

Bringing it home

The goal is clear: to make prostate cancer diagnosis smarter and more reliable.

There isn’t a quick fix, but that’s what scientists are aiming for, using new techniques to build a clearer picture of how prostate cancer progresses. And if they can predict how prostate cancer behaves, it might make treatment more personal too.

“We’re working towards a system that would allow us to predict how prostate cancer will progress and pick the right treatment for each person,” says Emberton.

Katie 



from Cancer Research UK – Science blog https://ift.tt/2I7TDFN

Hawaii’s erupting Kilauea volcano

NASA’s Terra satellite acquired this image on May 6, 2018. Massive sulfur dioxide plumes from Kilauea volcano are shown here in yellow and green. A smaller, but thicker, sulfur dioxide gas plume can be seen coming from Kilauea. The prevailing trade winds blow the plumes to the southwest, out over the ocean. Image via NASA/ METI/ AIST/ Japan Space Systems/ Japan ASTER Science Team.

Lava and sulfur dioxide gas are continuing to spew from Kilauea volcano on Hawaii’s Big Island, where flows of lava across a rural neighborhood have caused evacuations. By late Tuesday (May 8, 2018), some 104 acres were covered by lava. Hawaii Civil Defense said that 35 structures — including at least 26 homes — had been destroyed.

A total of 12 volcanic fissures had formed as of late Tuesday. Residents were voicing frustration and anxiety against a backdrop of flowing lava and hazardous fumes.

May 4, 2018 Third Leilani Eruption from Mick Kalber on Vimeo.

Videographer Mick Kalber was on the ground in Leilani Estates on May 5, where he captured the dramatic video above and told this story:

This is a killer video! And may be one of the last ground level shots I’ll be able to get before I have to evacuate … After all the lava drained out of the Pu’u ‘O’o Vent (the main vent of the 35 year long current eruption) last Monday [April 30], the contents moved down the East rift zone, causing hundreds of earthquakes as far away as Kapoho, some 15 miles downslope. All week, a new eruption had been forecast … cracks appeared on roadways, as the Earth began to swell. And then, on Thursday afternoon [May 3], Pele (the Volcano Goddess) made her appearance in the lower part of Leilani Estates Subdivision. At first, fountains of lava shot up into the air … but within a few hours, she had settled down into a lava flow that now threatens dozens of nearby homes, and a geothermal power plant just a quarter mile downslope. That flow has now stopped… but several more continue to pop out nearby. I live in the subdivision, and we are continually experiencing rolling earthquakes … it ain’t over till it’s over!

Meanwhile, the video below, also from Mick Kalber, shows the view from the air:

May 6, 2018 HUGE Fissure Eruption from Mick Kalber on Vimeo.

The eruptive activity began on April 30, 2018, when the floor of Kilauea’s crater began to collapse. Earthquakes followed, including one that measured magnitude +6.9, a very strong earthquake. All the while, lava was being pushed into new underground areas that eventually broke through the ground in such areas as the Leilani Estates (population 1,560 at the 2010 census) near the town of Pahoa, Hawaii.

Evacuations in Leilani Estates began on May 4.

The USGS is doing a good job following the volcano on Twitter.

Follow @USGSVolcanoes on Twitter.

Kilauea is one of the world’s most active volcanoes, and it’s the youngest and southeastern-most volcano on Hawaii’s Big Island. Eruptive activity along the East Rift Zone has been continuous since 1983.

Bottom line: Images from the May 2018 eruption of Kilauea volcano on Hawaii’s Big Island.

Read more from NASA.



from EarthSky https://ift.tt/2rvaxad

U.S. Army, Uber Sign Research Agreement

Uber and Army Research Lab established today an ongoing plan to partner around developing and testing the vehicles used in Uber's proposed urban aviation ride-share network.

from https://ift.tt/2rvrMbk

The 1970s Global Cooling Zombie Myth and the Tricks Some People Use to Keep it Alive, Part II

In Part I of this look back at 1970s climate science I reviewed the findings of Peterson, Connolley, and Fleck's seminal 2008 survey of 1970s peer-reviewed literature (hereafter referred to as PCF08) which found no 1970s "consensus" about a future global cooling/ice age. I also looked at a "skeptic" critique of this paper from the blogsite No Tricks Zone, penned by Kenneth Richard (hereafter referred to as NTZ), and showed some of the errors and fallacies he used to distort 1970s science. In this post I'll take a closer look at some of the papers used by NTZ to claim that there was a global cooling "consensus" in the 1970s.


The primary critique by NTZ is given in a blog post which highlights 35 "sample global cooling/low CO2 climate influence papers" from his full list of 285 papers. Like the full list of 285 papers, these samples are not given in any apparent order, not by date or by author. I rearranged these by date of publication in order to see the progression of the science as the various threads of climate research (see Part I) developed throughout the period. Below is a screenshot of the first page of a spreadsheet (click on image for pdf) of the 35 sample papers.

NTZ 35 Papers page 1 pdf

The first thing to notice is that not all of these samples are peer-reviewed scientific papers. Two of these are RAND Corporation documents (Fletcher, 1969 and Libby, 1970), one is a book review (Post, 1979), and one is from a popular science magazine (Douglas, 1975). The Douglas article was mentioned in PCF08 in their "Popular Literature of the Era" sidebar but not included in their survey. Two other sample papers are perhaps "borderline" between grey and peer-reviewed literature: two Master's theses (Cimorelli & House, 1974 and Magill, 1980). And there are a few others I'm not sure about.

Contrast this with PCF08's survey, which excluded anything from the "grey literature" and focused exclusively on the peer-reviewed literature.1 The best place to find out what the scientists of the time were saying about the future climate trajectory is in the peer-reviewed literature. Things can get muddy once you include documents from the popular press. Yes, these other sources may offer insight into what the "sense of the times" was, but they can also misrepresent or distort what scientists thought at the time. NTZ expands the goal posts to include some of this grey literature and thus gathers more "papers" for his "global cooling consensus".


A detailed examination of NTZ's sample list of 35 papers is beyond the scope of this blog post (see pdf of spreadsheet above for further details on papers not discussed in this post). Instead, I will focus on some examples to illustrate the ways NTZ misrepresents 1970s climate science.

NTZ Subsections

The table above shows the sub-sections which NTZ used to organize his pile of papers. In the sample list, the papers are in the two most-numerous categories: "Cooling Since 1940..." and "Dubious Human Influence...", and there is one paper from the sub-section, "Uncertainties...". By far, most papers in the full NTZ list (as well as the sample list) are found in the sub-section: "Cooling Since 1940, Forecasts for Continued Cooling/Ice Age". As pointed out in Part I, NTZ shifts the focus from PCF08's look at future climate projections to a view of what scientists were saying about the recent mid-century cooling trend.

NTZ's Cooling Category

Benton 1970, Carbon Dioxide and its Role in Climate Change

Benton 1970, a short two-page paper, succinctly lays out some of the main threads of 1970s climate science. Benton notes the rise of global temperatures at the beginning of the 20th century, followed by the mid-century cooling trend. He discusses CO2's impact on climate and then aerosols' (both volcanic and human-made) possible cooling and/or warming impacts. PCF08 included this paper in their survey in their "warming" category, perhaps because of this quote:

Recent numerical studies have indicated that a 10% increase in carbon dioxide should result, on average, in a temperature increase of about 0.3°C at the earth's surface. The present rate of increase of 0.7 ppm per year would therefore (if extrapolated to 2000 A.D.) result in a warming of about 0.6°C – a very substantial change.

NTZ ignores this clear prediction and only zeros in on what the paper says about mid-century cooling. This cherry picking, or selective quoting, is by far NTZ's most used technique.

Schultz 1972, Holocene Interglacial Migrations of Mammals and Other Vertebrates

Almost any paper which mentions the mid-century cooling, no matter how tangentially, is added to NTZ's pile of papers. Schultz 1972 is about climatic impacts on Holocene mammal migrations. He discusses how armadillos expanded their North American range northwards during the first half of the 20th century, but are now, in 1972, headed back south:

The armadillos, however, appear to have disappeared from their lately acquired northern range, and now seem to be found chiefly south of the central part of Kansas. This change in distribution has taken place during the past 10 yr, when the winters have been longer and colder.

That's it. The main focus of the paper is on trying to see if changes in the ranges of other late Pleistocene and Holocene mammals might also be due to climate changes. It looks at the present armadillo migrations as a possible analog of the past, and says nothing about future climate trajectory.

Kukla 1972, Insolation and Glaciation

One of the few papers that actually forecast a future ice age is Kukla 1972, as quoted by NTZ: "A new glacial insolation regime, expected to last 8000 years, began just recently. Mean global temperatures may eventually drop about 1°C in the next hundred years." This paper may be the only example of a "cooling" paper missed by the PCF08 survey and found by NTZ.

Kukla 1972 is also one of the few papers that deals with that other thread of 1970s climate science: Milankovitch cycles. Kukla showed how past changes in orbital cycles very slightly altered the amount of solar energy hitting the Earth, leading to past glacial and interglacial periods. Kukla rightly noted that the "change in the heat income due to Milankovitch mechanism is so minute that it cannot lead directly to glaciation or deglaciation...Only when multiplied by some efficient feedback mechanism can the insolation trigger climatic change". One of the main feedbacks is from changes in the Earth's albedo. At the start of a glacial period, as more and more ice accumulates in the polar regions, more and more incoming solar radiation is reflected from the growing ice fields, which leads to further cooling.

Besides looking back to past ice ages, Kukla also extrapolated the orbital cycles into the future to forecast continued cooling and a descent into another ice age. But Kukla stressed that his extrapolation was "still today [a] somewhat speculative" estimate.

Ellsaesser 1974, Has Man, Through Increasing Emissions of Particulates, Changed the Climate?

This paper is from a symposium held in 1974 on atmospheric pollution, and so it may not technically be "peer-reviewed". The NTZ quote from this paper mentions that there had been "a flood of papers" dealing with the "exponentially increasing pollution". Ellsaesser notes that "the particulate increases were usually cited as at least contributing to the post 1940 cooling and possibly capable of bringing on another ice age". But a full reading of the paper shows that the entire point of Ellsaesser's paper is to counter that argument. He complains about the environmental alarmism of the day concerning pollution and cooling. And, ironically, he complains that not enough emphasis is given to CO2:

Of the climatic problems raised, the CO2 one is best understood. There is essentially universal agreement that atmospheric CO2 is increasing as a result of the consumption of fossil fuels and that this should enhance the 'greenhouse' effect leading to a warming of the planetary surface. The strongest support for the upward trend in air-borne particulates derives from the failure of observational data to support our understanding of the CO2 effect. Yet no one ever hears the argument that man might consider a deliberate increase in particulates to counter the CO2 effect or alternatively that the CO2 effect is just what is needed to prevent or delay the onset of the next glacial advance which is now imminent according to students of this problem.

Robock 1978, Internally and Externally Caused Climate Change

NTZ put this paper in his "cooling" sub-section, but a better fit would probably be in his  "dubious human influence..." sub-section. Robock used a simple energy balance model to investigate how various forcings, both natural and anthropogenic, may have influenced global temperatures from about the 1880s to the 1960s. For the natural forcings Robock made various runs using different solar forcings and two runs using different volcanic aerosol numbers. For the anthropogenic forcings he used one run each for CO2, aerosols, and "heat". Robock found that the forcing which most closely mirrored the actual temperature observations was volcanic aerosols: "volcanic dust is the only external forcing that produces a model response significantly like the observations".

Robock only modelled each forcing separately, not in combination. But he did note that some of these forcings, working in tandem in the real world, could possibly explain the observed temperature record of the past ~100 years.

What about CO2 and anthro-aerosols? His model showed a slight warming from CO2 and a slight cooling from human pollution, not enough to really matter, and when combined they essentially cancelled each other out. NTZ's quote from the paper highlights this fact: "One could sum the anthropogenic effects for each region, which would show almost no effect in the NH [Northern Hemisphere] and warming in the SH [Southern Hemisphere]." But he ignores the following sentence: "Drawing conclusions from this exercise would not be meaningful, however, due to our lack of understanding of the aerosol effect." Robock also pointed out:

All the effects [of human forcings] almost double every 20 years. They are not of sufficient magnitude to have much effect on the observational records, which end about 1960, but may have a measurable effect in the near future.

The relative magnitudes of the effects may change in the future due to changing human pollution policies. Restrictions on particulate pollution and anticipated measures against sulfate aerosols will lessen the effects of industrial aerosols.

Indeed, this is what actually happened. Clean Air rules lessened particulate/aerosol pollution but did nothing to limit CO2 emissions. The cooling effect of aerosols never materialized but the warming effect of CO2 has steadily risen since Robock's simple model runs.
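For readers wondering what a "simple energy balance model" actually looks like, here is a minimal zero-dimensional sketch in Python. To be clear, this is my illustration, not Robock's model: his was resolved by latitude and driven by reconstructed solar, volcanic, and anthropogenic forcing histories. This toy just shows the basic bookkeeping such models do: absorbed sunlight in, infrared radiation out, an imposed extra forcing, and an ocean heat capacity that sets how quickly temperature responds. Every number in it is a generic, round-figure assumption.

```python
# Minimal zero-dimensional energy balance model (EBM) sketch.
# This is NOT Robock's (1978) model, which was resolved by latitude and
# driven by historical forcing series; all values here are generic,
# round-figure assumptions used only to show how such a model works.

S0     = 1361.0    # solar constant, W/m^2
ALBEDO = 0.30      # planetary albedo (fraction of sunlight reflected)
EPS    = 0.61      # effective emissivity, tuned so equilibrium is ~288 K
SIGMA  = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4
C      = 2.1e8     # heat capacity of a ~50 m ocean mixed layer, J/m^2/K
DT     = 86400.0   # time step of one day, in seconds

def step(T, forcing):
    """Advance global mean temperature T (kelvin) by one day."""
    absorbed = S0 * (1.0 - ALBEDO) / 4.0   # average absorbed sunlight
    emitted  = EPS * SIGMA * T**4          # outgoing infrared
    return T + DT * (absorbed - emitted + forcing) / C

T = 288.0  # start near the observed global mean surface temperature, K
for year in range(1, 101):
    for _ in range(365):
        T = step(T, 1.0)   # hold a constant +1 W/m^2 perturbation
    if year % 20 == 0:
        print(f"year {year:3d}: T = {T:.2f} K")
```

Robock's point that the human forcings were too small to matter by 1960 but "almost double every 20 years" corresponds, in a toy like this, to replacing the constant forcing with one that grows over time.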

Nelson, et al. 1979, Non-Climatic Trends in Divisional and State Mean Temperatures: A Case Study in Indiana2

This paper is a "case study" which looked at the mid-century cooling in Indiana, specifically in the summer months (June, July, and August) since the authors were also interested in any change in climate during the growing season. When scientists in the 1960s-70s compiled data to build their global average temperature series they used state averages of monthly mean temperatures from weather stations around the world. Nelson et al. noted that:

any changes in location of stations and time of observation were tacitly assumed to be random and to have little effect on divisional and state mean temperature records. Schaal and Dale (1977), however, showed that in Indiana a systemic change in the time that observations were taken at cooperative climatological stations--from evening to morning--contributed to the downward trend observed in divisional and state mean temperatures since 1970. [Italics in original.]

Was the mid-century cooling really just an artifact of changes in temperature recording methods which made it seem like the globe was cooling? Nelson et al took a closer look at the Indiana data and made adjustments to correct for any biases. Their corrections got rid of some of the mid-century cooling, but not all of it.

Of course, this is just summer months in Indiana, not a global view of temperature changes. But it shows the care and precision used by scientists who did the early work of building an accurate record of global temperatures. I'm surprised that NTZ included this paper in his collection because usually "skeptics" tend to dislike "temperature adjustments" (see here and here). I guess it is okay as long as those adjustments still show cooling.

Karl, et al. 1984, Decreasing Diurnal Temperature Range in the United States and Canada from 1941 through 1980

The last paper I'll look at from this sub-category is about a changing "diurnal temperature range" or DTR. Here is NTZ's quote:

An appreciable number of nonurban stations in the United States and Canada have been identified with statistically significant (at the 90% level) decreasing trends in the monthly mean diurnal temperature range between 1941-80.

I think NTZ likes this paper because it mentions "decreasing trends" and "temperature" in the first sentence of the paper's abstract. But, I don't think he understands what "decreasing trends in monthly mean diurnal temperature range" means. The DTR is merely the "difference between the maximum and minimum temperature during a 24-hour period" (IPCC). The "decreasing trend" discussed in this paper refers to a decrease in the range between the maximum and minimum daily temperatures.

Diurnal Temperature Range

For example (see figure above), let's say the average monthly max. temp. at some location was 25°C and the min. temp. was 10°C, the range would be 15°C. Now, at some later time, if the max. temp. is 27°C and the min. temp. is 14°C, then the DTR would be 13°C. Hence, the DTR has decreased from 15 to 13°C. Notice in this example the average temperatures went up but the DTR went down. Also notice the min. temps increased slightly more than the max. temps.
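In code, the bookkeeping behind a DTR trend is about as simple as it gets. The snippet below just re-runs the made-up numbers from the example above (they are not data from Karl et al.) to show how the mean temperature can rise while the diurnal range falls:

```python
# Diurnal temperature range (DTR) = daily maximum minus daily minimum.
# The numbers below are the made-up illustration values from the text,
# not observations from Karl et al. (1984).

def dtr(t_max, t_min):
    """Diurnal temperature range, in the same units as the inputs."""
    return t_max - t_min

early = dtr(t_max=25.0, t_min=10.0)  # earlier period: 25°C max, 10°C min
later = dtr(t_max=27.0, t_min=14.0)  # later period: both warmer, minimum more so

mean_change = ((27.0 + 14.0) / 2.0) - ((25.0 + 10.0) / 2.0)
print(f"earlier DTR: {early:.0f}°C, later DTR: {later:.0f}°C")
print(f"mean temperature change: {mean_change:+.1f}°C, DTR change: {later - early:+.0f}°C")
```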

Why would scientists study DTR? In the abstract, Karl et al say:

The physical mechanism responsible for the observed decrease in the diurnal range is not known. Possible explanations include greenhouse effects such as changes in cloudiness, aerosol loading, atmospheric water vapor content, or carbon dioxide.

They also pointed out that the specific nature of the decreasing DTR they found also points to an enhanced greenhouse effect:

An increased greenhouse effect due to humidity, CO2, aerosols or clouds is expected to produce a relative increase of the minima with respect to the maxima and a decrease of the diurnal range. The reported observations are consistent with the hypothesized changes.

Even today, the jury is still not settled on whether a decreasing DTR is a solid "fingerprint" of an enhanced CO2-greenhouse warming. But Karl et al. was one of the papers from the 1970s (actually 1980s!) which offered support for this idea, and had nothing to do with cooling (mid-century or in a future ice age).

NTZ's Dubious Human Influence Category

Willett 1974, Do Recent Climatic Fluctuations Portend an Imminent Ice Age?

This is another rare paper in NTZ's collection which actually had a prediction about an "imminent"3 ice age. It is also one of the many PCF08 papers which NTZ repurposed: PCF08 had categorized it as "neutral". But NTZ's selected quote4 ignored any mention of a future ice age and zeroed in on Willett's dismissal of CO2's influence on recent climate:

[T]he author is convinced that recent increases of atmospheric carbon dioxide have contributed much less than 5% of the recent changes of atmospheric temperature, and will contribute no more than that in the foreseeable future.

Willett also didn't think that particulate/aerosol/dust pollution would have much effect. He preferred a solar influence hypothesis and concluded that there was no "imminent" ice age coming:

There is no other reason to anticipate an Ice Age in the near future. It is the author's contention that. 1. The pollution hypotheses [CO2 or dust/aerosols] cannot properly be made to account for recent climatic fluctuations. 2. The solar hypothesis appears to fit the observed pattern of climatic fluctuation in much greater detail, and does not call for an imminent Ice Age.

And to drive home the point, Willett said it was his "final conclusion that man will pollute himself off the face of the earth long before he can pollute himself into an Ice Age".

Well, actually, here is his final conclusion, just in case the reader wasn't paying attention:

The author's reasoned answer, then, to the question, 'Do Recent Climatic Fluctuations Portend an Imminent Ice Age?' is an emphatic NO...the next Ice Age is unlikely for at least 10,000 years, more likely for more than 30,000 years, unless the sun takes off on a new tangent. [Emphasis in the original]

NTZ ignored all of these clear statements that no ice age was imminent, and instead focused on Willett's views, since shown to be wrong, about CO2's "dubious influence" on climate.

Dunbar 1976, Climatic Change and Northern Development

Here is NTZ's selected quote from this paper:

The measured increase in carbon dioxide in the atmosphere, according to the most recent computations, would not be enough to have any measurable climatic effect.

The passage immediately following this quote reads:

Rasool and Schneider (1971) conclude that an increase in the carbon dioxide content of eight times the present level would produce an increase in surface temperature of less than 2°C, and that if the concentration were to increase from the present level of 320 parts per million to about 400 by the year 2000, the predicted increase in surface global temperature would be about 0.1°C

Rasool and Schneider (1971) (hereafter: R&S71) is one of the seminal climate papers from the 1970s, and PCF08 categorized it as one of their seven "cooling" papers. PCF08 stated that this paper "may be the most misinterpreted and misused paper in the story of global cooling". R&S71 was one of the first studies using modeling, and their results found minimal warming effects from CO2 and large cooling effects from aerosols. But other scientists, and even Rasool and Schneider, quickly noticed flaws in R&S71 (Charlson et al 1972 and Rasool and Schneider 1972). Further improvements in 1975 (this time by Schneider and Mass) showed that R&S71 "had overestimated cooling [from aerosols] while underestimating the greenhouse warming contributed by carbon dioxide" (PCF08).

Dunbar's paper was published one year after Schneider and Mass's corrections to R&S71, yet no corrections to R&S71 are noted in Dunbar's paper; he still used the original, incorrect results. Perhaps Dunbar can be excused for this oversight since the corrections were still very new. But when we look back at these papers we should be mindful of the larger context of the science of the time, and recognize when a "money quote" may require further research.

Barrett 1978, Man and Climate: An Overview

A few years after Dunbar, Barrett 1978 likewise used R&S71 without noting the corrections, but he referenced enough of the other relevant literature to arrive at a more accurate estimate of CO2's effect. The result is a good overview of the state of the science in the 1970s.

Still, NTZ was able to find a quote which seemed to downplay man's influence on climate:

In particular, detection of an anthropogenic influence through statistical analysis alone requires a long run of data of good quality and careful attention to measures of significance. It is most important to avoid the post hoc ergo propter hoc fallacy that a trend of a few years’ duration or less, following some change in human activities, can be attributed to that change even when no sound physical causal relationship is evident. [Emphasis in the original.]

Conveniently, NTZ avoided the rest of the paper, which went on to describe the "sound physical causal relationship[s]" between CO2, aerosols, and climate. In the concluding remarks of the paper, Barrett predicted that atmospheric CO2 would rise to "between 350 and 415 ppmv by the end of the [20th] century". The actual value in 2000 was about 368 ppm, well within the predicted range.

Barrett also predicted that this increase in CO2 "should increase the temperature by 0.3°C; this trend might be detectable by careful analysis unless it is offset by other effects, such as those of aerosols". Careful analysis by NASA-GISS, NOAA, HadCRU, and JMA (as well as others) has shown that this prediction was also remarkably accurate, if anything a bit low (see figure below).

Global surface temperature anomalies from NASA-GISS, HadCRU, NOAA, and JMA. I've circled in red the 1970s, which are centered on the zero baseline. The vertical red line indicates the year 2000, and the two horizontal lines demarcate the temp. anomaly range from 0.3 to 0.5°C. (Source: NASA-GISS.)

By the year 2000, global average temperatures had risen about 0.3 to 0.5°C since the 1970s. And they haven't stopped there. There's no global cooling in sight.
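
As a quick arithmetic check, the sketch below simply compares Barrett's stated predictions with the observed values quoted above (roughly 368 ppm CO2 in 2000 and a 0.3 to 0.5°C rise since the 1970s); the numbers come straight from the text, not from any data download:

```python
# Quick check of Barrett (1978)'s predictions against the observed
# values quoted in the text above (no external data is used).

predicted_co2_range = (350.0, 415.0)   # ppmv by 2000, Barrett's prediction
observed_co2_2000 = 368.0              # ppm, approximate value cited above

predicted_warming = 0.3                # degrees C, Barrett's estimate
observed_warming_range = (0.3, 0.5)    # degrees C rise since the 1970s, as cited above

co2_ok = predicted_co2_range[0] <= observed_co2_2000 <= predicted_co2_range[1]
warming_ok = observed_warming_range[0] <= predicted_warming <= observed_warming_range[1]

print(f"CO2 in 2000 within predicted range: {co2_ok}")           # True
print(f"Predicted warming within observed range: {warming_ok}")  # True (at the low end)
```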

The Rich Tapestry of Climate Science

Science is a process of making observations of the natural world, gathering data, asking questions, and performing experiments – all to get a clear picture of how the world works. This picture – or description, or model – can only be clear if it includes as much information as possible about the real world. If information is left out, our model of how the world works will be incomplete. Some incompleteness is inevitable because we can never have all of the relevant information, and the information we do have will never be "perfect".

NTZ's description of 1970s climate science focused heavily on the mid-century cooling trend in global average temperatures, and on studies which downplayed CO2's role in the greenhouse effect. This is hardly a complete picture of 1970s science, nor does it give us a very good model of the climate system (as understood by 1970s scientists). NTZ arrived at his "cooling consensus" largely by selecting quotes which supported his view. But a thorough reading of these papers reveals the full breadth of 1970s science.

In some papers, researchers might look at the mid-century cooling trend, hypothesize that aerosols may have caused it, and then consider what effects aerosols might have on future climate. The very same paper may also take note of CO2's warming influence and of possible outcomes if atmospheric CO2 continues to increase in the future. Benton 1970 and Barrett 1978 both fit this description. But if you read only NTZ's quotes from these papers you would know nothing about their forecasts of future warming.

NTZ's goal-post shifting, straw-man arguments, and quote mining/cherry picking result in a lopsided description of 1970s climate science. It is possible to get a more accurate description of what scientists knew (and didn't know) about climate in the 1970s from NTZ's pile of papers. But to do so you have to read beyond the selected "money quotes" and look at all of the data. When you do so, you won't find a majority of 1970s scientists forecasting "global cooling". Instead you will find what PCF08 found in their literature review:

[P]erhaps more important than demonstrating that the global cooling myth is wrong, this review shows the remarkable way in which the individual threads of climate science of the time – each group of researchers pursuing their own set of questions – was quickly woven into the integrated tapestry that created the basis for climate science as we know it today.


Thanks to jg for the illustrations, and to BaerbelW for help with the spreadsheet.


Footnotes

1. PCF08 did make a few exceptions for "prestigious reports": "The gray literature of conference proceedings were not authoritative enough to be included in the literature search. However, a few prestigious reports that may not have been peer reviewed have been included in this literature survey because they clearly represent the science of their day."

2. NTZ gives an incorrect date of 1975 for this paper.

3. Willett is very specific about the term "imminent": it refers "to one or at most two centuries in the future".

4. It is worth looking at the longer quote from Willett in NTZ's full list to see another example of NTZ's confused counting, as described in Part I. Willett is listed as #158 in the Part 2 list. A screenshot of this is shown below.

NTZ gives the title of Willett's paper and then a single paragraph. Then there is the next listing: #159 - Humphreys (1940). But the paragraph from Willett here doesn't contain the shorter quote given in NTZ's list of 35 papers. The actual shorter quote from the "35" list is circled in the screenshot (starting with "the author is convinced..."). But isn't that from Humphreys? No, all three of these paragraphs are from Willett: the first is from the abstract and the other two are from the paper (p. 273). NTZ has simply added a new listing number (#159) to the quoted material from Humphreys (1940) used by Willett! Now the expanded time span for NTZ's "1970s" climate science extends back to 1940!



from Skeptical Science https://ift.tt/2FXlTsC


Astronomers report success with machine deep learning

Deep learning is a subset of machine learning in Artificial Intelligence (AI) that has networks which are capable of learning unsupervised from data that is unstructured or unlabeled. Image and caption via Quora.

We published a story in April about an art historian using an innovative analysis technique to unlock architectural secrets. He was using a machine learning method called deep learning – which is used in, for example, facial recognition and speech recognition software – to do science. Similarly, astronomers are beginning to report the use of machine deep learning techniques to perform research that humans can’t do using more traditional methods.

Below we describe two recent examples: the first related to planets orbiting two stars, and the second related to classifying galaxies.

Artist’s impression of Kepler-16b, discovered by NASA’s Kepler mission and the first confirmed circumbinary planet. It is a gas giant that orbits close to the edge of its binary system’s habitable zone. Image via T. Pyle / NASA / JPL-Caltech/ RAS.

First, planets orbiting two stars. Can computers predict whether planets orbiting binary stars remain in stable orbits? That’s an important question because many (perhaps most) stars in our Milky Way galaxy appear to be in multiple star systems, and because planets in stable orbits might be expected to be the most habitable planets.

And, it turns out, the answer is yes. Researchers reported on April 23, 2018, that computers trained via deep learning can make this prediction more successfully than astronomers can using traditional methods. A study on this subject looked at what many astronomers call Tatooines – planets orbiting two stars – named for the fictional Tatooine first introduced in 1977 as Luke Skywalker's home in the original Star Wars movie. Researchers Chris Lam and David Kipping were both, at the time, at the Cool Worlds lab at Columbia University in New York (Lam has since obtained his PhD and moved on). Their study is published in the peer-reviewed journal Monthly Notices of the Royal Astronomical Society. The authors explained in a statement:

Tens of these planets have so far been discovered, but working out whether they may be habitable or not can be difficult.

Moving around two stars instead of just one can lead to large changes in a planet’s orbit, which mean that it is often either ejected from the system entirely, or it crashes violently into one of its twin stars. Traditional approaches to calculating which of these occurs for a given planet get significantly more complicated as soon as the extra star is thrown into the mix.

These researchers simulated millions of possible planets with different orbits using traditional methods and found that some planets were being predicted as stable when they clearly were not, and vice versa. Their new study showed how machine learning could make accurate predictions possible even when the standard approach – based on Newton's laws of gravity and motion – breaks down. They said:

After creating ten million hypothetical Tatooines with different orbits, and simulating each one to test for stability, this huge training set was fed into the deep learning network. Within just a few hours, the network was able to out-perform the accuracy of the standard approach.
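
To illustrate the general idea (and only the general idea), here is a minimal, hypothetical Python sketch: label simulated circumbinary systems as stable or unstable, train a small neural network on those labels, then score it on held-out systems. The toy stability rule, the parameter ranges, and scikit-learn's MLPClassifier below are stand-ins invented for illustration; they are not the criterion, data, or architecture used in the actual study, where the labels came from simulating ten million hypothetical systems, as described in the quote above.

```python
# Hypothetical sketch: train a small neural network to classify
# circumbinary-planet stability from labeled simulated systems.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 20_000  # the real study used ~10 million simulated systems

# Toy features: binary eccentricity, binary mass ratio, and the planet's
# semi-major axis in units of the binary separation.
ecc = rng.uniform(0.0, 0.7, n)
mass_ratio = rng.uniform(0.1, 0.5, n)
a_planet = rng.uniform(1.0, 6.0, n)

# Invented stand-in for an N-body stability determination: call the planet
# stable if it orbits far enough outside the binary, with the required
# distance growing with eccentricity (plus a little noise). NOT the real rule.
critical_a = 2.0 + 3.0 * ecc + 0.5 * mass_ratio + rng.normal(0.0, 0.1, n)
stable = (a_planet > critical_a).astype(int)

X = np.column_stack([ecc, mass_ratio, a_planet])
X_train, X_test, y_train, y_test = train_test_split(
    X, stable, test_size=0.2, random_state=0)

# A small fully connected network stands in for the study's deep network.
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=300, random_state=0)
clf.fit(X_train, y_train)
print("Held-out accuracy:", clf.score(X_test, y_test))
```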

Read more about this study via the RAS

In the video below from Cool Worlds Lab, David Kipping explains why three-body systems are so tricky and how deep learning analysis methods can save the day. It’s called Droids [Machines] Ponder Tatooine:

Second, classifying galaxies. Also on April 23, 2018, astronomers at UC Santa Cruz reported using machine deep learning techniques to help analyze images of galaxies and understand how galaxies form and evolve. This new study has been accepted for publication in the peer-reviewed Astrophysical Journal and is available online. In the study, researchers used computer simulations of galaxy formation to train a deep learning algorithm, which then:

… proved surprisingly good at analyzing images of galaxies from the Hubble Space Telescope.

The researchers said they used output from the simulations to generate mock images of simulated galaxies as they would look in ordinary Hubble observations. The mock images were used to train the deep learning system to recognize three key phases of galaxy evolution previously identified in the simulations. The researchers then gave the system a large set of actual Hubble images to classify.
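
Purely as an illustration of the workflow described above, here is a minimal, hypothetical sketch in which a small convolutional network is trained on labeled mock images and then applied to new images. Random arrays stand in for the mock Hubble-like images, and nothing here reproduces the paper's actual architecture, image sizes, or labeling.

```python
# Hypothetical sketch: train a small CNN to assign one of three evolutionary
# phases to galaxy images, then classify new images. Random arrays stand in
# for the simulation-generated mock images used in the actual study.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n_train, img = 1000, 64
mock_images = rng.random((n_train, img, img, 1)).astype("float32")  # stand-in mock images
phase_labels = rng.integers(0, 3, n_train)  # 0/1/2: before, during, after the "blue nugget" phase

model = tf.keras.Sequential([
    tf.keras.Input(shape=(img, img, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # one output per phase
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(mock_images, phase_labels, epochs=2, batch_size=32, verbose=0)

# Once trained, real images would be classified the same way:
new_images = rng.random((5, img, img, 1)).astype("float32")
predicted_phase = model.predict(new_images).argmax(axis=1)
print(predicted_phase)
```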

The results showed a remarkable level of consistency, the astronomers said, in the classifications of simulated and real galaxies. Joel Primack of UC Santa Cruz said:

We were not expecting it to be all that successful. I’m amazed at how powerful this is. We know the simulations have limitations, so we don’t want to make too strong a claim. But we don’t think this is just a lucky fluke.

Read more about this study from UC Santa Cruz.

View larger. | Astronomers at UC Santa Cruz say a ‘deep learning’ algorithm trained on images from cosmological simulations is surprisingly successful at classifying real galaxies in Hubble images. Top row: High-resolution images from a computer simulation of a young galaxy going through three phases of evolution (before, during, and after the so-called “blue nugget” phase, a star-forming phase of galaxy evolution). Middle row: The same images from the computer simulation of a young galaxy in three phases of evolution as it would appear if observed by the Hubble Space Telescope. Bottom row: Hubble Space Telescope images of distant young galaxies classified by a deep learning algorithm trained to recognize the three phases of galaxy evolution. The width of each image is approximately 100,000 light-years. Images via UCSC.

Bottom line: Two recent examples of the use of machine deep learning in astronomy. The first is related to Tatooines, or planets orbiting two stars, and the second is related to classifying galaxies in Hubble Space Telescope images.

Source: A machine learns to predict the stability of circumbinary planets

Source: Deep Learning Identifies High-z Galaxies in a Central Blue Nugget Phase in a Characteristic Mass Range

Via UCSC and RAS.



from EarthSky https://ift.tt/2FUxHfj

