How fast does Earth move through the Universe? (Synopsis) [Starts With A Bang]

“The slow philosophy is not about doing everything in tortoise mode. It’s less about the speed and more about investing the right amount of time and attention in the problem so you solve it.” –Carl Honore

Einstein’s theories of special and general relativity tell us that there’s no Universal, preferred frame of reference. But that doesn’t necessarily mean that our physical Universe doesn’t have an average frame of reference, one which minimizes the relative speeds of all the galaxies to one another.

Image credit: Cosmography of the Local Universe/Cosmic Flows Project — Courtois, Helene M. et al. Astron. J. 146 (2013) 69, arXiv:1306.0091 [astro-ph.CO].

While the Earth rotates and orbits the Sun, the Sun revolves around our galaxy, the galaxy moves within the Local Group, and the Local Group is gravitationally attracted to all the galaxies, groups and clusters in the Universe, the leftover glow from the Big Bang allows us to reconstruct exactly how fast we move relative to that average frame. And it turns out that there is a particular frame of reference that’s better than all the others, and we’re not at rest.
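
How fast, exactly? The measured temperature dipole in the cosmic microwave background gives the answer directly: to first order, our speed satisfies v/c ≈ ΔT/T0, where ΔT ≈ 3.36 mK is the dipole amplitude and T0 ≈ 2.725 K is the mean CMB temperature. A minimal back-of-the-envelope sketch in Python (the two temperatures are standard published values, not numbers taken from this post):

# Estimate our speed through the CMB rest frame from the dipole anisotropy.
# To first order in v/c, the dipole satisfies v/c ~ dT / T0.
C_KM_S = 299_792.458   # speed of light, km/s
T0 = 2.725             # mean CMB temperature, K
DIPOLE = 3.36e-3       # CMB dipole amplitude, K (~3.36 mK)

v = C_KM_S * DIPOLE / T0
print(f"speed relative to the CMB rest frame: {v:.0f} km/s")  # ~370 km/s
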
Image credit: The pre-launch Planck Sky Model: a model of sky emission at submillimetre to centimetre wavelengths — Delabrouille, J. et al. Astron. Astrophys. 553 (2013) A96, arXiv:1207.3675 [astro-ph.CO].

Go get the full story of how fast Earth moves through the Universe — and some real science — on this non-foolish 1st of April.

from ScienceBlogs http://ift.tt/1VYmyPe


Cold Front: ONR Researchers Explore Arctic Land and Sea at Navy ICEX

Ice Camp Sargo, located in the Arctic Circle, serves as the main stage for Ice Exercise (ICEX) 2016 and will house more than 200 participants from four nations over the course of the exercise. ICEX 2016 is a five-week exercise designed to research, test, and evaluate operational capabilities in the region. ICEX 2016 allows the U.S. Navy to assess operational readiness in the Arctic, increase experience in the region, advance understanding of the Arctic environment, and develop partnerships and collaborative efforts. (U.S. Navy photo by Mass Communication Specialist 2nd Class Tyler Thompson)

By David Smalley
Office of Naval Research

As the Navy’s Ice Exercise (ICEX) 2016 winds to a close this week in the frigid waters of the Arctic Ocean, officials at the Office of Naval Research (ONR) today reported new scientific research that took place during the event that will enhance our understanding of, and ability to safely operate in, Arctic maritime environments.

ICEX, a biennial, multi-week exercise sponsored by the Navy’s Arctic Submarine Laboratory, is designed to test submarine capabilities in the Arctic – as well as provide a base camp for cooperative scientific research. The temporary camp sits on a thick piece of floating sea ice approximately 200 miles north of Barrow, Alaska.

This year, for the second time in a row, the ICEX base camp had to be evacuated when cracks in the ice were discovered – proving anew the importance of better understanding the changing region.

“ONR sponsors an active Arctic research program, and ICEX provides a unique and valuable opportunity for our researchers,” said Chief of Naval Research Rear Adm. Mat Winter. “Increasing our understanding of the dynamic Polar environment will help ensure future naval operations in the region are conducted safely and efficiently.”

One of the significant ONR-sponsored projects involved the launch of an unmanned underwater vehicle (UUV) to measure temperature, salinity and ambient noise conditions beneath the surface – factors that can dramatically impact the effectiveness of sonar operations. Sonar is a naval technology that uses sound in the water to detect and track submarines, popularly known by the famed “ping” signals shown in movie and television depictions.

The ONR UUV collected data within a submerged layer of warm water, known as the Beaufort Lens, which is flowing into the Arctic from the Pacific Ocean. This knowledge could prove essential for improved detection, classification and tracking of vessels.
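
The reason temperature and salinity matter so much here: the speed of sound in seawater increases with both, so a warm, salty intrusion like the Beaufort Lens refracts and can trap acoustic energy, changing how far sonar signals travel. A small sketch using Medwin’s simplified sound-speed formula, a standard textbook approximation (the sample water properties below are illustrative, not ICEX measurements):

# Medwin's simplified formula for the speed of sound in seawater, m/s.
# T: temperature (deg C), S: salinity (ppt), D: depth (m).
def sound_speed(T, S, D):
    return (1449.2 + 4.6 * T - 0.055 * T**2 + 0.00029 * T**3
            + (1.34 - 0.010 * T) * (S - 35.0) + 0.016 * D)

# Illustrative contrast: cold near-surface Arctic water vs. a warmer layer below.
print(sound_speed(T=-1.5, S=30.0, D=20.0))  # cold, fresher surface water
print(sound_speed(T=1.0, S=32.5, D=60.0))   # warmer, saltier intrusion
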

In related efforts, ONR-sponsored students from the Naval Postgraduate School measured and analyzed the loss of transmission signals over different frequencies as sound penetrated the warm layer of the Beaufort Lens. Other students studied characteristics under the ice at the sea-ice interface.

ONR’s Arctic and Global Prediction program also supported Naval Research Laboratory work during the exercise that used airborne and space-based synthetic aperture radar to develop methods to remotely determine the thickness and age of Arctic sea ice floes.

“The Arctic Ocean is a dynamic and particularly challenging maritime environment,” said Capt. Robin Tyner, military deputy to ONR’s Ocean Battlespace Sensing department. “As the changing environment opens the region for expanded maritime and naval activity, knowledge of that environment, and the ability to accurately predict weather and ice movement, will become increasingly important.”

The U.S. Navy Arctic Roadmap 2014-2030 assigns ONR lead responsibility for improving Arctic assessment and prediction, and developing comprehensive computer models to support ocean, ice and atmospheric forecasts.

David Smalley is a contractor for ONR Corporate Strategic Communications.

Follow Armed with Science on Twitter!

Disclaimer: The appearance of hyperlinks does not constitute endorsement by the Department of Defense of this website or the information, products or services contained therein. For other than authorized activities such as military exchanges and Morale, Welfare and Recreation sites, the Department of Defense does not exercise any editorial control over the information you may find at these locations. Such links are provided consistent with the stated purpose of this DOD website.

 



from Armed with Science http://ift.tt/1q9wwSb

Low-Cost Air Sensors: The Risks and the Rewards

By Joel Creswell, Ph.D. 

Picture a future in which every device you interact with is connected to the internet and communicating in real time. Your thermostat notices that you’re not home and offers to turn down the heat. Your refrigerator tells you what food is going to go bad soon and pulls up recipes to help you cook it. Your kitchen faucet monitors the safety of your drinking water, and every street lamp can tell you how clean the air is.

Today, this future is part fantasy, but it’s not hard to imagine all of these ideas becoming reality based on current or rapidly evolving technology.

Reliable information can help us make better decisions. But what if the information we get from sensors is unreliable? What if your kitchen faucet monitor tells you your water is unsafe, leading you to spend hundreds of dollars on a new filter, only to learn that the monitor was wrong? What if the air sensor on your street lamp says the air is dangerous to breathe, causing you to keep your kids inside, when actually, everything was fine? Unreliable sensors can cause unnecessary concern, wasted money, or unwarranted complacency.

The market for environmental sensors is exploding [1], and many sensor users have to determine for themselves which devices are reliable. If you have access to a laboratory, you can compare a new sensor to a proven method to make sure it works. But most consumers don’t have that kind of access and even many environmental professionals don’t have the time or resources to validate every new sensor they buy. This is why groups like EPA’s National Exposure Research Laboratory and California’s South Coast Air Quality Management District have started air sensor evaluation programs. These groups test new sensors and publish performance reports online to help guide sensor users.
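
For the curious, the laboratory comparison mentioned above typically means co-locating the candidate sensor with a reference monitor and computing agreement statistics such as bias, root-mean-square error and correlation. A minimal sketch of that calculation (the readings are invented for illustration; real evaluation protocols also test calibration, drift and interferences):

import math

# Co-located hourly readings: reference monitor vs. low-cost sensor (invented data).
reference = [12.1, 15.4, 20.3, 18.7, 25.0, 30.2, 22.8]
sensor    = [13.0, 14.8, 21.5, 19.9, 23.7, 32.0, 24.1]

n = len(reference)
bias = sum(s - r for s, r in zip(sensor, reference)) / n
rmse = math.sqrt(sum((s - r) ** 2 for s, r in zip(sensor, reference)) / n)

mean_r = sum(reference) / n
mean_s = sum(sensor) / n
cov = sum((r - mean_r) * (s - mean_s) for r, s in zip(reference, sensor))
pearson = cov / math.sqrt(sum((r - mean_r) ** 2 for r in reference)
                          * sum((s - mean_s) ** 2 for s in sensor))

print(f"bias = {bias:.2f}, RMSE = {rmse:.2f}, r = {pearson:.3f}")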

Alan Chan of Sonoma Technology, Inc. demonstrates low-cost air sensors to conference attendees.

Independent sensor testing fills a critical knowledge gap and has become so popular that existing testing labs are having trouble keeping up with demand. EPA is considering ways to scale up the availability of sensor testing, including a network of labs with standardized protocols and reporting.

In March, I presented an outline of an independent sensor evaluation network in a session on air sensors and citizen science at the National Association of Clean Air Agencies’ Communicating Air Quality conference. My presentation emphasized that sensor technologies have the potential to revolutionize environmental monitoring, but only once their reliability is demonstrated. Many of the state agencies at the conference were eager to see a sensor evaluation network established – several told me stories of phone calls from concerned citizens who had purchased untested or unreliable air sensors and were concerned about the dangerous air quality the sensors were (erroneously) indicating. EPA, state and local environmental agencies, and the public need good information on sensor performance to eliminate the confusion caused by unreliable data and harness these powerful new tools to usher in a new era of environmental protection and decision making.

I’m excited about the proliferation of sensors. I already use my phone to count my steps every day, I have a smart thermostat in my house, and I wish I had a fridge that told me when my food was going to go bad. I love new technology that works well. But when I try something that doesn’t work well, like my old smoke alarm that would go off every time I turned on the oven, I get frustrated and stop using it. My solution to the smoke alarm problem? I took the battery out, putting myself at risk if my house ever caught fire. EPA is working to make sure we don’t all put ourselves at risk, by ensuring that we have good data on how well air sensors perform.

 

References

[1] Gainer, K. Environmental Sensing and Monitoring Technologies: Global Markets; IAS030C; BCC Research: Wellesley, MA, 2014.

About the Author: Joel Creswell is an environmental chemist and an AAAS Fellow on the EPA Office of Research and Development’s Innovation Team. Prior to coming to EPA, he worked on developing environmental trace metals analyzers for a scientific instrument company.



from The EPA Blog http://ift.tt/1pQJ9AB


Taking Action on HFCs to Protect our Climate at Home and Abroad

By: Gina McCarthy
 
This week, EPA took another important step in a series of recent actions to help reduce our country’s use and emissions of hydrofluorocarbons (HFCs) – a potent greenhouse gas. I signed a proposed rule under the Significant New Alternatives Policy (SNAP) Program that will expand the list of climate-friendly HFC alternatives and phase out certain HFCs in favor of safer options that are already available. 
 
HFCs are predominantly used in air-conditioning and refrigeration and can be up to 10,000 times more damaging to our climate than carbon pollution. Left unchecked, growing HFC emissions would undo critical progress we’ve made to act on climate and protect the planet. 
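
The “up to 10,000 times” comparison is a global warming potential (GWP): to compare an HFC release with carbon dioxide, you multiply the mass released by its GWP to get a CO2-equivalent mass. A toy illustration using the round number from this post (actual GWPs vary by compound and by assessment report; the leak size is hypothetical):

# Convert an HFC release to CO2-equivalent mass using a global warming potential.
GWP_HFC = 10_000         # illustrative round figure from the text
hfc_released_kg = 5.0    # hypothetical leak, e.g. one refrigeration unit's charge

co2e_tonnes = hfc_released_kg * GWP_HFC / 1000.0
print(f"{hfc_released_kg} kg of HFC ~ {co2e_tonnes:.0f} tonnes CO2e")
# ~50 tonnes CO2e: on the order of ten typical passenger cars' annual emissions.
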
 
That’s why cutting their use and emissions is a key part of President Obama’s Climate Action Plan. The new proposed rule not only supports the President’s goals, it also recognizes the key role of innovative companies in bringing new HFC alternatives to the marketplace. 
 
This is an example of the important work we’re doing at home. But we’re also making tremendous progress with our international partners to fully address HFCs.
 
Just yesterday, in a joint announcement, President Obama and China’s President Xi Jinping committed to working bilaterally and with other countries to achieve successful outcomes this year in related multilateral fora, including on an HFC amendment under the Montreal Protocol.
 
And I’m pleased to announce that I’m planning to lead the United States delegation at the Montreal Protocol’s Extraordinary Meeting of the Parties (ExMOP) this July in Vienna. I had the honor of leading the United States delegation to the Montreal Protocol’s 27th Meeting of the Parties in Dubai last November. At that time, the world took a significant step by agreeing to work together on a 2016 Amendment to the Montreal Protocol to reduce the production and consumption of harmful HFCs and achieve substantial greenhouse gas reductions. 
 
Next week is the first preparatory session for the 2016 negotiations in Geneva. This will be the first opportunity since Dubai for countries to come together and make concrete progress on our 2016 phase-down amendment. 
 
As we saw with the historic Paris Agreement, the world can unite in action when the health of our kids and shared home is at stake. The U.S. is ready to build on this spirit and follow through on our commitments to reduce HFCs at home and abroad.
 
We are making tremendous progress with our international partners. This July in Vienna, I look forward to making more progress on adopting an HFC amendment that will protect our climate for future generations.


from The EPA Blog http://ift.tt/1UYy6T5

Video: What is a pulsar?

A pulsar is a rapidly spinning neutron star, which is the small, incredibly dense remnant of a much more massive star. How dense? A teaspoon of matter from a neutron star weighs as much as Mount Everest. A neutron star containing more matter than our sun would be only about 15 miles across.
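
Those numbers are easy to sanity-check. Packing a solar mass into a sphere roughly 15 miles (about 24 km) across gives a density near 3 × 10^17 kg/m³, so a 5 mL teaspoon holds on the order of a trillion kilograms, a mountain-scale mass. A quick sketch (textbook constants, not figures from the video):

import math

M_SUN = 1.989e30       # kg, mass of the sun
radius_m = 12_000.0    # 15 miles across -> ~24 km diameter -> 12 km radius

volume = (4.0 / 3.0) * math.pi * radius_m ** 3
density = M_SUN / volume            # ~3e17 kg/m^3
teaspoon_kg = density * 5e-6        # a teaspoon is about 5 mL = 5e-6 m^3

print(f"density  ~ {density:.1e} kg/m^3")
print(f"teaspoon ~ {teaspoon_kg:.1e} kg")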



from EarthSky http://ift.tt/1H4P4Fl


My new project launching today: The Quisling Qorner: A group blog on the library/publisher relationship [Confessions of a Science Librarian]

It’s been really gratifying over the last year to see how my DSCaM scholarly communications empire has grown. From its small beginnings, Dupuis Science Computing & Medicine has carved out a small but important niche in the discount APC publishing community.

And I really appreciate how the scholarly communications community has encouraged my career progression from publisher of a journal at Elsevier to Chief Advisor on Science Libraries for the Government of Canada to last year’s huge launch of DSCaM.

And the DSCaM empire grows.

This year I would like to announce the launch of a major new initiative: The Quisling Qorner: A Group Blog on the Library/Publisher Relationship.

I like to think of this new blogging community as being a fellow traveller with the longstanding Scholarly Kitchen blog. As well, we’d like to welcome the brand new In the Open: Libraries, Scholarship, and Publishing blog to the scholarly communications group blog family. While the Scholarly Kitchen tends to take the publisher’s side of things and IO seems headed more towards a bias in the library direction, I think the QQ has its own important niche.

And that niche would be the firm belief that the library side and the publisher side of the story are really the same tale, that libraries and publishers should be friends and colleagues of the highest order, that we are essentially on the same side of all the important issues in scholarly communication, that our interests are so intrinsically and explicitly tied together that they are essentially the same.

Publishers are librarians’ best friends; they know what’s good for us, and we should just follow their lead in important matters.

Heaven knows, as librarians we’ve enjoyed so much publisher hospitality at conferences — the wine! the cheese! the free pens! — that it’s really time for us to give back. There have been too many years of tragic misunderstanding and animosity between the two communities.

And repairing that damaged relationship will be the role of The Quisling Qorner. I’ve invited a plethora of the brightest lights in librarianship, some well known, some up-and-comers, to contribute their thoughts about how we can bring librarians and publishers closer together. I’ve also invited friends and colleagues in the scientific and publishing communities to weigh in on some of those same issues, as well as provide a broader perspective on how libraries and librarians can serve their interests exclusively.

 

Finally, I’d like to announce the first set of amazing posts that I’m publishing today. I’m a firm believer that any new blogging project needs to launch with enough initial content to draw people in and keep them reading.

So here goes — the first set of posts, all by shining lights in the library/publisher interface universe!

 
And here’s a few titles for forthcoming posts, all either written and in the pipeline or under development by the authors!

  • Paywalled Journals Are the Best, Only the Best, They Are HUUUUUUGE, I’ll Build a Wall Around Them So Only the Good Scientists Can Read My Articles and Make Science Great Again by Donald Trump
  • PLoS Should Buy a Majority Stock in Elsevier: Here’s Why by Roberta Eksevierian
  • Why APCs Are the One True Way Forward for Publisher Business Models by Cameron Neylon
  • Fire all Older Librarians and Give Their Salaries to Elsevier by Phillipa Springster
  • Thomson Reuters’ ISI Makes All Citation Data Open Access in Bid to Thwart Allegations of Impact Factor Manipulation by Sharma Singh
  • Non-Disclosure Agreements as a Preferred Library Bargaining Tactic by Frances Taylor

 

And please consider this an open call. Everyone should go right ahead and pitch post ideas in the comments!

And the first authors’ meeting will be in Stockholm in 2017! Paid for by all those fantastic publishers!



from ScienceBlogs http://ift.tt/1MZ8D4f


This date in science: Comet Hale-Bopp

Comet Hale-Bopp with its prominent dust (white) and plasma (blue) tails. Photo via E. Kolmhofer, H. Raab; Johannes-Kepler-Observatory, Linz, Austria.

April 1, 1997. On this date, Comet Hale-Bopp – probably the best-remembered bright comet for many in the Northern Hemisphere – reached perihelion, its closest point to the sun. It was 0.9 astronomical units (AU, or Earth-sun distances) from the sun on that day. Its brightness – though dispersed across a wider area than a star’s – exceeded that of any star in the sky except Sirius, the sky’s brightest star.

As seen from the Northern Hemisphere, Hale-Bopp was the brightest comet since Comet West, sometimes called the Great Comet of 1976.

It stayed visible with the unaided eye for a record 18 months, twice as long as the previous record holder: the Great Comet of 1811.

Hale-Bopp – officially labeled C/1995 O1 – became one of the most-viewed comets in human history. There are over 5,000 images of this comet available via a webpage maintained by NASA’s Jet Propulsion Laboratory.

Some called Hale-Bopp the Great Comet of 1997 (although others disagreed that it met the criteria for a Great Comet).

It attracted so many not only because of its rarity and beauty, but also because it enabled people to jump – in their minds – back in time. Some 4,200 years ago, when Hale-Bopp last passed the Earth and sun, the Egyptian pyramids were newly being polished by sand, and the Epic of Gilgamesh, considered the first great work of Western literature, was not yet written.

Comet Hale-Bopp was discovered on July 23, 1995 by two independently observing amateur astronomers: Alan Hale and Thomas Bopp. At that time, the comet was a whopping 7.2 AU from the sun, which made it the most distant comet ever discovered by amateurs up to that time.

What made that discovery possible was that Hale-Bopp was so bright. It was literally a thousand times brighter than Comet Halley had been at that same distance; Halley, one of the most famous comets, had visited the inner solar system a decade earlier. It was clear that Hale-Bopp was a very special comet, because comets typically don’t shine so brightly when they are beyond Jupiter’s orbit.
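
On the astronomers’ magnitude scale, a factor of 1,000 in brightness corresponds to a difference of exactly 7.5 magnitudes, since each magnitude is a factor of 100^(1/5) in brightness. A one-line check (standard formula, nothing specific to this comet):

import math

# Magnitude difference for a given brightness ratio: dm = 2.5 * log10(ratio)
dm = 2.5 * math.log10(1000.0)
print(f"1000x brighter = {dm:.1f} magnitudes")  # 7.5
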

There were a few reasons for the comet’s unusual brightness. The main one is the enormous size of its nucleus, or core. Most cometary nuclei are thought to be no more than about 10 miles (16 km) across. The nucleus of Hale-Bopp had an estimated diameter of 25 to 40 miles (40–60 km).

Giant Jupiter is thought to have affected this comet’s orbit. It’s been calculated that Hale-Bopp was last seen in Earth’s skies around 4,200 years ago. Now, though, the comet’s orbit is shorter. Astronomers think that – on what might’ve been its first voyage around the sun thousands of years ago – the comet almost collided with Jupiter. It passed very close to Jupiter again in April 1996, shortening its orbital period still more. The comet’s current orbital period is around 2,530 Earth years.
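
Those orbital periods line up with Kepler’s third law, which for bodies orbiting the sun reduces to a = P^(2/3), with the semi-major axis a in AU and the period P in years. A quick check using the figures quoted above:

# Kepler's third law for solar orbits: a [AU] = P [yr] ** (2/3).
for label, period_yr in [("pre-encounter orbit", 4200.0), ("current orbit", 2530.0)]:
    a = period_yr ** (2.0 / 3.0)
    print(f"{label}: P = {period_yr:.0f} yr -> a = {a:.0f} AU")
# The semi-major axis shrinks from roughly 260 AU to roughly 186 AU,
# consistent with the close Jupiter encounters described above.
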

No records have been found of the comet’s passage 4,200 years ago, but that does not mean that no records were made. It most likely means that none survived. Around 2213 BCE, when the comet was last visible, civilizations had been using the sky to track seasonal changes and other phenomena for a long time. They could not have missed Hale-Bopp.

For more about the world at Hale-Bopp passage around 2213 BCE, click here.

Thus, in a way, Hale-Bopp is like a clock that measures time in millennia. It reminds us of the progress humankind has made since its last visit.

Imagine what the world will look like when Comet Hale-Bopp next crosses our skies, sometime around the year 4380.

A night under the stars and Comet C/1995 O1 Hale-Bopp. Owing to its orbital inclination and modest perihelion distance, it remained visible to the unaided eye for 18 months. Photo ©1997 Jerry Lodriguss / www.astropix.com. Used with permission.

Bottom line: On April 1, 1997, Comet Hale-Bopp was at perihelion, its closest point to the sun. This comet – remembered by many – was the most recent comet to be widely seen from the Northern Hemisphere.

When is our next Great Comet?



from EarthSky http://ift.tt/25AjJrP
