From Oberlin to Oakland: The Advance of Lucid and the Building Dashboard

By Christina Burchette

How do you change the way people use energy? You turn them into active participants in their energy consumption by giving them tools to monitor their use. That is the goal of the Oakland-based company Lucid—to change our habits by making us aware of how much energy we use. So far, it’s been extremely successful.

The company got its start as a competing team from Oberlin College in EPA’s People, Prosperity, and the Planet (P3) grant competition. In 2005, the team won a P3 grant for their prototype: the Building Dashboard. This online tool tracks in real time how much energy and water a building is using and provides visual insights that can influence occupants to change their habits.

Since winning the P3 award, Lucid has seen its dashboard tool used in energy-saving competitions nationwide. In one ongoing competition, the Campus Conservation Nationals, participating students compete to reduce energy consumption in their residence halls over a three-week period. In the most recent competition, a little over 300,000 students and staff at 125 schools saved 1.9 million kilowatt-hours of energy by using Lucid’s product to track their usage. That’s equivalent to avoiding 2.4 million pounds of CO2 emissions and saving $180,000!
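Those equivalences can be sanity-checked with a quick back-of-envelope calculation. The conversion factors below are assumptions (a rough U.S. grid-average emission factor and retail electricity price), not necessarily the figures the competition organizers used:

```python
# Back-of-envelope check of the Campus Conservation Nationals figures.
kwh_saved = 1.9e6        # kilowatt-hours saved (from the article)

# Assumed conversion factors; both vary by region and year.
lb_co2_per_kwh = 1.26    # pounds of CO2 per kWh of grid electricity (assumption)
usd_per_kwh = 0.095      # retail price of electricity in dollars per kWh (assumption)

co2_lb = kwh_saved * lb_co2_per_kwh      # pounds of CO2 avoided
savings_usd = kwh_saved * usd_per_kwh    # dollars saved

print(f"CO2 avoided: {co2_lb / 1e6:.1f} million lb")   # ~2.4 million lb
print(f"Cost savings: ${savings_usd:,.0f}")            # ~$180,000
```

With those assumed factors the numbers land right on the article’s 2.4 million pounds and roughly $180,000, so the reported equivalences are internally consistent.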

Two Lucid employees using the product

Lucid’s platform prepares simple data visualizations to clearly show how much energy the user consumes.

Today, Lucid’s “BuildingOS” platform allows users to collect and access real-time data from all of their meters (electricity, water, and so on) to monitor energy and water consumption, and it provides other building management tools. The platform is easy to use and prepares aesthetically pleasing data visualizations that clearly show how much energy the user consumes. Lucid plans to take its work beyond monitoring a few buildings – its most recent initiative, “Connected Cities,” will use BuildingOS to help 100 U.S. cities become smarter about energy consumption.

Lucid is a perfect example of how small business and environmental concerns can come together and create innovative tools that change the way we think about resource consumption. We are proud to have supported them back when their business was just an idea, and that’s why we are so excited that they will be receiving a Phase II Small Business Innovation Research (SBIR) contract this month to continue developing their technology.

About the Author: Christina Burchette is a student contractor and writer for the science communication team in EPA’s Office of Research and Development.



from The EPA Blog http://ift.tt/1RDQeOg


Researchers Mitigate Soldier Stressors with Science

Human-machine relationships with autonomous vehicles like unmanned aerial vehicles and unmanned ground vehicles are explored for future research. Photo illustration by Peggy Frierson


By David Vergun

Army researchers are trying to better understand the types of stress soldiers could encounter in combat that might cause degraded performance. They are also looking into ways to mitigate those stress factors, said Dr. Mike LaFiandra.

LaFiandra, chief of the dismounted warrior branch at Army Research Laboratory, or ARL, spoke at the National Defense Industrial Association-sponsored Human Systems Conference here, Feb. 9.

Other speakers focused on both machine and human performance goals.

SOLDIER PERFORMANCE

Stressors impinging on Soldier performance include dust, toxic fumes, fatigue from carrying heavy loads and other factors, LaFiandra said.

Special sensors are being used to measure the effects those variables have on performance, he said.

Once the information is quantified, the next step is exploring various mitigation strategies to prevent performance degradation, he said. Mitigation strategies that work could improve task performance by 20 to 25 percent.

The methodology might seem pretty straightforward, but it’s actually not, LaFiandra said.

Certain stressors have a much greater impact on Soldier performance than others. Understanding why those differences occur is just as important as understanding the types of stressors, he said.

Another challenge is selecting the right sensors, he said. While some sensors are small and non-invasive, others, such as face masks that measure oxygen uptake, are not, particularly in a field environment. Such invasive sensors might create stress of their own.

Identifying stressors and their causes is not always straightforward, LaFiandra said. For example, soldiers on flight lines were found to experience a much higher than average loss of hearing.

One might conclude, LaFiandra said, that the loss of hearing was simply due to the noise of helicopters and jets. But further investigation found the cause to be slightly more complicated than that. An occupational health study found that the toxic effects of aircraft fumes compounded the effects of the aircraft noise in causing hearing loss. The study could later inform mitigation strategies for that hearing loss.

ARL is working with the Defense Advanced Research Projects Agency, the other services and Special Operations Command on a number of other stressor mitigation projects that show promise to improve Soldier performance.

HUMAN-MACHINE PERFORMANCE

Dr. Greg Zacharias, the U.S. Air Force’s chief scientist and advisor to the Air Force chief of staff, spoke about human-machine relationships with autonomous vehicles like unmanned aerial vehicles, or UAV, and unmanned ground vehicles, used by all of the services.

While these vehicles are called autonomous, he noted, they’re really not, because a human is in the loop interacting with the systems.

While autonomous vehicles hold great promise for warfighters winning in a complex world, Zacharias raised concerns about how humans interact with those machines.

“Vigilance complacency” is one example of how a Soldier could team poorly with a UAV, he said. Long hours watching a blip on a computer screen have caused UAV operators to lose focus and miss errors.

A possible solution to vigilance complacency, he said, is to make the machine more aware of the operator, monitoring the operator’s physiological state of alertness and providing some sort of warning when alertness levels decline below a certain point.

Complexity is another potential problem. As gear becomes more complex, longer training time is needed for operators, he said.

Complex controls, displays and actions required by the operator increase workload and decision time. Eventually, operator performance could deteriorate to the point where the benefits of autonomy become lost.

A strategy to mitigate complexity would need to come at the early design phase of the system, with extensive user testing to determine how well the human interacts with the machine.

The right level of operator trust in the equipment is also important, Zacharias noted. If an operator puts too much trust in a UAV to operate on autopilot, for instance, the operator might not notice a decrease in speed and altitude, and a crash could result.

Not enough trust in a machine can also prove detrimental, he said. An operator who doesn’t trust the machine and overrides its calculations can cause harm to the machine or the mission.

FUTURE HUMAN-MACHINE ENDEAVORS

Zacharias thinks that future autonomous systems will be wired in ways similar to the human neural network. This, he said, will allow machines to have a better understanding of their human counterparts, and humans will be able to better relate to their machines.

The chief scientist even thinks that systems can be designed with “flexible autonomy,” whereby the operator can hand off tasks to the machine, or the machine can hand off certain tasks to the operator, based on mission demands and workload changes.

The neural networked machine could become aware of itself, similar to the way humans are self-aware, and if the machine becomes damaged, it might even find ways to heal itself, he said.


Disclaimer: The appearance of hyperlinks does not constitute endorsement by the Department of Defense of this website or the information, products or services contained therein. For other than authorized activities such as military exchanges and Morale, Welfare and Recreation sites, the Department of Defense does not exercise any editorial control over the information you may find at these locations. Such links are provided consistent with the stated purpose of this DOD website.



from Armed with Science http://ift.tt/1LSGgEg

Why space bodies come in different sizes


You’ve seen illustrations like this one, showing the relative sizes of things in space? This one is from www.rense.com, where you can see more like this. Also, be sure to check out the videos below.

A professor of mechanical engineering at Duke has applied a theory he formulated in 1996 – called the Constructal Law, which describes how patterns in nature are generated – to the hierarchical array of sizes of space objects. In other words, why do objects in space come in different sizes, from mighty stars with over 1,000 times the diameter of our sun to little chunks of rock like those that sometimes enter Earth’s atmosphere and make bright streaks across our sky? Why isn’t everything the same size?

Adrian Bejan at Duke University used his earlier theory to determine a possible reason. He says it’s because:

… a universe that contains some big objects and many small objects relieves gravitational tension faster than a uniform universe.

He and his team report their finding in the Journal of Applied Physics.

So … what does it mean? Bejan’s specialty is in thermodynamics, which describes how different forms of energy affect each other and the matter around them. His Constructal Law, which he’s worked on for years and which isn’t accepted by all scientists but which has a certain logic and beauty about it, is all about flow. It states that natural systems evolve to facilitate flow. So, for example:

Raindrops coalesce and move together, generating rivulets, streams and the mighty river basins of the world because this design allows them to move more easily.

That’s raindrops and water. How about solid bodies in space and their array of different sizes? Bejan and his student, Russell Wagstaff, started by calculating the tension caused by gravitational attraction between bodies of the same size, uniformly distributed in space.

They showed that if the bodies coalesce into some large bodies and some small bodies, the tension is reduced faster than if the bodies merged uniformly.

Bejan says this break-up of the uniform suspension of bodies into a few large and many small bodies occurs because it’s the fastest way to ease the internal tension caused by gravity. In other words, it happens because – in our particular universe, with nature working as it does – it’s the easiest thing that can happen. Bejan commented:

I never thought I would have anything to say about celestial bodies in pure physics, but by chance I realized I have a key to open a new door.

Want to read more about gravitational tension? The physics gets very advanced very quickly, but here’s a place to start.

There are various films out about the relative sizes of things in space, but I like this recent one – from 2015 – by Wylie Overstreet and Alex Gorosh. It gets across not only the relative sizes of the sun and major planets in our solar system, but also the vast amount of empty space between them.

The next film is interesting, too, and shows the relative sizes of major planets in our solar system, plus the sun and other stars in the universe.

Bottom line: Objects in space aren’t the same size, but why not? A scientist has used a theory he formulated earlier – to explain patterns in nature on Earth – to suggest a reason.



from EarthSky http://ift.tt/1oW575r

Biggest ever US methane leak from California blowout

Site of the massive methane leak. Photo: Getty Images


The natural gas well blowout in Aliso Canyon, California, first reported on October 23, 2015, vented over 100,000 tons of the powerful greenhouse gas methane into the atmosphere before the well was finally sealed almost four months later, on February 11, 2016. That’s the largest methane leak in U.S. history, according to the first study of the accident, published in the journal Science on February 25, 2016.

The research showed that during the peak of the Aliso Canyon event, enough methane poured into the air every day to fill a balloon the size of the Rose Bowl. Total emissions during the 112-day event were equal to one-quarter of the annual methane pollution from all other sources in the Los Angeles basin combined.

The disaster’s impact on climate, said the researchers, will be equivalent to the annual greenhouse gas emissions from over half a million passenger cars.
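The passenger-car comparison can be roughly reproduced. The factors below are assumptions (a commonly cited 100-year global warming potential for methane and a typical annual per-car CO2 figure), not necessarily the study’s own accounting:

```python
# Rough reproduction of the "over half a million cars" comparison.
ch4_tons = 100_000            # metric tons of methane vented (from the article)

# Assumed factors; the study's own accounting may differ.
gwp_100 = 25                  # 100-year global warming potential of CH4 (assumption)
car_tons_co2_per_year = 4.6   # typical passenger car, metric tons CO2/year (assumption)

co2e_tons = ch4_tons * gwp_100            # CO2-equivalent of the leak
cars = co2e_tons / car_tons_co2_per_year  # equivalent passenger-car-years

print(f"{cars:,.0f} passenger-car-years")  # ~540,000 -> "over half a million cars"
```

With those assumptions the leak works out to roughly 540,000 car-years of emissions, consistent with the researchers’ "over half a million passenger cars."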

Co-lead scientist and pilot Stephen Conley of Scientific Aviation and UC Davis said first readings in early November were so high he had to recheck his gear. He said in a statement:

It became obvious that there wasn’t anything wrong with the instruments. This was just a huge event.

At the time, Conley and his specially equipped plane were working with University of California Davis on a California Energy Commission project searching for pipeline methane leaks. The state agency asked him to overfly the area around the breached SoCalGas well.

Conley and his team’s measurements confirmed that high concentrations of methane and ethane were surging from the breached well into the densely populated San Fernando Valley.

Eventually, more than 5,726 families were evacuated and Gov. Jerry Brown declared a state of emergency.

The leaking well, marked SS25 in this picture, is very close to the community of Porter Ranch. Image: Stephen Conley


The analysis found that at its peak, the blowout doubled the rate of methane emissions from the entire Los Angeles basin and temporarily created the largest known human-caused point source of methane in the United States, twice as large as the next-largest source, an Alabama coal mine.

The disaster will substantially impact California’s ability to meet greenhouse gas emission targets for the year, the researchers said.

NOAA’s Tom Ryerson, co-lead scientist on the study, said:

Our results show how failures of natural gas infrastructure can significantly impact greenhouse gas control efforts.


Bottom line: The Aliso Canyon, California natural gas well blowout was the largest methane leak in U.S. history, according to the first study of the accident, published in the journal Science on February 25, 2016. The leak, first reported on October 23, 2015, vented over 100,000 tons of the powerful greenhouse gas methane into the atmosphere before the well was finally sealed almost four months later, on February 11, 2016.

Read more from the University of California Davis



from EarthSky http://ift.tt/1L2u5tE

The Schrödinger Sessions II: More Science for More Science Fiction [Uncertain Principles]

As you probably already know, last year we ran a workshop at the Joint Quantum Institute for science-fiction writers who would like to learn more about quantum physics. The workshop was a lot of fun from the speaker/organizer side, and very well received by last year’s writers, so we’re doing it again:

The Schrödinger Sessions is a three-day workshop for science fiction writers offering a “crash course” in modern physics, to be held at the Joint Quantum Institute (JQI), one of the world’s leading research centers for the study of quantum mechanics. We will introduce participants to phenomena like superposition, entanglement, and quantum information through a series of lectures by JQI scientists and tours of JQI laboratories. We hope this will inform and inspire new stories in print, on screen, and in electronic media, that will in turn inspire a broad audience to learn more about the weird and fascinating science of quantum physics and the transformative technologies it enables.

The workshop will be held at JQI from Thursday, July 28 through Saturday, July 30. Participants will be housed locally, with breakfast and lunch provided at the workshop; evenings will be free to allow participants to explore the Washington, DC area.

Participants will be selected on the basis of an application asking about personal background, interest, and publication history. We will work to ensure the greatest possible diversity of race and gender, as well as type of media (print, television, etc.), with an eye toward reaching the broadest audience. Applications will be accepted on-line from March 1 through March 20, 2016, and acceptance decisions will be made around April 15, 2016.

The online application form is now live, so if this is something you’d be interested in, check it out and send us an application. And please share this with anyone else you know who might be interested.



from ScienceBlogs http://ift.tt/1L2wiVU


Ten Years Of Portable Internet Access [Aardvarchaeology]

My 2006 smartphone, a Qtek 9100

On 2 February 2006 I took delivery of my first smartphone, or handdator as I called it in my diary – “hand computer”. On the following day I got the machine on-line. It was a Qtek 9100, with a slide-out mechanical keyboard that I still really miss, a tiny screen, a stylus and a crappy camera. Since then I’ve had portable Internet access.

I was already a self-described “net head”, and a particular reason for me to get a smartphone was that I’d started blogging a few weeks previously: I wanted to be able to post no matter where I was. On 8 February, for instance, I almost managed to blog from a train. On 26 February I blogged while skiing cross-country. On 14 April I blogged about my hatred of aluminium bottle tops while sitting on my haunches in an Östergötland field. And on 1 June I blogged from the top of a tree in the middle of the Erstavik woods. Though I had no way of putting photographs on-line from my smartphone at the time.

Quite apart from the blogging aspect, this constant access to the net has of course changed my whole way of life. Google Maps means I don’t prepare for trips anywhere near as well as before. That info can be had on the fly. Even tickets for buses and planes are in the smartphone. And any discussion of factual matters is now simply resolved by someone checking Wikipedia. Ebooks, streaming music and podcast subscription software keep me entertained and my luggage light. And email and Facebook are always with me.

The Samsung Galaxy S6 that I use these days is of course so much better than that old Qtek in most respects. Except for the virtual keyboard and autocorrect. And for the ridiculous fact that phones are no longer made with anywhere to fasten a wrist strap. But what really hampers my smartphone use is a weakness that has been with these devices ever since 2006: I still have to charge the silly thing every night.



from ScienceBlogs http://ift.tt/1LtJoLO

Super Tuesday Open Thread (including my predictions) [Greg Laden's Blog]

Put your stuff in the comments!

Also being discussed on my Facebook page.

Here are my predictions, by the way:

[Screenshot: my predictions, 28 February 2016]



from ScienceBlogs http://ift.tt/1XXS8M8
