Transparency vs. Harassment [Greg Laden's Blog]

Michael Halpern, of the Union of Concerned Scientists, and climate scientist Michael Mann have written an editorial for Science, “Transparency and harassment.”

Open records laws hold universities and other public institutions accountable, protecting against biasing influences such as those we might see from funding sources. (See: Cry for me Willie Soon).

Over the last couple of decades, interpersonal conversations among researchers have shifted from the milieu of vibrating air molecules in a room (or transformed into electrical signals and transferred over a phone) to electronic form. Today, a very large part of the conversation ongoing among research colleagues, or teachers and students, ends up in emails or other forms of eCommunication.

Activists of any stripe have increasingly been using open records laws and regulations to access these private conversations, as well as early drafts of papers and other information. Halpern and Mann make the point that “[t]hese requests can attack and intimidate academics, threatening their reputations, chilling their speech, disrupting their research, discouraging them from tackling contentious topics, and ultimately confusing the public.”

They ask: what is the appropriate way to attain transparency without stifling research or producing an uncontrolled form of political weaponry ripe for abuse?

Not only is excessive and invasive use of open records procedures intrusive and intimidating, it is also costly. There have been several instances, cited by Halpern and Mann, of institutions spending significant resources on addressing requests for information, a cost that is paid whether or not the information is actually accessed in the end. And when this goes to court, the costs go up. In one case, Mann’s institution was hit with information requests that came ultimately as a result of a congressional investigation. Halpern and Mann report that in this case,

The Virginia Supreme Court ruled in 2014 that excessive disclosure could put the university at a “competitive disadvantage,” and cause “harm to university-wide research efforts, damage to faculty recruitment and retention, undermining of faculty expectations of privacy and confidentiality, and impairment of free thought and expression.”

Halpern and Mann suggest that institutions such as universities get up to speed, and get their researchers and faculty up to speed, on how to properly handle information requests, “not to determine the appropriate response, but to help employees understand how access to correspondence could be misused.” If this is done, there may ultimately emerge a set of standards that fill the gap between fair and reasonable disclosure and normal collegial conversation. In short, Halpern and Mann are asking for a modernization of disclosure and transparency law and procedure, with the ultimate goal of creating legitimate public trust in science and avoiding the stifling effects of misuse of open records law.

The editorial is here, but it may be behind a paywall.



from ScienceBlogs http://ift.tt/1QPRwFW


ST398 carriage and infections in farmers, United States [Aetiology]

I’ve been working on livestock-associated Staphylococcus aureus and farming now for almost a decade. In that time, work from my lab has shown that, first, the “livestock-associated” strain of methicillin-resistant S. aureus (MRSA) that was found originally in Europe and then in Canada, ST398, is in the United States in pigs and farmers; that it’s present here in raw meat products; that “LA” S. aureus can be found not only in the agriculture-intensive Midwest, but also in tiny pig states like Connecticut. With collaborators, we’ve also shown that ST398 can be found in unexpected places, like Manhattan, and that the ST398 strain appears to have originated as a “human” type of S. aureus which subsequently was transmitted to and evolved in pigs, obtaining additional antibiotic-resistance genes while losing some genes that help the bacterium adapt to its human host. We also found a “human” type of S. aureus, ST5, way more commonly than expected in pigs originating in central Iowa, suggesting that the evolution of S. aureus in livestock is ongoing, and is more complicated than just ST398 = “livestock” Staph.

However, with all of this research, there’s been a big missing link that I repeatedly get asked about: what about actual, symptomatic infections in people? How often do S. aureus that farmers might encounter on the farm make them ill? We tried to address this in a retrospective survey we published previously, but that research suffered from all the problems that retrospective surveys do–recall bias, low response rate, and the possibility that those who responded did so *because* they had more experience with S. aureus infections, thus making the question more important to them. Plus, because it was asking about the past, we had no way to know, even if they did report a prior infection, whether it was due to ST398 or another type of S. aureus.

So, in 2011, we started a prospective study that was just published in Clinical Infectious Diseases, enrolling over 1,300 rural Iowans (mostly farmers of some type, though we did include individuals with no farming exposures as well, and spouses and children of farmers) and testing them at enrollment for S. aureus colonization in the nose or throat. Like previous studies done by our group and others in the US, we found that pig farmers were more likely to be carrying S. aureus that were resistant to multiple antibiotics, and especially to tetracycline–a common antibiotic used while raising pigs. Surprisingly, we didn’t find any difference in MRSA colonization among groups, but that’s likely because we enrolled relatively small-scale farmers, rather than workers in concentrated animal feeding operations (CAFOs) like we had examined in prior research, who are exposed to many more animals living in more crowded conditions (and possibly receiving more antibiotics).

What was unique about this study, besides its large size, was that we then followed participants for 18 months to examine development of S. aureus infections. Participants sent us a monthly questionnaire telling us whether or not they had a possible Staph infection; describing the infection if there was one, including physician diagnosis and treatment; and, when possible, sending us a sample of the infected area for bacterial isolation and typing. Over the course of the study, which followed people for over 15,000 “person-months” in epi-speak, 67 of our participants reported developing over 100 skin and soft tissue infections. Some of them were “possibly” S. aureus–sometimes they didn’t go to the doctor, but they had a skin infection that matched the handout we had given them showing pictures of what Staph infections commonly look like. Other times they were cellulitis, which often can’t be definitively confirmed as caused by S. aureus without more invasive tests. Forty-two of the infections were confirmed as S. aureus, either by a physician or at the lab from a swab sent by the participant.
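For a rough sense of scale, here is a back-of-the-envelope sketch, using only the rounded figures quoted above (roughly 100 infections over roughly 15,000 person-months), not the adjusted estimates from the paper itself, of what that follow-up works out to as a crude incidence rate:

```python
# Crude incidence sketch from the rounded numbers quoted in this post
# (~100 skin and soft tissue infections over ~15,000 person-months of
# follow-up). Illustrative only; not the paper's adjusted estimates.

person_months = 15_000        # total follow-up reported above
infections = 100              # approximate number of SSTIs reported

rate_per_1000_pm = infections / person_months * 1_000
person_years = person_months / 12
rate_per_100_py = infections / person_years * 100

print(f"~{rate_per_1000_pm:.1f} SSTIs per 1,000 person-months")
print(f"~{rate_per_100_py:.1f} SSTIs per 100 person-years")
# prints roughly 6.7 per 1,000 person-months, or about 8 per 100 person-years
```

Expressing the denominator in person-time rather than a simple head count is what lets a prospective study like this handle participants who enrolled late, dropped out, or were followed for different lengths of time.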

Of the swabs we received that were positive, 3/10 were found to be ST398 strains–and all of those were in individuals who had contact with livestock. A fourth individual who also had contact with pigs and cows had an ST15 infection. Individuals lacking livestock contact had infections with more typical “human” strains, such as ST8 and ST5 (usually described as “community-associated” and “hospital-associated” types of Staph). So yes, ST398 is causing infections in farmers in the US–and very likely, these are flying under the radar, because 1) farmers really, really don’t like to go to the doctor unless they’re practically on their deathbed, and 2) even if they do, and even if the physician diagnoses and cultures S. aureus (which is not incredibly common–many diagnoses are made on appearance alone), there are very limited programs in rural areas to routinely type S. aureus. Even in Iowa, where invasive S. aureus infections were previously state-reportable, we know that fewer than half of the samples even from these infections ever made it to the State lab for testing–and for skin infections? Not even evaluated.

As warnings are sounded all over the world about the looming problem of antibiotic resistance, we need to rein in the denial of antibiotic resistance in the food/meat industry. Some positive steps are being made–just the other day, Tyson Foods announced they plan to eliminate human-use antibiotics in their chicken, and places like McDonald’s and Chipotle are using antibiotic-free chicken and/or other meat products in response to consumer demand. However, pork and beef still remain more stubborn when it comes to antibiotic use on farms, despite a recent study showing that resistant bacteria generated on cattle feed yards can be transmitted via the air, and studies by my group and others demonstrating that people who live in proximity to CAFOs or areas where swine waste is deposited are more likely to have MRSA colonization and/or infections–even if it’s with the “human” types of S. aureus. The cat is already out of the bag, the genie is out of the bottle, whatever image or metaphor you prefer–we need to increase surveillance to detect and mitigate these issues, better integrate rural hospitals and clinics into our surveillance nets, and work on mitigation of resistance development and on new solutions for treatment cohesively and with all stakeholders at the table. I don’t think that’s too much to ask, given the stakes.

Reference: Wardyn SE, Forshey BM, Farina S, Kates AE, Nair R, Quick M, Wu J, Hanson BM, O’Malley S, Shows H, Heywood E, Beane-Freeman LE, Lynch CF, Carrel M, Smith TC. Swine farming is a risk factor for infection with and high prevalence of multi-drug resistant Staphylococcus aureus. Clinical Infectious Diseases, in press, 2015. Link to press release.

 



from ScienceBlogs http://ift.tt/1QPRzlo


Dire Predictions: Understanding Climate Change, Must Read Book [Greg Laden's Blog]

Dire Predictions: Understanding Global Warming by Michael Mann and Lee Kump is the everyperson’s guide to the latest Intergovernmental Panel on Climate Change (IPCC) report. The IPCC issues a periodic set of reports on the state of global climate change, and has been doing so for more than two decades. It is a massive undertaking, and few have the time or training to read through and absorb it, yet it is very important that every citizen understand the reports’ implications. Why? Because human-caused climate change has emerged as the number one existential issue of the day, and individuals, corporations, and governments must act to implement sensible and workable changes in behavior and policy or there will be dire consequences.

Dire Predictions: Understanding Global Warming is a DK Publishing product, which means it is very visual, succinct, and, as is the case with all the DK products I’ve seen, well done. This is the second edition of the book, updated to reflect the most recent IPCC findings. The book gives a basic background on climate change, describes scientific projections and how they are developed, discusses impacts of climate change, and outlines vulnerability and modes of adaptation to change. The book finishes with a panoply of suggestions for solving the climate change crisis. Since Dire Predictions reflects the IPCC reports, it can be used as a primer for understanding the much more extensive and intensive original document, but it can also be used entirely on its own. I would recommend Dire Predictions for use in any of a wide range of classroom settings. It could be a primary text in middle school or high school Earth Systems classes, or a supplementary text in intro college courses. Anyone who is engaged in the climate change conversation and wants to be well informed simply needs to get this book, read it, and have it handy as a reference.

See: An interview with Michael Mann by yours truly

Lee Kump is a professor in Geosciences at Penn State, and author of a major textbook, “The Earth System.” Michael Mann is Distinguished Professor of Meteorology and Director of the Earth System Science Center at Penn State, and author of The Hockey Stick and the Climate Wars: Dispatches from the Front Lines (as well as countless scientific publications). Mann has been on top of the climate change issue for years. His work in the late 1990s, with colleagues, produced the famous “Hockey Stick” graph, which had two major impacts. First, it made the link between the recent century or so of direct observation of Earth’s surface temperatures (with thermometers and/or satellites) and the “paleo” record made up of proxy indicators of temperature, an essential step in placing modern climate change in long term perspective. Second, using this connection, Mann and colleagues showed that recent global warming, known to be primarily caused by human released greenhouse gas pollution, was already extreme and likely to get more extreme. Since then, Mann has been a key scientist involved with the IPCC, and has carried out many important research projects.

See: New Research on Tree Rings as Indicators of Past Climate

I asked Dr. Mann to address a handful of questions I had about Dire Predictions.

Question: Some might think of the title of the book as a bit extreme, even “alarmist,” to reference a term we often see used by climate science deniers. I assume you chose it carefully. Why “Dire Predictions: [subtitle]” instead of “Understanding Climate Change: [subtitle]”?

Answer: This was a mutual decision between the authors (Lee Kump and myself) and the publisher. The publisher felt this title both communicates the nature of the content of the book and the larger message of urgency; the predictions really are “dire” for the worst case scenarios, i.e. if we fail to act on climate change.

Question: What are the biggest changes, or perhaps most interesting changes, between the first and second edition, such as new research? Did any of the initial projections get less dire? More dire?

Answer: The main difference is that the book reflects the latest science as reported in the most recent (5th) IPCC assessment report. Some spreads remained unchanged, i.e. we felt there were no significant developments in the science since the last report (and last book). But in other cases, there were some substantial developments, i.e. we felt compelled to talk about the “Faux Pause” since it has gotten so much attention, and the issue of equilibrium climate sensitivity is discussed in more depth. The concept of the “Anthropocene” is dealt with more explicitly. And the issue of recent cold eastern U.S. winters and what it really means, and the unprecedented current drought in California are discussed.

Question: It seems that for decades the climate science has been settled sufficiently to realize that release of fossil Carbon will have serious consequences. Yet policy and technology changes to address this have been slow. Is this simply because such things take a long time, or have the efforts of science deniers been successful in slowing down action? How much better (or less dire) would things be in, say, 2050 had people, corporations, and governments accepted climate change as a serious matter 20 years ago? In other words, how much damage has science denialism done?

Answer: Oh, that’s a fundamentally important point. There is a huge “procrastination penalty” in not acting on the problem, and we’ve presumably committed to billions if not trillions of economic losses by not having acted yet. But there is still time to avert the worst and most costly damages, so there is an urgency of action unlike there has ever been before. This is something we tackle head on in the book.

See: Michael Mann Answers Questions From Dangerous Children About Ian Somerhalder

Question: Since you finished working on the second edition, are there any new research findings you wish you could somehow add to the book? Or, any changes in what is emphasized?

Answer: Indeed. As you know, Stefan Rahmstorf, I and others recently published an article in Nature Climate Change demonstrating that the AMOC (North Atlantic ocean circulation, the so-called “conveyor belt”) may be weakening even faster than the IPCC models indicate. Yet, we have downplayed that topic (though it is mentioned in a brand new spread on “Tipping Points”) because the consensus has leaned toward this being one of the less likely tipping points to occur in the decades ahead. This is a reminder that science is often fast-moving, and in this case, had we waited a year to publish the 2nd edition of DP, we might have chosen to actually give the AMOC collapse issue even more attention!

See: A list of climate change books

Question: I’m wondering if the projections for sea level rise in Dire Predictions are conservative with respect to more recent research. Also, there seems to be a more clear and explicit link between climate change and war or social unrest. Would these issues also have more attention if you had another shot at the book?

Answer: We do discuss the sea level rise and the fact that IPCC projections here (and for many other variables) have been historically too conservative. There is some discussion now about the role of water resources in national security and conflict, and the huge advances that are taking place in renewable energy (that is something that has changed dramatically since the first edition—and a reminder of the reasons there are for cautious optimism).


Also of interest:



from ScienceBlogs http://ift.tt/1Q3qEB6


Reading Diary: Books on Canadian politics: Harris, Wells, Delacourt, Savoie, Bourrie, Gutstein, Doern/Stoney, Pielke [Confessions of a Science Librarian]

This roundup includes reviews of a bunch of recent and not-so-recent reading about Canadian politics, in particular the Harper government and how it controls information. Some of the books are pretty directly related to science policy and some, not so much. These are all worth reading; some of them overlap while others take fairly unique approaches. All were useful to me in my long-term interest and work around Canadian science policy and in understanding the current Canadian Conservative government’s anti-science attitudes. All are solid additions to the growing body of work on the Harper government and its impacts on Canadian society and belong in every public policy collection at academic or public libraries.
 

Bourrie, Mark. Kill the Messengers — Stephen Harper’s Assault on Your Right to Know. Toronto: Patrick Crean, 2015. 400pp. ISBN-13: 978-1443431040

The books I’m reviewing here all basically have one purpose — to expose the Harper government’s anti-science, anti-democracy, anti-information leanings. They all have their individual strengths and weaknesses, and they all cover slightly different aspects of the Harper record. Some are a bit drier and more academic than others; some dive deep into particular topics while others are very general.

Mark Bourrie’s Kill the Messengers is a very fine addition to the canon. While ostensibly aimed at the information control aspects of the Harper Tories, it actually covers a fairly broad swath of what’s been going on, and I think that’s the case because pretty well all aspects of their dysfunction circle around information control. From attacking libraries and archives to muzzling scientists to whipping up terrorism terror, it’s all about information.

And Bourrie does a great job of giving an accessible, detailed account of the “kill the information messenger” aspects of the Harper regime, as all-pervasive as they are.

What Bourrie does that’s a bit different — his added value, as it were, in oh-so-appropriate corporate speak — is place what Harper is doing in the context of the collapse of traditional media, and how what we have left is hobbled and sycophantic like never before. Where there’s less coverage, there’s less accountability. He explains how the Conservatives have used their own larger-than-ever-before communications apparatus to fill the void, replacing news with propaganda.

I highly recommend Bourrie’s book. If you’ve read all the ones that came before, like I did, there might be some redundancy but that’s probably not the case for most people. The long form census, the history-bending military fetish, the intimidation of charities, the McCarthyistic “enemy lists” are all covered very well. He doesn’t cover science or libraries as much as I’d hoped but at least Chris Turner has covered science exhaustively in his book. We’re still waiting for the definitive treatment of the Harper assault on libraries and archives, but I guess that will have to wait.

 

Delacourt, Susan. Shopping for Votes: How Politicians Choose Us and We Choose Them. Toronto: Douglas & McIntyre, 2013. 320pp. ISBN-13: 978-1926812939

Shopping for Votes is easily one of the most fascinating and important books on Canadian politics I’ve read in a long time. It’s not so much about the Conservatives — though they serve as the main case study — as it is about how electoral politics has come to rely on marketing, polling and micro-targeting as the main tools for fighting and winning elections. It traces the transition in the political class’s conception of the voting public from citizens to consumers of politics, and how this plays into the hands of both governments and the media/corporate elites. Not to mention how that conception of voters-as-consumers has fed into and paralleled the rise of attack ads and negative politics. It’s a tool box largely imported into Canada from the US by the Conservatives, but more and more it’s being used by all the parties.

This is an illuminating and frightening book. Highly recommended. Read this book.

 

Doern, G. Bruce and Christopher Stoney, editors. How Ottawa Spends, 2014-2015: The Harper Government – Good to Go?. Montreal: McGill-Queen’s University Press, 2014. 216pp. ISBN-13: 978-0773544444

This is the most recent in an annual series of books that discuss Canadian federal politics through the lens of, well, how Ottawa spends. I guess the idea is that you can talk about high-falutin’ policies all you want, but reality is where the budget dollars hit the road. Kind of like an Annual Review of Canadian Politics, with thematic contributions by a changing cast of experts every year. In the last little while, I’ve read a good chunk of the volumes covering the Harper years, mostly to get a sense of the longer context on changes to science policy through that budgetary lens. Not all the articles are directly about budgets or spending per se; many are more broadly about governmental priorities or programs.

This 2014-2015 volume at hand has four articles with a science or environmental focus that I read with great interest. All provided solid coverage of their topic area and gave me great context and current information that was very handy for my presentation on Canadian science policy and the Harper government last fall.

Those articles are:

  • Harper’s Partisan Wedge Politics: Bad Environmental Policy and Bad Energy Policy by Glen Toner and Jennifer McKee
  • One of These Things Is Not Like the Other? Bottom-Up Reform, Open Information, Collaboration, and the Harper Government by Amanda Clarke
  • Managing Canada’s Water: The Harper Era by Davide P. Cargnello, Mark Brunet, Matthew Retallack, and Robert Slater
  • How Accurate Is the Harper Government’s Misinformation? Scientific Evidence and Scientists in Federal Policy Making by Kathryn O’Hara and Paul Dufour

Perhaps not surprisingly, the article that was the most useful for me was the O’Hara/Dufour one on muzzling of Canadian scientists. They provided a great overview of the controversy, the facts and how it was covered in the media. The Toner/McKee article was also very useful in covering environment and energy, a topic that’s covered fairly regularly in the various volumes of the series.

This series is required reading for anyone interested in a detailed view of Canadian politics from the inside.

 

Gutstein, Donald. Harperism: How Stephen Harper and his think tank colleagues have transformed Canada. Toronto: Lorimer, 2014. 288pp. ISBN-13: 978-1459406636

Conservative think tanks FTW! I bet they never get audited by the Canada Revenue Agency!

But they definitely have a long term and lasting impact on Canadian government policy. Or at least that’s the thesis of Donald Gutstein’s recentish book Harperism: How Stephen Harper and his think tank colleagues have transformed Canada. And a pretty convincing case he makes of it too, in a fairly short and focused book that still covers a lot of ground.

Basically, the Conservatives have used think tanks as a way of framing key issues that they want to deal with during their mandate. Gutstein does a good job of showing what those core conservative ideas are in his chapter titles: Reject unions and prosper; Liberate dead capital on First Nations reserves; Counter the environmental threat to the market; Undermine scientific knowledge; Deny income inequality; Fashion Canada as a great nation.

Those pretty well encompass the Harperism movement, don’t they?

Gutstein kicks off the book with one of the best extended definitions of neoliberalism that I’ve seen, including going into some depth about the influence of Friedrich Hayek on both Harperism in particular and neoliberalism in general. The meat of the book is a subject-by-subject exploration of how various think tanks and “thought leaders,” such as the Fraser Institute, are used both to generate ideas and to normalize and communicate them to the public. The use of bogus ideas such as “ethical oil” or the misleading buzzword “sound science” is also explored.

This is a well-researched, precisely-argued book that adds to the growing body of analysis of the roots and impacts of the current Harper government. Recommended.

 

Harris, Michael. Party of One: Stephen Harper And Canada’s Radical Makeover. Toronto: Viking, 2014. 544pp. ISBN-13: 978-0670067015

The most recent of the general books to deal with the Harper years, this is probably also the one I got the least out of, probably mostly because I’ve read so many other books (and articles and blog posts and…) about Harper and his merry gang of wreckers. But also at least in part because Harris gives the most extensive coverage to the Harper controversies that I find the least compelling and the least damning/important. I’m talking about the robocalls scandal, which in the absence of a smoking gun seems important but not the most important in the list of Harper’s sins. Yes, we all “know” that the election shenanigans originated at the highest levels, but “knowing” isn’t the same as knowing. I’m also talking about Mike Duffy and the senate scandals. To me the situation is too analogous to the previous Liberal government’s sponsorship scandals to regard it as anything other than politics as usual, as opposed to something that marks the Harper government as uniquely disastrous compared to any other recent government. There are certainly plenty of those disastrous circumstances to go around.

And Harris, to his credit, covers most of those pretty well too, from the appalling treatment of veterans, to the situation at the Department of Fisheries and Oceans libraries to the muzzling of scientists to the various “bad boys” like Bruce Carson, Arthur Porter and Nathan Jacobson.

Harris does a pretty good job of covering the later years of the Harper government, covering some stories that the other very general books didn’t. This book is recommended.

 

Pielke, Jr., Roger A. The Honest Broker: Making Sense of Science in Policy and Politics. New York: Cambridge University Press, 2007. 188pp. ISBN-13: 978-0521694810

Roger Pielke is a bit of a controversial figure in the science policy field, which I didn’t quite realize when I picked up this book as a general introduction to science policy. Last fall I needed something to give me a theoretical introduction as a way to ground the presentation I was going to be giving as part of York University’s Science and Technology Studies Seminar Series. So I searched around Amazon and a few other places to see what I could find and this one seemed a decent choice.

And it was, for a first book. I found that the way he framed the relationship between scientists and society in terms of four idealized roles — pure scientist, science arbiter, issue advocate or honest broker — was useful for the way I wanted to frame my own presentation. As I got further into the book, some of the parts that did make me a bit queasy were ultimately reflected in what I learned about him over time. That being said, I did find his book to be a lively and useful introduction to the relationship between science and society: short enough to be easily digested while still having enough depth intellectually to be useful and challenging.

I probably need to read a few more general introductory books before the shape of the field really emerges in my mind, and before the issues and controversies start to make coherent sense to me. Pielke’s book was probably as good a place as any to start on that journey.

(Yeah, yeah, this one’s not actually about Canadian politics but I see this as being all part of one large science policy project.)

 

Savoie, Donald J. Whatever Happened to the Music Teacher?: How Government Decides and Why. Montreal: McGill-Queen’s University Press, 2013. 336pp. ISBN-13: 978-0773541108

“How Government Decides and Why.” Think of this subtitle as slightly re-worded as “How does government decide and why?” That’s the question that Donald J. Savoie’s book Whatever Happened to the Music Teacher? tries to answer. And what would that answer be? Mostly, “It’s complicated,” for both how and why.

So in a similar way that the Pielke book helped me frame the scientist/society relationship, the Savoie book certainly helped me think more carefully about the threefold interface between government and the bureaucracy and citizens, with the emphasis on how elected officials interact with the civil service.

While not specifically focused on the Harper years, Savoie does use them as a case study as he examines how the civil service and the elected officials have evolved in their relationship over the years. Particularly interesting is how he goes into great detail on how, over time, as the government has become bigger and more complex, it has become much more difficult for politicians to make sense of detailed budgets and spending reports — to the point where they no longer even seem to try any more.

Which dovetails nicely into some of Savoie’s other themes. The spenders versus the guardians. The relationships between the various deputy and associate deputy and associate deputy assistant ministers and all the rest of the ever-proliferating levels of administration. The goal of government as blame-avoidance and butt-covering of those above you in the hierarchy to keep them out of trouble, to create a regime of “no surprises.” Savoie again and again debunks the idea that private sector managerialism has any place in government or that it ever has been or ever really could be successful. That spending decisions get shifted and morphed by stealth rather than purposeful planning, all towards more complex administration. Planning relies less on evidence and more on opinion. The rise and rise of endless spin. The cocooning of the PM among a small circle of elite advisors.

And more.

Which gets us back to the original question. How and why do governments decide? Basically, the answer is that it’s complicated and messy, not a linear process, not a process that’s easy to predict or easily quantify.

Making governing a very human endeavor.

Which gets me to a weird place when I think about the book. While it can be a bit dry, I certainly learned a lot of rather intricate detail about how government works, stuff I never knew or even really wanted to know, which makes the book definitely worthwhile. I ended the book with a much greater appreciation of the messiness of government than when I started, so I guess that makes the read worthwhile.

 

Wells, Paul. The Longer I’m Prime Minister: Stephen Harper and Canada, 2006-. Toronto: Random House, 2013. 448pp. ISBN-13: 978-0307361325

One of the oldest books in this roundup, Paul Wells’s book is probably also the first to really look at the Harper government’s overall legacy in a serious way. And of the books on this list, it’s also the liveliest and most entertaining. Wells has a great way with a juicy story. And he certainly doesn’t pull any punches — he’s pretty blunt about the good, bad and downright ugly of the early years of the Harper majority, about Harper’s baldly stated desire to remake Canada as a conservative (and Conservative) country. “The longer I’m prime minister,” as he’s fond of saying, the less we’ll even recognize this place.

Perhaps a bit dated now, with so much water under the bridge these last few years, I would still recommend this book for a solid insight into the first half of the Harper government’s reign of error.

 

So what have I learned from all this reading? Aside from feeling “holy crap, have I ever read a lot of books about Canadian politics in the last few years?”

Somehow I think I should feel a bit more certain about what’s going on or have a better sense of how we could fix it if we really wanted to. But in fact just the opposite. Like initial explorations of any field of study, those first excursions really just illuminate both how much you don’t know and just how slippery solutions are.

And by solutions, I don’t just mean electing another government; that’s the easy part. I hope. What I mean is fixing the larger political climate in Canada so that evidence matters more. So that compassion matters more. So that micro-targeting narrow self-interested voter segments with tax cut goodies matters less.

Understanding that context and framing those solutions is, if anything, even more elusive than it was when I embarked on this reading project a few years ago. And what it means is that even when the “Canadian War on Science” launched by the Conservatives is over, it doesn’t mean that all the Canadian science policy battles have been won. Perhaps it means that rebuilding Canadian science will be just as important, and finding that path will be just as fraught.

A new process and a positive project that will have just as much place for an old science librarian as the old battles.

As a bonus, here are some of the other Canadian political books I’ve read and reviewed recently.

This roundup includes reviews of a bunch of recent and not-so-recent reading about Canadian politics, in particular the Harper government and how it controls information. Some of the books are pretty directly related to science policy and some, not so much. These are all worth reading, some kind of overlap while others present fairly unique approaches. All were useful to me in my long term interest and work around Canadian science policy and in understanding the current Canadian Conservative government’s anti-science attitudes. All are solid additions to the growing body of work on the Harper government and its impacts on Canadian society and belong in every public policy collection at academic or public libraries.
 

Bourrie, Mark. Kill the Messengers — Stephen Harper’s Assault on Your Right to Know. Toronto: Patrick Crean, 2015. 400pp. ISBN-13: 978-1443431040

The books I’m reviewing here all basically have one purpose — to expose the Harper government’s anti-science, anti-democracy, anti-information leanings. They all have their individual strengths and weaknesses, they all cover slightly different aspects of the Harper record. Some are a bit dryer and more academic that others, some deep dive some topics and others are very general.

Mark Bourrie’s Kill the Messengers is a very fine addition to the cannon. While ostensibly aimed at the information control aspects of the Harper Tories, it actually covers a fairly broad swath of what’s been going on, and I think that’s the case because pretty well all aspects of their dysfunction circle around information control, from attacking libraries and archives to muzzling scientists to whipping up terrorism terror, it’s all about information.

And Bourrie does a great job of giving an accesible, detailed account of the “kill the information messenger” aspects of the Harper regime, as all-pervasive as they are.

What Bourrie does that’s a bit different — his added value, as it were in oh-so-appropriate corporate speak — is place what Harper is doing in the context of the collapse of traditional media, how what we have left if hobbled and sycophantic like never before. Where there’s less coverage, there’s less accountability. He explains how the Conservatives have used their own larger-than-ever-before communications apparatus to fill the void, replacing news with propaganda.

I highly recommend Bourrie’s book. If you’ve read all the ones that came before, like I did, there might be some redundancy but that’s probably not the case for most people. The long form census, the history-bending military fetish, the intimidation of charities, the McCarthyistic “enemy lists” are all covered very well. He doesn’t cover science or libraries as much as I’d hoped but at least Chris Turner has covered science exhaustively in his book. We’re still waiting for the definitive treatment of the Harper assault on libraries and archives, but I guess that will have to wait.

 

Delacourt, Susan. Shopping for Votes: How Politicians Choose Us and We Choose Them. Toronto: Douglas & McIntyre, 2013. 320pp. ISBN-13: 978-1926812939

Shopping for Votes is easily one of the most fascinating and important books on Canadian politics I’ve read in a long time. It’s not only or even mostly about the Conservatives — though they serve as the main case study — as it is about how electoral politics has become about using marketing, polling and micro-targeting as the main tools for fighting and winning elections. It traces the transition of the the political class’s conception of the voting public as citizen to the voting public as consumers of politics and how this plays into the hands of both governments and the media/corporate elites. Not to mention how that conception of voters-as-consumers has fed into and paralleled the rise of attack ads and negative politics. It’s a tool box largely imported into Canada from the US by the Conservatives but more and more it’s being use by all the parties.

This is an illuminating and frightening book. Highly recommended. Read this book.

 

Doern, G. Bruce and Christopher Stoney, editors. How Ottawa Spends, 2014-2015: The Harper Government – Good to Go?. Montreal: McGill-Queen’s University Press, 2014. 216pp. ISBN-13: 978-0773544444

This is the most recent in a annual series of books that discuss Canadian federal politics through the lens of, well, how Ottawa spends. I guess the idea is that you can talk about high-falutin’ policies all you want, but reality is where the budget dollars hit the road. Kind of like an Annual Review of Canadian Politics, with thematic contributions by a changing cast of experts every year. In the last little while, I’ve read a good chunk of the volumes covering the Harper years mostly to get a sense of the longer context on changes to science policy through that budgetary lens. Not all the articles are directly about budgets or spending per se, but often about governmental priorities or programs.

The 2014-2015 volume at hand has four articles with a science or environmental focus that I read with great interest. All provided solid coverage of their topic areas and gave me context and current information that was very handy for my presentation on Canadian science policy and the Harper government last fall.

Those articles are:

  • Harper’s Partisan Wedge Politics: Bad Environmental Policy and Bad Energy Policy by Glen Toner and Jennifer McKee
  • One of These Things Is Not Like the Other? Bottom-Up Reform, Open Information, Collaboration, and the Harper Government by Amanda Clarke
  • Managing Canada’s Water: The Harper Era by Davide P. Cargnello, Mark Brunet, Matthew Retallack, and Robert Slater
  • How Accurate Is the Harper Government’s Misinformation? Scientific Evidence and Scientists in Federal Policy Making by Kathryn O’Hara and Paul Dufour

Perhaps not surprisingly, the article that was the most useful for me was the O’Hara/Dufour one on the muzzling of Canadian scientists. They provided a great overview of the controversy, the facts, and how it was covered in the media. The Toner/McKee article was also very useful in covering environment and energy, a topic that’s covered fairly regularly in the various volumes of the series.

This series is required reading for anyone interested in a detailed view of Canadian politics from the inside.

 

Gutstein, Donald. Harperism: How Stephen Harper and his think tank colleagues have transformed Canada. Toronto: Lorimer, 2014. 288pp. ISBN-13: 978-1459406636

Conservative think tanks FTW! I bet they never get audited by the Canada Revenue Agency!

But they definitely have a long term and lasting impact on Canadian government policy. Or at least that’s the thesis of Donald Gutstein’s recentish book Harperism: How Stephen Harper and his think tank colleagues have transformed Canada. And a pretty convincing case he makes of it too, in a fairly short and focused book that still covers a lot of ground.

Basically, the Conservatives have used think tanks as a way of framing the key issues they want to deal with during their mandate. Gutstein gives a good sense of what those core conservative ideas are in his chapter titles: Reject unions and prosper; Liberate dead capital on First Nations reserves; Counter the environmental threat to the market; Undermine scientific knowledge; Deny income inequality; Fashion Canada as a great nation.

Those pretty well encompass the Harperism movement, don’t they?

Gutstein kicks off the book with one of the best extended definitions of neoliberalism that I’ve seen, including going into some depth about the influence of Friedrich Hayek on both Harperism in particular and neoliberalism in general. The meat of the book is a subject-by-subject exploration of how various think tanks and “thought leaders,” such as the Fraser Institute, are used both to generate ideas and to normalize and communicate them to the public. The use of bogus ideas such as “ethical oil” and the misleading buzzword “sound science” is also explored.

This is a well-researched, precisely-argued book that adds to the growing body of analysis of the roots and impacts of the current Harper government. Recommended.

 

Harris, Michael. Party of One: Stephen Harper And Canada’s Radical Makeover. Toronto: Viking, 2014. 544pp. ISBN-13: 978-0670067015

The most recent of the general books to deal with the Harper years, this is probably also the one I got the least out of, mostly because I’ve read so many other books (and articles and blog posts and…) about Harper and his merry gang of wreckers. But it’s also at least in part because Harris gives the most extensive coverage to the Harper controversies that I find the least compelling and the least damning or important. I’m talking about the robocalls scandal, which in the absence of a smoking gun seems important but hardly the worst item on the list of Harper’s sins. Yes, we all “know” that the election shenanigans originated at the highest levels, but “knowing” isn’t the same as knowing. I’m also talking about Mike Duffy and the Senate scandals. To me the situation is too analogous to the previous Liberal government’s sponsorship scandal to regard it as anything other than politics as usual, rather than something that marks the Harper government as uniquely disastrous compared to any other recent government. There are certainly plenty of genuinely disastrous circumstances to go around.

And Harris, to his credit, covers most of those pretty well too, from the appalling treatment of veterans, to the situation at the Department of Fisheries and Oceans libraries to the muzzling of scientists to the various “bad boys” like Bruce Carson, Arthur Porter and Nathan Jacobson.

Harris does a pretty good job of covering the later years of the Harper government, covering some stories that the other very general books didn’t. This book is recommended.

 

Pielke, Jr., Roger A. The Honest Broker: Making Sense of Science in Policy and Politics. New York: Cambridge University Press, 2007. 188pp. ISBN-13: 978-0521694810

Roger Pielke is a bit of a controversial figure in the science policy field, which I didn’t quite realize when I picked up this book as a general introduction to science policy. Last fall I needed something to give me a theoretical introduction as a way to ground the presentation I was going to be giving as part of York University’s Science and Technology Studies Seminar Series. So I searched around Amazon and a few other places to see what I could find and this one seemed a decent choice.

And it was, for a first book. I found that the way he frames the relationship between scientists and society in terms of four idealized roles — pure scientist, science arbiter, issue advocate or honest broker — was useful for the way I wanted to frame my own presentation. As I got further into the book, some parts did make me a bit queasy, and those misgivings were ultimately reflected in what I learned about him over time. That being said, I did find the book to be a lively and useful introduction to the relationship between science and society: short enough to be easily digested while still having enough intellectual depth to be useful and challenging.

I probably need to read a few more general introductory books before the shape of the field really starts to emerge in my mind, and before the issues and controversies start to make coherent sense to me. Pielke’s book was probably as good a place as any to start on that journey.

(Yeah, yeah, this one’s not actually about Canadian politics but I see this as being all part of one large science policy project.)

 

Savoie, Donald J. Whatever Happened to the Music Teacher?: How Government Decides and Why. Montreal: McGill-Queen’s University Press, 2013. 336pp. ISBN-13: 978-0773541108

“How Government Decides and Why.” Think of this subtitle as slightly re-worded as “How does government decide, and why?” That’s the question that Donald J. Savoie’s book Whatever Happened to the Music Teacher? tries to answer. And what would that answer be? Mostly, “It’s complicated,” for both the how and the why.

So in a similar way that the Pielke book helped me frame the scientist/society relationship, the Savoie book certainly helped me think more carefully about the threefold interface between government, the bureaucracy and citizens, with the emphasis on how elected officials interact with the civil service.

While not specifically focused on the Harper years, Savoie does use them as a case study as he examines how the civil service and elected officials have evolved in their relationship over the years. Particularly interesting is how he goes into great detail on the way that, as government has become bigger and more complex, it has become much more difficult for politicians to make sense of detailed budgets and spending reports — to the point where they no longer even seem to try.

Which dovetails nicely into some of Savoie’s other themes. The spenders versus the guardians. The relationships between the various deputy and associate deputy and associate deputy assistant ministers and all the rest of the ever-proliferating levels of administration. The goal of government as blame-avoidance and butt-covering of those above you in the hierarchy to keep them out of trouble, to create a regime of “no surprises.” Savoie again and again debunks the idea that private sector managerialism has any place in government or that it ever has been or ever really could be successful. That spending decisions get shifted and morphed by stealth rather than purposeful planning, all towards more complex administration. Planning relies less on evidence and more on opinion. The rise and rise of endless spin. The cocooning of the PM among a small circle of elite advisors.

And more.

Which gets us back to the original question. How and why do governments decide? Basically, the answer is that it’s complicated and messy: not a linear process, not a process that’s easy to predict or easily quantify.

Making governing a very human endeavor.

Which gets me to a weird place when I think about the book. While it can be a bit dry, I certainly learned a lot of rather intricate detail about how government works, stuff I never knew or even really wanted to know. And I ended the book with a much greater appreciation of the messiness of government than when I started, which definitely makes the read worthwhile.

 

Wells, Paul. The Longer I’m Prime Minister: Stephen Harper and Canada, 2006-. Toronto: Random House, 2013. 448pp. ISBN-13: 978-0307361325

One of the oldest books in this roundup, Paul Wells’s is probably also the first to really look at the Harper government’s overall legacy in a serious way. Of the books on this list, it’s also the liveliest and most entertaining. Wells has a great way with a juicy story. And he certainly doesn’t pull any punches — he’s pretty blunt about the good, the bad and the downright ugly of the early years of the Harper majority, and about Harper’s baldly stated desire to remake Canada as a conservative (and Conservative) country. “The longer I’m prime minister,” Harper is fond of saying, the less we’ll even recognize this place.

Perhaps a bit dated now, with so much water under the bridge these last few years, the book is still one I’d recommend for solid insight into the first half of the Harper government’s reign of error.

 

So what have I learned from all this reading? Aside from the feeling of “holy crap, have I ever read a lot of books about Canadian politics in the last few years?”

Somehow I think I should feel a bit more certain about what’s going on, or have a better sense of how we could fix it if we really wanted to. But in fact it’s just the opposite. Like initial explorations of any field of study, those first excursions really just illuminate both how much you don’t know and just how slippery solutions are.

And by solutions, I don’t just mean electing another government; that’s the easy part. I hope. What I mean is fixing the larger political climate in Canada so that evidence matters more. So that compassion matters more. So that micro-targeting narrow, self-interested voter segments with tax-cut goodies matters less.

Understanding that context and framing those solutions is, if anything, even more elusive than it was when I embarked on this reading project a few years ago. And what it means is that even when the “Canadian War on Science” launched by the Conservatives is over, it doesn’t mean that all the Canadian science policy battles have been won. Perhaps it means that rebuilding Canadian science will be just as important and finding that path will be just as fraught.

A new process and a positive project that will have just as much place for an old science librarian as the old battles did.

As a bonus, here are some of the other Canadian political books I’ve read and reviewed recently.

Water abundant in first billion years after Big Bang?

This Hubble image features dark knots of gas and dust known as “Bok globules,” which are dense pockets in larger molecular clouds. Similar islands of material in the early universe could have held as much water vapor as we find in our galaxy today, despite containing a thousand times less oxygen. Image credit: NASA, ESA, and The Hubble Heritage Team

How soon after the Big Bang could water have existed?

Not right away, say scientists, because water molecules contain oxygen, and oxygen had to be formed in the first stars. Then that oxygen had to disperse and unite with hydrogen in significant amounts.

But despite these complications, water vapor could have been just as abundant in pockets of space a billion years after the Big Bang as it is today. That’s according to a new study by a team from the Harvard-Smithsonian Center for Astrophysics (CfA) and Tel-Aviv University.

Avi Loeb is an astrophysicist at the Harvard-Smithsonian Center for Astrophysics. Loeb said:

We looked at the chemistry within young molecular clouds containing a thousand times less oxygen than our sun. To our surprise, we found we can get as much water vapor as we see in our own galaxy.

The early universe lacked elements heavier than hydrogen and helium. The first generation of stars are believed to have been massive and short-lived. Those stars generated elements like oxygen, which then spread outward via stellar winds and supernova explosions. This resulted in “islands” of gas enriched in heavy elements. Even these islands, however, were much poorer in oxygen than gas within the Milky Way today.

The team examined the chemical reactions that could lead to the formation of water within the oxygen-poor environment of early molecular clouds. They found that at temperatures around 80 degrees Fahrenheit (300 Kelvin), abundant water could form in the gas phase despite the relative lack of raw materials.
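
As an aside, the unit pairing in that figure is easy to check. Here is a minimal back-of-the-envelope sketch (my own check, not part of the study) confirming that 300 kelvin works out to roughly 80 degrees Fahrenheit:

    # Back-of-the-envelope check (not from the paper): convert ~300 kelvin to Fahrenheit.
    def kelvin_to_fahrenheit(kelvin):
        """Convert a temperature from kelvin to degrees Fahrenheit."""
        return (kelvin - 273.15) * 9.0 / 5.0 + 32.0

    print(kelvin_to_fahrenheit(300.0))  # ~80.3 F, matching "around 80 degrees Fahrenheit"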

Although ultraviolet light from stars would break apart water molecules, after hundreds of millions of years an equilibrium could be reached between water formation and destruction. The team found that equilibrium abundance to be similar to the levels of water vapor seen in the local universe.

This current work calculates how much water could exist in the gas phase within molecular clouds that will form later generations of stars and planets. It doesn’t address how much water would exist in ice form (which dominates within our galaxy) or what fraction of all the water might actually be incorporated into newly forming planetary systems.

This work has been accepted for publication in the Astrophysical Journal Letters and is available online.

Bottom line: According to a new study by a team from the Harvard-Smithsonian Center for Astrophysics (CfA) and Tel-Aviv University, water vapor could have been just as abundant in pockets of space a billion years after the Big Bang as it is today.

Read more from Harvard-Smithsonian Center for Astrophysics



from EarthSky http://ift.tt/1DMFOD4

Update on Progress M-27M / 59P

Main Control Room at ESA's European Space Operations Centre, Darmstadt, Germany. Credit: ESA/P. Shlyaev

Editor's note: As an ISS partner agency, ESA is in close contact with the Russian and US authorities regarding the Progress M-27M / 59P mission situation. ESA's Space Debris Office is also following events closely, and is ensuring that ESA Member States are fully informed as to any potential reentry risk.

The unmanned Progress lifted off on a Soyuz launcher on 28 April at 07:09 GMT on a resupply mission to the ISS. Shortly after launch, Progress experienced technical difficulties, with the result that teams at Russian mission control could not command the spacecraft.

On 29 April, the docking of Progress with the ISS was called off. The ISS itself is in absolutely no danger and has sufficient reserves of food, fuel, water, etc.

There has been no confirmation as to the cause of the failure, nor has there been any success in regaining control of the Progress vessel. Russian flight controllers are continuing to assess the vehicle's condition and determine future actions.

The comments below are provided by Dr Holger Krag, Head of ESA's Space Debris Office at ESOC, Darmstadt.

It is now known that Russian flight control teams cannot regain control of Progress; it will inevitably undergo an uncontrolled reentry within about ten days, as its orbit steadily decays due to natural atmospheric drag and gravity forces.

In an uncontrolled reentry, the vessel in principle could reenter over any point of land or sea between approximately 51 deg N and 51 deg S latitudes, corresponding to its current orbit.

Uncontrolled reentries of space hardware are not an uncommon occurrence.

Progress M-27M / 59P liftoff on 28 April 2015. Credit: ROSCOSMOS

In the case of Progress, with a mass of about 7 tonnes, we can expect that most of the craft will burn up during reentry. However, we cannot exclude the chance that some portion of its structure, for example the heavy docking mechanism or tanks and thrusters, could survive reentry to reach the surface.

Even so, Earth has a very large surface, and it is most likely that the reentry will occur over water, desert or unpopulated areas.

At ESA's Space Debris Office, we are in contact with our US and Russian colleagues. The orbital tracking data we receive from the US radar system are being refined with additional data provided by Germany's TIRA tracking radar (operated by Fraunhofer FHR), and we are using this to make our own reentry estimates and communicate these with ESA Member States.

At this time, and absent regaining control of Progress, we expect that a reentry would occur around 9 May, with an uncertainty of plus or minus 2 days.

The two-day uncertainty is due to the unpredictability of the forces working on the vessel, among other factors, and is unavoidable at this time.

It is now impossible to say over what point of Earth's surface reentry will occur, as the current time uncertainty (at orbital speeds) translates into thousands of kilometres of distance.

As we get closer to 9 May, our ability to estimate the reentry time will improve. On the day prior to reentry, i.e., 8 May, we'll be able to forecast the time of reentry with a much higher degree of confidence. It will also become possible to exclude certain land/sea areas.

In six decades of space flight, no person has ever been hit by any piece of reentering satellite or debris and there is nothing related to this situation to indicate otherwise. We all accept much higher risks in our daily lives by driving a car or flying in airplanes.

It is most important to understand that the risk on ground to anyone is extremely small.
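
To get a feel for Krag’s point that a timing uncertainty translates into thousands of kilometres along the ground track, here is a rough illustration (an editorial sketch assuming a typical low-Earth-orbit speed of about 7.6 km/s, not an ESA calculation):

    # Rough editorial illustration (not an ESA figure): distance travelled along the
    # orbital track during a given timing uncertainty, at an assumed low-Earth-orbit speed.
    ORBITAL_SPEED_KM_S = 7.6  # assumed approximate speed for a low-Earth orbit

    def along_track_distance_km(timing_uncertainty_s):
        """Distance covered along the orbit during the timing uncertainty, in km."""
        return ORBITAL_SPEED_KM_S * timing_uncertainty_s

    print(along_track_distance_km(60))    # one minute  -> ~456 km
    print(along_track_distance_km(3600))  # one hour    -> ~27,000 km, well over half an orbit

At that speed, an uncertainty of even a few hours spans multiple complete orbits, which is why no meaningful reentry location can be given this far in advance.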

Related information

NASA ISS blog

Roscosmos web

Visualisation of Progress M-27M / 59P orbit

 

 



from Rocket Science » Rocket Science http://ift.tt/1AkEcz0

Sensors Key to Preserving Battlefield Edge

Science and technology programs involving sensors and other capabilities are on the rise. The reason for that support is that sensors are relatively inexpensive compared to the big weapons systems they protect, and they provide protection for soldiers.

Dr. Mike Grove, principal deputy for Technology and Countermine, Army Communications-Electronics Research, Development & Engineering Center, Night Vision and Electronic Sensors Directorate, speaks to industry representatives during a National Defense Industrial Association-sponsored Sensors Community of Interest seminar in Springfield, Va., March 25, 2015. (Photo: David Vergun/Released)

Dr. Mike Grove, principal deputy for Technology and Countermine, Army Communications-Electronics Research, Development and Engineering Center, Night Vision and Electronic Sensors Directorate, focused on the need for improved sensors during a speech at the National Defense Industrial Association-sponsored Sensors Community of Interest seminar in Springfield, Virginia, March 25.

While military sensors are inexpensive in the big scheme of modernization, they are actually quite expensive compared to sensors used in the civilian sector because military sensors must be extremely light, rugged and powerful, Grove said. It would seem convenient and logical to simply repurpose commercial sensors, but military sensors are very specialized in battlefield surveillance and target acquisition, two broad focus areas of Grove’s Sensors Community of Interest, or CoI.

GRUNT-PROOF SENSORS

Sensors used by the Army and Marine Corps are among the hardest to develop, Grove said, because they become part of the soldiers’ load. Soldiers slog through mud and snow and their equipment takes a beating. That means miniaturization, lightweight materials and use of an efficient power source are prime considerations for soldiers, as well as the small, unmanned aerial systems they carry to the battlefield.

Sensors for the Air Force and Navy, by contrast, are a lot easier to develop because there is a lot more room in ships and aircraft to place them, and the weight of a sensor is negligible compared with the added load for a dismounted soldier, he said.

PACIFIC PATHWAYS SENSORS

The Pacific region in particular calls for a special category of “wide-area persistent surveillance” sensors, both active and passive, that can overcome what Grove called the clutter of dense jungle interspersed with cities that are fast becoming urban-jungle megacities.

Ideal sensors for those areas would allow soldiers long-range standoff sensory capabilities. That means those sensors would need to be especially powerful. One idea that offers possibilities is emplacing passive sensors on the ocean floor and awakening them when needed, thereby conserving their power supply.

The Navy is now using facial recognition sensors that can identify persons 100 meters away. They are using those sensors to see who is coming aboard their ships, but if the distance could be increased, soldiers could use them to distinguish friend from foe, Grove added.

To see through dense foliage, the Sensor CoI is exploring the use of Laser Illuminated Detection and Ranging (LADAR) technologies. Simply put, LADAR creates 3D images using laser range-finding sensors. Powerful algorithms are used to merge many images and separate the signal from the noise, with the signal being “focuses of interest” and the noise being jungle clutter.
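
As a toy illustration of the underlying geometry (my own sketch, not part of any military system; the real processing pipelines are far more sophisticated), a single LADAR range return plus the sensor’s pointing angles can be converted into a 3D point, and many such points merged into the kind of 3D image Grove describes:

    # Toy illustration only: turn one LADAR range return plus pointing angles into an
    # (x, y, z) point relative to the sensor. Real systems fuse many such returns and
    # apply heavy noise-rejection algorithms to separate targets from jungle clutter.
    import math

    def ladar_return_to_point(range_m, azimuth_deg, elevation_deg):
        """Spherical-to-Cartesian conversion for a single laser range return."""
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = range_m * math.cos(el) * math.cos(az)
        y = range_m * math.cos(el) * math.sin(az)
        z = range_m * math.sin(el)
        return (x, y, z)

    # Example: a return at 150 m range, 30 degrees left of boresight, 5 degrees up.
    print(ladar_return_to_point(150.0, -30.0, 5.0))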

SENSOR WARS

Adversaries in the future are likely to acquire their own sensors, Grove said, which could in turn lead to counter-sensors, counter-counter sensors and so on. That could escalate the cost for producing new classes of sensors.

The Sensor CoI approach is to look at developing inexpensive, disposable sensors that can be programmed to do a specific task or several tasks and then be turned off or self-destruct, to avoid the chance of them or their data being intercepted, as in an urban environment. Such sensors that can detect noxious gases already exist.

Sensors will continue to proliferate and the military will increasingly find ways to use them, as will potential adversaries, Grove said. There are many promising lines of research, including leveraging biomedical imaging sensors, which are now being used in the civilian world.

ABOUT SENSOR COI

The Sensor CoI is divided into three working groups: electro-optical and infrared; acoustic, seismic and magnetic; and radio frequency (radar).

There are 17 communities of interest throughout the Department of Defense. In addition to the sensor community, there are counter-weapons of mass destruction, autonomy, space, human systems, electronic warfare, air platforms, cyber, ground and sea platforms, energy and power, advanced electronics technologies, materials and manufacturing processes, weapons technologies, C4I, counter-improvised explosive devices, engineered resilient systems and biomedical.

Grove said his community concentrates solely on battlefield surveillance and target acquisition sensors. However, the various communities collaborate and share knowledge on sensors and other overlapping interests so duplication of effort is minimized.

Story and information provided by the U.S. Army
Follow Armed with Science on Facebook and Twitter!

———-

Disclaimer: The appearance of hyperlinks does not constitute endorsement by the Department of Defense. For other than authorized activities, such as, military exchanges and Morale, Welfare and Recreation sites, the Department of Defense does not exercise any editorial control over the information you may find at these locations. Such links are provided consistent with the stated purpose of this DoD website.



from Armed with Science http://ift.tt/1Eu3nmE
