
The neuroplasticity bait-and-switch [Pharyngula]

As long as we’re talking about brains this morning, here’s another topic that irritates me: the abuse of the term neuroplasticity.


[Image: the blue-and-black cover of Marcus Jacobson’s Developmental Neurobiology]


Way, way back in the late 1970s, my first textbook in neuroscience was this one: Marcus Jacobson’s Developmental Neurobiology. (That link is to a more recent edition; the picture is of the blue-and-black cover I remember very well, having read the whole thing). I came into the field by way of developmental biology, and that means we focused on all the changes that go on in the brain: everything from early tissue formation to senescence, with discussions of synaptogenesis, remodeling, metabolism, transport, and functional responses to activity or inactivity. This is all under the broad umbrella of neuroplasticity, a term that’s at least a century old, and that is well-established as both a phenomenon and a science. That the brain modifies itself in response to experience is so thoroughly taken for granted that you can basically define neuroscience as the study of the responsiveness of neural tissue.


So I’m reading this interview with Norman Doidge, huckster of neuroplasticity, and I could not control my eyebrows, which started climbing up my forehead and felt like they were ascending the crown and considering a descent down to my neck. It’s not just that Doidge is so full of shit that it’s dribbling out his ears; it’s also the shamefully ignorant questions of the interviewer, Tim Adams. Look at this question:



One of the things that struck me, reading your books, is how entrenched our ideas of the brain’s essential fixed and unregenerative nature are. Why are those ideas so powerful?



Whoa right there. How could anyone have the idea that neuroscientists think the brain is essentially fixed and unregenerative? That’s painfully counterfactual, the precise opposite of the actual position of the field of neurobiology. Conveniently, Adams has already answered how he came by such a bogus idea: by reading Doidge’s books. That should tell you something about the worth of Doidge’s stories.



Doidge’s first book, published seven years ago, described how the principle of such healing – of the plastic brain – was becoming established fact in the laboratory through a greater understanding of ways in which circuits of neurons functioned and were created by thought. “Equipped,” Doidge wrote, “for the first time, with the tools to observe the living brain’s microscopic activities, neuroplasticians showed that the brain changes as it works. In 2000, the Nobel prize for medicine was awarded for demonstrating that, as learning occurs, the connections among nerve cells increase. The scientist behind that discovery, Eric Kandel, also showed that learning can ‘switch on’ genes that change neural structure. Hundreds of studies went on to demonstrate that mental activity is not only the product of the brain but the shaper of it.”



Christ. Wrong.


I got my Ph.D. in 1985 for studies on changing circuitry in the zebrafish spinal cord. I didn’t get the Nobel for it because the idea that synapses form and change gradually and integrate new elements of the circuit was not new or revolutionary; I was filling in details on a specific organism, working within a model of neural function that basically everyone accepts. Kandel (and Carlsson and Greengard) won the Nobel prize for a large body of work on signal transduction in the nervous system. That the brain changes as it works isn’t novel; working out the details – the specific molecules and pathways involved – was, especially since they had the potential to help address human disease. Doidge’s “think yourself better” approach doesn’t.


Here comes the bait-and-switch:



You suggest often that neuroplasticity is settled fact. That doesn’t seem to me to be the case in the medical profession and certainly not beyond it…


Within the lab, within science, within neurophysiology, neuroplasticity is established fact – nobody is challenging it.



So on the one hand, Doidge is claiming that the idea of the brain being this fixed and inflexible organ is entrenched; on the other, that neuroplasticity is an unchallenged fact. Which is it?


That the brain is capable of structural, molecular, chemical, and electrical changes in response to the environment is absolutely a fact, accepted without question by the field as a whole. The brain isn’t just a lump of phlegm in your cranium.


But what Doidge does is conflate the scientific understanding of neuroplasticity with his brand of quackery and hype. Doidge claims much more, using poorly sourced anecdotes of people curing themselves of Parkinson’s or restoring their sight by thinking and carrying out various exercises.


The medical establishment has been well aware of the capacity of the brain to repair itself for years: you do know that there are all kinds of established therapies for stroke patients, right? There is no denial of the regenerative capacity of the brain, or that it rewires itself to meet circumstances. Quacks like Doidge rely on misrepresenting the known science to make standard treatments look like a miraculous consequence of his innovative and revolutionary thinking (they aren’t), and to exaggerate the effects of his claims.


You cannot cure Parkinson’s by concentrating really hard while walking. You can learn to compensate for some of the effects of the disease. Eye exercises will not cure degenerative retinopathies. Let’s not sell false hope.


And this just infuriates me:



Yes, well I didn’t set out to do that. When I finished my first book I had come to the conclusion that many of the claims that eastern medicine was making, which led to a lot of eye-rolling among western doctors, had at least to be re-examined in the light of neuroplasticity. By the time I had finished The Brain That Changes Itself, there were significant studies, which no one disputes, which show major changes in the structure of the brain of Tibetan monks, for example, brought about through the practice of meditation. I suppose it is not really a hard sell once you have grasped that the brain is plastic, that someone who has spent 30,000 hours meditating might actually have changed the structure of their brain. I mean, a London taxi driver can change his brain by studying routes through the city for a year or two.



Aaaaargh. That’s right. No one disputes the idea that spending years doing something changes the brain. Tibetan monks get better at meditating, whatever that means; taxi drivers get familiar with travel routes; video game players get better at their games; surgeons get better at surgery with practice. It’s kind of the whole sine qua non of learning.


Doidge didn’t discover it, and it does have limitations. Don’t fall for the neuroplasticity hype — it’s promoted by charlatans who inflate its significance far beyond the fundamental utility of the concept into vast magical realms of nonsense that verge on the Secret, the ‘Law’ of Attraction, and the Power of Positive Thinking. All bunk.






from ScienceBlogs http://ift.tt/1zLcahV


Memories: trust provisionally, but verify always [Pharyngula]

Steven Novella makes an important point: memories are fluid. There’s no VCR in your head, and no tape recorder either, and memories are constructs. You remember the framework (sometimes very poorly) of a past event, and your brain builds a plausible set of details around it. When you picture Christmas at your grandmother’s house when you were 12, you don’t have a record in your head of how many logs were in the fireplace or a second by second recording of the flickering of the fire. You remember that Grandma had a fireplace, and sometimes she had logs burning in it, and maybe there was a fire that year, and your brain obligingly assembles an image for you.



Novella is talking specifically about this recent hullaballoo over Brian Williams getting a story about events in Iraq wrong — in his retelling, he placed himself in a more dangerous situation than actually occurred. To which I say…so what? A remembered event is intrinsically unreliable. What matters is whether someone persists in believing an error, or adjusts their recollection on the basis of evidence. Oh, there’s a picture of the kids around the fireplace that year, and there was no fire? OK. No big deal. Unless I insist that <AlexJonesMode>someone used high technology to edit the old polaroid in a conspiracy to false flag CO2 release from burning wood as a cause of global warming</AlexJonesMode>.


Trauma also mangles memories. My very earliest ‘memory’ is of lying in bed, and seeing my baby brother crawl out of his crib and fall and hurt himself (which puts me at about 2 or 3 years of age). It’s very vivid, and my primary emotion at the time was fear and anxiety, but my mother tells me there was no such incident — that no, my brother Jim did not fall on his head as a baby. I can accept that; I suspect that what really happened is that I imagined a terrifying scenario that impressed me so strongly that over the years, my memory of a memory of a dream assumed the status of reality.


It happens all the time. I imagine that in Williams’ case, an event that was suffused with fear and confusion and the desire to be brave and heroic was especially prone to gradual confabulation. If someone is confronted with evidence that their memories are seriously faulty, the only problem would be if there was an insistence on repeating the error, or making excuses to blame others.






from ScienceBlogs http://ift.tt/1AJvv62


Are we giving Jon Stewart a pass for his contribution to the measles outbreak? [denialism blog]

I’m glad to see clips like this from The Daily Show appropriately mocking the deluded and supposedly “educated” types who don’t vaccinate.





But have we forgotten this episode from 2005 when he allowed RFK Jr to basically spout his nonsense about vaccines without challenge?







It’s all well and good for Stewart to mock these people now. But he seems to forget that he helped contribute to this problem. Is anyone aware of an apology from Stewart for allowing this crackpot to use his megaphone? Isn’t it precisely members of the media like him who are to blame for failing to vet the claims made by guests such as these?






from ScienceBlogs http://ift.tt/16XuMS8


Stuff That Doesn’t Belong on Student Evaluations [Uncertain Principles]

This was a good week for “Chad bristles at side issues of massively reshared stories,” with the Vox and gender bias stories, and also this PBS piece urging parents to tell their kids science stories. That probably seems surprising, given what I do around here, but while I fully endorse the end of that piece, the opening section in which Wendy Thomas Russell explains why she never liked science mostly makes me think that she’s an awful person. She attributes her lack of interest in science to bad teaching, and provides a series of examples ending with:



Later, at the University of Nebraska, I was able to avoid math and science for the most part (the journalism department was kind to me). I did take one astronomy class — and was pretty excited about it! — until I realized that the teacher was a very old Japanese man whose heavy accent destroyed any chance I had at making sense of the universe.


He pronounced “star” like this: “stah-waaaah.” I barely scraped by with a C-.



Seriously? You know, there are a bunch of valid criticisms that can and should be made about uninspiring science teaching. Complaining about people’s foreign accents is not one of them. Openly mocking said accent is well over the line into Not OK.


There’s a bit of irony to this, as another of the pieces being massively reshared around the same time was this interactive chart comparing RateMyProfessor evaluations for men and women. This provides a nice illustration of biased language used in evaluating faculty – see the write-up at the NYT for examples if you can’t think of stuff on your own. Among the not-okay things cited as being mentioned more frequently for women are comments about appearance and personality (though as the NYT article notes, these are less frequent than you might think). Those are definitely on the list of things that make faculty say “I can’t believe I have to deal with this horseshit.”


Right up there on that list with “needs new clothes” is “has a thick accent.” That sort of thing always makes me roll my eyes when it turns up in student comments. But “He pronounced ‘star’ like this: ‘stah-waaaah'” goes past eye-rolling, to “This student’s evaluations should be disregarded because the student is a bigoted asshat.”


Bad instruction is a real problem in science, and there are valid criticisms to make about poor teaching turning people away from the subject. “The class was presented in a confusing manner” is perfectly valid. “The lectures were extremely abstract and boring” is an appropriate complaint. “The professor talked funny” is not. That has no place on an anonymous student evaluation form, and it’s completely inappropriate in a major media outlet.






from ScienceBlogs http://ift.tt/1FlLfek


What is a Blood Moon?


The first Blood Moon eclipse in a series of four happened on the night of April 14-15, 2014. The second one took place on the night of October 7-8, 2014. At that October Blood Moon, there was a total lunar eclipse. We in astronomy had not heard the term Blood Moon used in quite this way before this year, but now the term has become widespread in the media. The origin of the term is religious, at least according to Christian pastor John Hagee, who wrote a 2013 book about Blood Moons.


Meanwhile, both astronomers and some proponents of Christian prophecy are talking about the ongoing lunar tetrad – the series of four total lunar eclipses – which began with the total lunar eclipse on the night of April 14-15.


We at EarthSky don’t have any special knowledge about the Blood Moons of Biblical prophecy. But, since they’re moons, and since people are asking us, we wanted to provide some info. Follow the links below to learn more about Blood Moons.


What is a lunar tetrad?


Blood Moons in Biblical prophecy


Dates of Biblical prophecy Blood Moons in 2014 and 2015


How common is a tetrad of total lunar eclipses?


Why is the term Blood Moon being used to mean a full moon of a lunar tetrad?


Other times in astronomy you hear “moon” and “blood” in the same sentence.


Dates of Harvest and Hunter’s Moons in 2014 and 2015


Total lunar eclipse for the Americas on night of April 14-15



This is what a total eclipse looks like. This is the total eclipse of October 27, 2004 via Fred Espenak of NASA. Visit Fred’s page here. We astronomy writers often describe a totally eclipsed moon as appearing ‘blood red.’ Here’s why the moon turns red during a total eclipse.



What is a lunar tetrad? Both astronomers and followers of certain Christian pastors are talking about the lunar tetrad of 2014-2015. What is a tetrad? It’s four successive total lunar eclipses, with no partial lunar eclipses in between, each of which is separated from the other by six lunar months (six full moons).


Blood Moons in Biblical prophecy. We’re not experts on prophecy of any kind. But we’ll tell you what we know about the new definition for Blood Moon that has raised so many questions recently.


From what we’ve been able to gather, two Christian pastors, Mark Blitz and John Hagee, use the term Blood Moon to apply to the full moons of the ongoing tetrad – four successive total lunar eclipses, with no partial lunar eclipses in between, each of which is separated from the other by six lunar months (six full moons) – in 2014 and 2015. John Hagee appears to have popularized the term in his 2013 book Four Blood Moons: Something is About to Change.


Mark Blitz and John Hagee speak of a lunar tetrad as representing a fulfillment of Biblical prophecy. After all, the moon is supposed to turn blood red before the end times, isn’t it? As described in Joel 2:31 (Common English Bible):



The sun will be turned to darkness, and the moon to blood before the great and dreadful day of the LORD comes.



That description, by the way, describes both a total solar eclipse and total lunar eclipse. Sun turned to darkness = moon directly between the Earth and sun in a total solar eclipse. Moon turned to blood = Earth directly between the sun and moon, Earth’s shadow falling on the moon in a total lunar eclipse.



This book, published in 2013, is apparently what launched all the questions to our astronomy website about Blood Moons. We confess. We haven’t read it.



Dates of Biblical prophecy Blood Moons in 2014 and 2015. These are the dates for the ongoing tetrad – four successive total lunar eclipses – in these years.



2014:

Total lunar eclipse: April 15

Total lunar eclipse: October 8


2015:

Total lunar eclipse: April 4

Total lunar eclipse: September 28



There are a total of 8 tetrads in the 21st century (2001 to 2100). But proponents of this Biblical prophecy regard the ongoing tetrad as especially significant because it coincides with two important Jewish holidays: Passover and Tabernacles.


The April 2014 and April 2015 total lunar eclipses align with the feast of Passover. The October 2014 and September 2015 total lunar eclipses align with the feast of Tabernacles.


The Jewish calendar is a lunar calendar. In any year, it’s inevitable that a full moon should fall on or near the feasts of Passover (15 Nissan) and Tabernacles (15 Tishri). Nissan and Tishri are the first and seventh months of the Jewish calendar, respectively.


It is somewhat ironic that three of these four lunar eclipses are not visible – even in part – from Israel. The only eclipse that can be seen at all from Israel is the tail end of the September 28, 2015 eclipse, which may be observable for a short while before sunrise.


How common is a tetrad of total lunar eclipses? Depending upon the century in which you live, a lunar tetrad (four consecutive total lunar eclipses, spaced at six lunar months apart from one another) may happen fairly frequently – or not at all.


For instance, in our 21st century (2001-2100), there are a total of 8 tetrads, but in the 17th, 18th and 19th centuries, there were none at all. If we include all the centuries from the 1st century (AD 1-100) through the 21st century (2001-2100), inclusive, there are a total of 62 tetrads. The last one occurred in 2003-2004, and the next one after the 2014-2015 tetrad will happen in 2032-2033.
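
As a quick check on that six-lunar-month spacing, here is a minimal Python sketch (ours, not part of the original EarthSky article) that takes the four 2014-2015 eclipse dates listed above and compares the gap between consecutive eclipses to six mean synodic months, about 177.2 days:

```python
from datetime import date

# The four total lunar eclipses of the 2014-2015 tetrad, from the dates listed above.
eclipses = [date(2014, 4, 15), date(2014, 10, 8),
            date(2015, 4, 4), date(2015, 9, 28)]

SYNODIC_MONTH = 29.530589          # mean length of one lunation, in days
six_lunations = 6 * SYNODIC_MONTH  # about 177.2 days

for earlier, later in zip(eclipses, eclipses[1:]):
    gap = (later - earlier).days
    print(f"{earlier} -> {later}: {gap} days "
          f"(six lunar months is about {six_lunations:.1f} days)")
```

The gaps come out to 176, 178 and 177 days – each within a day or so of six lunations, which is exactly the spacing the tetrad definition describes.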


However, if we want to know which tetrads specifically fell on the Jewish feasts of Passover and Tabernacles, there appear to be a total of 8 in these 21 centuries:


1. 162-163 C.E. (Common Era)

2. 795-796 C.E.

3. 842-843 C.E.

4. 860-861 C.E.

5. 1493-1494 C.E.

6. 1949-1950 C.E.

7. 1967-1968 C.E.

8. 2014-2015 C.E.


Why is the term Blood Moon being used to mean a full moon of a lunar tetrad? We can’t really tell you why more and more people are using the term Blood Moon to describe the four full moons of a lunar tetrad. We simply don’t know.


Here’s the definition of a lunar tetrad, again: four successive total lunar eclipses, with no partial eclipses in between, each of which is separated from the other by six lunar months (six full moons). There’s no obvious reason why Blood Moon should be associated with this term.


To the best of our knowledge, however, the use of the term Blood Moon to describe a lunar tetrad is of recent origin. It might have originated with John Hagee’s 2013 book.


We’re still not sure whether Blood Moon pertains to the full moon of any tetrad, or specifically to a tetrad that coincides with the feasts of Passover and Tabernacles.


Either way, I suspect the nouveau definition of Blood Moon will gain traction as the tetrad – the four total lunar eclipses of 2014 and 2015 – unfolds.



View larger. | Hunter’s Moon collage from EarthSky Facebook friend Kausor Khan in Hyderabad, India. Notice that she chose reddish moons to depict the Hunter’s Moon. That’s because many people see the Hunter’s Moon low in the sky, and moons seen low in the sky appear reddish. In 2014, the Hunter’s Moon – sometimes called the Blood Moon – will come on October 8. It will feature the second total eclipse of the moon in the ongoing lunar tetrad. So we’ll have two reasons to call the October 8 moon a Blood Moon.



Other times in astronomy you hear “moon” and “blood” in the same sentence. The full moon nearly always appears coppery red during a total lunar eclipse. That’s because the dispersed light from all the Earth’s sunrises and sunsets falls on the face of the moon at mid-eclipse. Thus the term blood moon can be and probably is applied to any and all total lunar eclipses. It’s only in years when volcanic activity is pronounced that the moon’s face during a total lunar eclipse might appear more brownish or gray in color. Usually, the moon looks red. We astronomy writers often say it looks blood red. Why? Because it sounds dramatic, and a lunar eclipse is a dramatic natural event. Read more here: Why does the moon look red during a total lunar eclipse?


What’s more, in folklore, all the full moons have names. The names typically coincide with months of the year, or seasons. One of the most famous moon names is the Hunter’s Moon. It is the full moon immediately following the Harvest Moon, which is the full moon occurring closest to the autumnal equinox.


The Hunter’s Moon, in skylore, is also sometimes called the Blood Moon. Why? Probably because it’s a characteristic of these autumn full moons that they appear nearly full – and rise soon after sunset – for several evenings in a row. Many people see them when they are low in the sky, shortly after they’ve risen, at which time there’s more atmosphere between you and the moon than when the moon is overhead. When you see the moon low in the sky, the extra air between you and the moon makes the moon look reddish. Voila. Blood moon.


The second total lunar eclipse of the coming lunar tetrad will take place on October 8, the same night as the Hunter’s Moon. So there will be two reasons to use the term Blood Moon that night.


Dates for the Northern Hemisphere’s Harvest and Hunter’s Moons in 2014 and 2015:



2014:

Harvest Moon: September 9

Autumn Equinox: September 23

Hunter’s (Blood) Moon: October 8


2015:

Autumn Equinox: September 23

Harvest Moon: September 28

Hunter’s (Blood) Moon: October 27



Bottom line: The term Blood Moon in Biblical prophecy appears to have been popularized by two Christian pastors, Mark Blitz and John Hagee. They use the term Blood Moon to apply to the full moons of the ongoing tetrad – four successive total lunar eclipses, with no partial lunar eclipses in between, each of which is separated from the other by six lunar months (six full moons) – beginning on the night of April 14-15, 2014. The next Blood Moon eclipse comes on the night of October 7-8, 2014.


A planisphere is virtually indispensable for beginning stargazers. Order your EarthSky planisphere today.


Total lunar eclipse of Hunter’s Moon on night of October 7-8






from EarthSky http://ift.tt/1gD1Lv9


Moon and star Spica rise at late evening February 8


Tonight – February 8, 2015 – stay up late (or wake up early) to see the moon with the star Spica. And if you miss them tonight, try again over the next several nights. At mid-northern latitudes, the waning moon and Spica, the brightest star in the constellation Virgo the Maiden, climb up above the horizon a short while before the midnight hour. At southerly latitudes in the Southern Hemisphere, the moon and Spica are up by around mid-evening.


As the Earth spins eastward beneath the starry sky tonight, the moon and Spica will move upward and westward until reaching their high point for the night around 4 a.m. local time Monday morning, February 9. Before daybreak, look for the moon and the star Spica to shine in the western half of sky.


The moon rises and sets an average of 50 minutes later each day. On the other hand, the backdrop stars of the Zodiac rise and set about four minutes earlier daily. That’s why the moon and Spica shine together on the sky’s dome for only a few days of the month.


The star Spica rises and sets about one-half hour earlier with each passing week, or some two hours earlier with each passing month. So by the time April comes rolling around, Spica will be up by nightfall. That’s why we in the Northern Hemisphere associate this sparkling blue-white beauty with the season of spring!
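
To see how those numbers fit together, here is a rough back-of-the-envelope sketch in Python (an illustration of the arithmetic quoted above, not anything from EarthSky): the moon rising about 50 minutes later each day, and Spica, like any fixed star, rising about 4 minutes earlier each day.

```python
# Approximate daily shifts in rising time, as quoted above (minutes per day).
MOON_LATER_PER_DAY = 50   # the moon rises ~50 minutes later each day
STAR_EARLIER_PER_DAY = 4  # Spica rises ~4 minutes earlier each day

# The moon drifts against Spica by the sum of the two rates.
relative_drift = MOON_LATER_PER_DAY + STAR_EARLIER_PER_DAY   # ~54 minutes per day
days_between_pairings = 24 * 60 / relative_drift             # ~27 days

print(f"The moon returns to Spica roughly every {days_between_pairings:.0f} days")
print(f"Spica rises about {7 * STAR_EARLIER_PER_DAY} minutes earlier each week")
print(f"Spica rises about {30 * STAR_EARLIER_PER_DAY / 60:.0f} hours earlier each month")
```

The roughly 54-minute-per-day relative drift brings the moon back to Spica’s neighborhood about every 27 days (close to one sidereal month), while Spica’s own four-minute daily drift adds up to the half-hour-per-week and two-hours-per-month figures mentioned above.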


Enjoying EarthSky so far? Sign up for our free daily newsletter today!


Live by the moon with your 2015 EarthSky lunar calendar!



The moon rises about 50 minutes later each day. That’s why you’ll see the moon sweep past Spica over the nights of February 8 and 9, 2015.




The star Spica is also called Alpha Virginis, because it is the brightest star in Virgo. Its color is distinctly blue-white, even to the eye. Photo by Fred Espenak. Read more about Spica at Fred Espenak’s website.



Bottom line: Either late this evening – February 8, 2015 – or in the wee hours before sunrise on February 9, see the waning moon paired up with Spica, the brightest star in the constellation Virgo.


Donate: Your support means the world to us






from EarthSky http://ift.tt/1yVUdKH


No One Needs A Moral Philosophy [EvolutionBlog]

Here’s something that happened this week: David Brooks wrote a bad column about secularism. In fairness, it gets off to a decent start:



Over the past few years, there has been a sharp rise in the number of people who are atheist, agnostic or without religious affiliation. A fifth of all adults and a third of the youngest adults fit into this category.


As secularism becomes more prominent and self-confident, its spokesmen have more insistently argued that secularism should not be seen as an absence — as a lack of faith — but rather as a positive moral creed. Phil Zuckerman, a Pitzer College sociologist, makes this case as fluidly and pleasurably as anybody in his book, “Living the Secular Life.”


Zuckerman argues that secular morality is built around individual reason, individual choice and individual responsibility. Instead of relying on some eye in the sky to tell them what to do, secular people reason their way to proper conduct.


Secular people, he argues, value autonomy over groupthink. They deepen their attachment to this world instead of focusing on a next one. They may not be articulate about why they behave as they do, he argues, but they try their best to follow the Golden Rule, to be considerate and empathetic toward others. “Secular morality hinges upon little else than not harming others and helping those in need,” Zuckerman writes.



Having enjoyed Zuckerman’s previous book Society Without God, I’m looking forward to reading this one as well. If Brooks had stopped here, I would have thought it a pretty good column. The trouble is that from here the column rattles off a series of bullet points that are meant to convince us that it is terribly difficult to live without religion. All of them deserve a vigorous response, but to keep this to a reasonable length I will just focus on one:



Secular individuals have to build their own moral philosophies. Religious people inherit creeds that have evolved over centuries. Autonomous secular people are called upon to settle on their own individual sacred convictions.



If Richard Dawkins had written those first two sentences he would have been accused of indulging in caricatures of religious people. He would be charged with making religious people look like thoughtless zombies, just mindlessly adhering to moral principles other folks had taught them.


Let’s see if we can enumerate everything that’s wrong with Brooks’s argument.


First, religious people no less than secular people have to work out their convictions for themselves. Blindly following the dictates of authority figures is fine for children, but we mostly expect adults to be more thoughtful than that. Deciding that one should follow the teachings of clerics and holy texts is as fateful a decision as anything secularists face.


Second, to the extent that religious folks can be said to inherit moral creeds, those creeds are frequently very poor ones. They are often based on very dubious notions about “natural law” and can often lead to very bad consequences. To go for the low-hanging fruit, do I need to remind you that slavery was routinely defended in explicitly religious terms? Things are scarcely different today, where the most repressive and immoral views about women and homosexuals are promoted by religion. You might retort that many Christians hold far more sensible views on those subjects than do the extremists, but that only supports my argument. They hold those views precisely because they bring their own ideas and values to the discussion, and don’t just rely on the teachings of their religious authorities.


Third, secular people, no less than religious people, can tap into a long tradition of thought and argument. It’s not as though secular morality is some new thing. Many great philosophers have weighed in on the subject, and their thoughts are far more deserving of serious consideration than anything found in the world’s holy texts, or anything shrieked from a pulpit on Sunday morning.


Finally, and most significantly in my view, no one needs a moral philosophy. True moral dilemmas are exceedingly rare, and most people go their whole lives without ever encountering one in their day-to-day experience. Yet most people also manage to go their whole lives without behaving badly, or at least with no more than the trivial sort of badness we all sometimes engage in. Even when we do behave badly, it is almost never because we are confused about the right thing to do.


How many people, though, have really thought seriously about the differences between consequentialism, virtue ethics, deontology, and the divine command theory? Somehow we all manage to know right from wrong without consulting the works of moral philosophers. That is because most people just regard it as obvious, even axiomatic, that you shouldn’t hurt other people without an awfully good reason, or that we should have some empathy for other people. Most moral questions can be resolved with a few basic principles nearly everyone accepts instinctively, or perhaps because we learned them at a very young age. Asked to explain why murder is wrong, most people will just reply with a funny look, as though it reflects poorly on a person even to raise the question. That reply is entirely appropriate in my opinion.


The sheer ubiquity and cultural force of religion can make it difficult to live the secular life, in some parts of the country more than others. In practice, it can sometimes be difficult to live without religion simply because you might be seriously limiting your social options by doing so. But finding one’s way to right moral thinking is not one of the problems secular people face. The reality is precisely the opposite, in fact. Clear thinking about morality cannot begin until you shake off antiquated and harmful religious notions.






from ScienceBlogs http://ift.tt/1ztlLLi

