Teaching critical thinking to combat fake news and bullshit: You have to start young [Respectful Insolence]


As much as I like to deconstruct pseudoscientific claims, particularly about health, medicine, and health care, sometimes it gets a bit draining. There’s just so much pseudoscience, so much credulity, so much sheer idiocy out there that trying to refute it all and encourage a more skeptical mindset often feels like pissing into the ocean, for all the effect it has. In the age of fake news and Donald Trump, it even feels as though we’re going backward, and not slowly, either. That’s why I felt it was time for a bit of a break, a bit more optimism than I’ve usually been able to muster. So it was a good thing that I happened across an article by Julia Belluz and Alvin Chang entitled This researcher may have discovered the antidote to health bullshit.

I’ve always suspected that the key to combatting bullshit of all stripes, be it related to health (one of the more dangerous varieties), conspiracy theories, or any of the many other forms of nonsense, pseudoscience, and quackery, will be starting young. Let’s face it: by the time we’re adults, changing the way we think is very difficult. That’s not to say that it can’t be done, but in general you have to want to change. And, again, let’s face it: most people don’t want to change. They resist it. That’s part of the reason adults are so prone to motivated reasoning, becoming very good at finding observations and evidence that support their preexisting point of view while downplaying or discounting evidence that does not. In essence, they develop only part of the skill set needed to be a skeptic: they become very good at deconstructing ideas they disagree with but remain not so good at critically examining the ideas they agree with that might lack good evidence to support them.

Yes, as Andy Oxman shows, you have to get ’em young. He relates a story about a visit to his then-10-year-old son’s class in 2000 that shows how children actually have a proclivity for becoming skeptical:

“I told them that some teenagers had discovered that red M&Ms gave them a good feeling in their body and helped them write and draw more quickly,” Oxman said. “But there also were some bad effects: a little pain in their stomach, and they got dizzy if they stood up quickly.”

He challenged the kids to try to find out if the teens were right. He split the class into small groups and gave each group a bag of M&Ms.

The children quickly figured out they had to try eating M&Ms of different colors to find out what happens, but that it wouldn’t be a fair test if they could see the color of the M&Ms. In other words, they intuitively understood the concept of “blinding” in a clinical trial. (This is when researchers prevent study participants and doctors from knowing who got what treatment so they’re less likely to be biased about the outcome.)

In a short time, they were running their own blinded, randomized trials — the gold standard for testing medical claims — in the classroom. By the end of their experiment, Oxman said, “They figured out that there was little if any difference in the effects of the different colors and they asked me if the teenagers who made the claim really believed that.”
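The design the kids hit on is easy to make concrete. Below is a minimal sketch in Python (all the numbers are invented, and this is not a reconstruction of Oxman’s actual classroom exercise) of a blinded, randomized comparison: assign the colors at random, keep the taster from knowing which color they got, and only then compare the groups.

```python
# Toy sketch of a blinded, randomized comparison, NOT Oxman's actual exercise.
# All data are invented; under the null hypothesis, color makes no difference.
import random
import statistics

random.seed(0)

def run_blinded_mm_trial(n_per_group=15):
    """Simulate a blinded, randomized comparison of red vs. other-colored M&Ms."""
    assignments = ["red"] * n_per_group + ["other"] * n_per_group
    random.shuffle(assignments)  # randomization: chance, not choice, decides who gets what
    scores = {"red": [], "other": []}
    for color in assignments:
        # Blinding: the taster never learns `color`, so expectations about red
        # M&Ms can't nudge the "drawing speed" score they report.
        reported_speed = random.gauss(10, 2)
        scores[color].append(reported_speed)
    return {color: round(statistics.mean(vals), 1) for color, vals in scores.items()}

print(run_blinded_mm_trial())
# e.g. {'red': 10.2, 'other': 9.9} -- little if any difference, just as the class found
```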

As a result of this experience, Oxman began working with other academics to develop curricula designed to teach critical thinking skills to children. The idea is to arm them with the skills needed to spot “alternative facts” more readily and, just as importantly, to keep fake news from spreading in the first place. He’s not the only one working on this problem. So are Imogen Evans, Hazel Thornton, Iain Chalmers, and Paul Glasziou, who wrote a book called Testing Treatments that’s available for free as a downloadable PDF. In 2012, Oxman teamed up with one of the book’s authors, Sir Iain Chalmers, asking him to help develop curricula, based on the concepts in the book, appropriate for primary school children.

Because Oxman had ties with researchers in Uganda, he tested the new materials there on children between 10 and 12 years of age. Personally, when I first saw that, I thought that ten years old is a bit late to start the process; I would have thought that age eight or even six would be the time to start. Be that as it may, Oxman ran a randomized trial in which a representative sample of eligible schools was randomized either to control (no change in curriculum) or to inclusion of the Informed Health Choices (as the program came to be known) primary school resources (textbooks, exercise books, and a teachers’ guide) in the lesson plan. Teachers teaching the Informed Health Choices curriculum attended a two-day introductory workshop and gave nine 80-minute lessons during one school term. The lessons addressed 12 concepts essential to assessing claims about treatment effects and making informed health choices.

Twelve key concepts were emphasized:

Claims

  • Treatments might be harmful
  • Personal experiences or anecdotes (stories) are an unreliable basis for assessing the effects of most treatments
  • Widely used treatments or treatments that have been used for a long time are not necessarily beneficial or safe
  • New, brand-named, or more expensive treatments may not be better than available alternatives
  • Opinions of experts or authorities do not alone provide a reliable basis for deciding on the benefits and harms of treatments
  • Conflicting interests may result in misleading claims about the effects of treatments

Comparisons

  • Evaluating the effects of treatments requires appropriate comparisons
  • Apart from the treatments being compared, the comparison groups need to be similar (ie, “like needs to be compared with like”)
  • If possible, people should not know which of the treatments being compared they are receiving
  • Small studies in which few outcome events occur are usually not informative and the results may be misleading
  • The results of single comparisons of treatments can be misleading

Choices

  • Treatments usually have beneficial and harmful effects

These are indeed key concepts that any medical skeptic needs to know and understand. I’ve emphasized pretty much all of them at one time or another over the last 12 years. Heck, we even have names for some of them, such as the appeal to antiquity, the fallacy of assuming that a treatment must be better simply because it is old. After all, many of these “ancient” remedies date back to times when medicine was anything but scientific and the very basics of what causes disease were not understood; instead, diseases were attributed to “imbalances” in humors, “bad air,” or even the intervention of malign spirits. Any such remedies that did work were basically discovered by sheer accident, and the ancient herbal remedies that work have mostly already been picked over and turned into purified drugs.

The result was a publication in The Lancet. At the end of the school term, students at the control schools and students at the Informed Health Choices schools took a 24-question multiple choice test on the 12 concepts (two questions per concept), and their scores were compared. The results were striking:

The average score for children in the intervention schools was 62·4% (SD 18·8) compared with 43·1% (15·2) in the control schools. The adjusted mean difference (based on the regression analysis) was 20·0% (95% CI 17·3–22·7; p<0·00001) higher in the intervention than in the control group. Appendix 1 shows the distribution of test scores. In the intervention schools, 3967 (69%) of 5753 children had a passing score (≥13 of 24 correct answers), compared with 1186 (27%) of 4430 in the control schools (table 2). The adjusted difference (based on the odds ratio from the logistic regression analysis) was 50% more children who passed (95% CI 44–55; p<0·00001) in the intervention than in the control group.
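For those who want to see where those percentages come from, the quoted counts can be checked with a few lines of arithmetic. This is my own unadjusted calculation, not the paper’s analysis, which uses regression to adjust for clustering by school:

```python
# Unadjusted sanity check of the counts quoted above; this is NOT the paper's
# cluster-adjusted regression analysis, just simple arithmetic on the raw numbers.
intervention_pass, intervention_n = 3967, 5753
control_pass, control_n = 1186, 4430

p_int = intervention_pass / intervention_n  # ~0.690, matching the reported 69%
p_ctl = control_pass / control_n            # ~0.268, matching the reported 27%

print(f"Intervention pass rate: {p_int:.1%}")          # 69.0%
print(f"Control pass rate:      {p_ctl:.1%}")          # 26.8%
print(f"Crude difference:       {p_int - p_ctl:.1%}")  # ~42 percentage points
# The quoted "50% more children who passed" is the adjusted difference derived
# from the logistic-regression odds ratio, which is why it differs from this crude gap.
```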

The authors note:

Use of the Informed Health Choices primary school resources had a large effect on the ability of primary school children in Uganda to assess claims about treatment effects. This effect was larger for children with better reading skills, but the intervention was effective for children lacking basic reading skills, as well as for children with basic or advanced reading skills. This effect was achieved even though the learning materials and the tests were in English, which was not the children’s first language. Based on findings from pilot testing both the resources and the test used to measure the outcomes, we were surprised by the size of the effect, which is also large in comparison to other education interventions in primary schools in low-income and middle-income countries,20 and other interventions to teach critical thinking for all ages in high-income countries.11 In addition, the intervention had a positive effect on the children’s intended behaviours and the teachers’ mastery of the key concepts.

Overall, about one-fifth of the children in the intervention schools achieved a test score indicating that they had mastered the key concepts (at least 20 of the 24 questions correct), while less than 1% of the children in the control schools scored that high. This is quite an effect.

The authors do acknowledge weaknesses in their study. One thing I tend to question is whether the questions on the multiple choice test actually measure what the authors say they do. In other words, we don’t know whether the children merely learned how to answer test questions and, at the end of the teaching, still couldn’t apply the concepts they had learned to real life. We also have no way of knowing what the long-term effects of these interventions are and whether they will actually have measurable effects on the health choices these participants make when they are adults. I also hate to be the pessimist, but I find it disappointing that, even after this intervention, only one-fifth of the children mastered the key concepts and the average child who underwent the teaching still got more than a third of the questions wrong.

As much as there is an emphasis on starting to learn critical thinking at a young age, all is not lost for us old farts:

Separately, the researchers also created a podcast on critical thinking concepts for parents, and tested that approach in another randomized controlled trial, also published in the Lancet. They were successful here as well: Nearly twice as many parents who listened to the podcast series passed a test on their understanding of key health concepts compared with parents in the control group.

Or, in a bit more detail:

We recruited parents between July 21, 2016, and Oct 7, 2016. We randomly assigned 675 parents to the podcast group (n=334) or the public service announcement group (n=341); 561 (83%) participants completed follow-up. The mean score for parents in the podcast group was 67·8% (SD 19·6) compared with 52·4% (17·6) in the control group (adjusted mean difference 15·5%, 95% CI 12·5–18·6; p<0·0001). In the podcast group, 203 (71%) of 288 parents had a predetermined passing score (≥11 of 18 correct answers) compared with 103 (38%) of 273 parents in the control group (adjusted difference 34%, 95% CI 26–41; p<0·0001). No adverse events were reported.
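The same kind of back-of-the-envelope check works here (again, my own arithmetic on the quoted counts, not the trial’s adjusted analysis), and it shows where “nearly twice as many parents” comes from:

```python
# Unadjusted check of the parents' trial counts quoted above (not the trial's
# own adjusted analysis).
podcast_pass, podcast_n = 203, 288
control_pass, control_n = 103, 273

p_podcast = podcast_pass / podcast_n  # ~0.705 (reported as 71%)
p_control = control_pass / control_n  # ~0.377 (reported as 38%)

print(f"Podcast pass rate: {p_podcast:.1%}")
print(f"Control pass rate: {p_control:.1%}")
print(f"Ratio: {p_podcast / p_control:.2f}x")  # ~1.87x, i.e. nearly twice as many passed
```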

Studies like these, for all the messiness and shortcomings that are unavoidable in carrying out this kind of research, give me hope. We human beings have cognitive wiring that leaves us prone to making all sorts of incorrect inferences and latching on to all sorts of pseudoscientific beliefs, particularly about health. Critical thinking, while second nature in some areas (economics, for example, as when buying a used car), is not natural to most humans, particularly when it comes to health claims, where anecdotes can profoundly mislead and we are very quick to confuse correlation with causation. It can be taught, however. The problem is that there has to be the will and the resources to teach it, as well as a large enough core of motivated teachers trained to do it. Then there’s the issue of competing for time with the existing subjects that have to be taught. I can’t help but wonder whether, rather than teaching critical thinking as a module separate from other subjects, it would be more effective to find a way to weave it into all subjects: especially science, math, and history, but by no means limited to them, and extending to writing, literature, and the other humanities.

However it’s done, the need is acute. 2016 was the year fake news and bullshit appeared to have reached a tipping point. The need to be able to recognize and combat it is more acute than ever, and it’s never too early to start inculcating critical thinking skills into our children, who will need them more than our generation ever did.



from ScienceBlogs http://ift.tt/2q60nio
