Another study shows just how hard it is to change beliefs in antivaccine misinformation [Respectful Insolence]


One of the reasons I’m so passionate about pushing back against antivaccine pseudoscience is that I view it as an extreme threat to public health, particularly the health of children. I’m a history buff, and I know what child mortality was like before vaccines. I’m also a scientist, which is why I know that antivaccine claims and arguments are either misinformation, pseudoscience, utter nonsense, or a combination of the three. Vaccines are safe and effective, and there’s no scientific evidence that is even the least bit convincing that they cause autism, the main fear of the antivaccine movement. They don’t cause autoimmune diseases, either.

One of the biggest questions of the last couple of decades is how to combat antivaccine misinformation. Although there has been great disagreement and many suggested strategies, one strategy has been to try to educate parents about the dangers of not vaccinating. It’s a reasonable strategy on its face, as vaccines have been a victim of their own success. Most parents have never seen a case of the various deadly vaccine-preventable diseases against which we vaccinate. As a result, very few parents have any idea how bad the childhood diseases commonly vaccinated against can be. Antivaxers take advantage of that by playing up (translation: lying about) the risks of vaccines. Because these diseases are uncommon now (thanks to vaccination), it’s easy to perceive them as no real danger, which is what makes the antivaccine ploy seem reasonable on the surface. Worse, education runs into the “backfire effect,” in which providing disconfirming evidence to people with a strong belief often causes them to double down on that belief, making it stronger. Being human, we have lizard brains that have a hard time taking seriously a threat we don’t see and haven’t observed affecting us or those we know directly. On the other hand, those same lizard brains are easily activated by “threats” based on the unknown, such as “toxins” in vaccines.

Unfortunately, this strategy doesn’t appear to work, either.

Indeed, if a recently published study is correct, practically nothing works. The study, out of the University of Edinburgh, tested three common pro-vaccination strategies: one contrasting myths vs. facts, one employing fact and icon boxes, and one showing images of non-vaccinated sick children. Basically, none of them worked; in fact, they often backfired, reinforcing preexisting beliefs. The authors note the barriers to overcome in the introduction:

Vaccines are the safest and most effective tools for preventing infectious diseases and their success in achieving relevant public health outcomes, such as the reduction or eradication of many life-threatening conditions, is well-established. However, many people appear hesitant about vaccines, doubting their benefits, worrying over their safety or questioning the need for them. Addressing vaccine hesitancy, defined as a “delay in acceptance or refusal of vaccines despite availability of vaccination services” ([1], p575), is not a simple task for the following reasons. First, vaccine hesitancy is rooted in a set of cognitive mechanisms that conspire to render misinformation particularly “sticky” and pro-vaccination beliefs counter-intuitive [2], involving a multitude of emotional, social, cultural, and political factors [3]. Second, public information campaigns designed to dispel erroneous vaccination beliefs often overlook these factors and have limited or even unintended opposite effects [4, 5]. Furthermore, even when attempts to correct invalid information do not “backfire” by entrenching the original misinformation [6], they can frequently fail because people cannot successfully update their memories and still fall back on information they know is not correct in order to make inferences and explain events.

Another reason, of course, is that we are storytelling apes. We find stories (i.e., anecdotes from people we know or have come to trust) far more compelling than scientific evidence or pictures of sick children with no connection to us. This has been the problem with combating pseudoscience from the very beginning. Even though anecdotes (which are often distorted versions of what really happened, tarted up to strengthen the perceived association between, for example, vaccines and autism) are the weakest form of evidence, our primitive brains consider them the most powerful. And, as mentioned above, our brains latch on to information, particularly if it reinforces preexisting beliefs, and won’t let go:

Classical laboratory research on memory for inferences [10, 6] demonstrates that the continued reliance on discredited information is very difficult to correct. Even when people clearly remember and understand a subsequent correction when asked about it immediately (suggesting that they have encoded it and can retrieve and potentially comply with it), they can still be influenced by the effect of the retracted misinformation. That is, people are susceptible to misinformation even though they had acknowledged that the information at hand is factually incorrect. As Rapp and Braasch stated ([11], p3), “the problem is not just that people rely on inaccurate information but that they rely on it when they seemingly should know better”. This seemingly irrational reliance on outright misinformation has been demonstrated with beliefs related to well-known material (e.g., biblical narratives [12, 13]), blatant hoaxes (e.g., paranormal claims [14]) or personally experienced events (e.g., distorted eyewitness testimonies [15]). It also occurs despite measures intended to make the presentation of information clearer and despite explicit warnings about the misleading nature of the information at hand [16, 17]. Therefore, simply retracting a piece of information does not stop its influence because outdated pieces of information linger in memory. In the case of vaccines, providing evidence about the safety of immunisation may not be enough as people may have heard or read somewhere that, for example, vaccines are not necessary, that they cause autism or contain dangerous chemicals. This false information persists in their minds.

The authors then present the three techniques. Presenting “myth vs. fact”-style information seems as though it would be effective, but it turns out that this strategy often backfires, with people remembering the myths more than the disconfirming information. While some studies suggest that this can be an effective strategy, others have observed a backfire effect. For instance, one study showed that some people can misremember myths as facts after as little as 30 minutes, and by three days many people have forgotten which is myth and which is fact.

The second strategy is to put the information in visual form, using well-designed graphs and graphics that can attract and hold people’s attention. One example is the “fact box,” a table for transparent risk communication that summarizes the scientific evidence for a drug, treatment, or screening method in an easily understandable manner. Fact boxes often “show benefits and harms for people with and without treatment in plain frequencies, avoiding misleading statistics or statements of risk that may be misunderstood by laypeople.” Another example is the “icon box,” a visual tool showing two groups of individuals, those who underwent a treatment and those who didn’t, with each person represented by an icon indicating benefit or harm. (These are frequently used to describe the benefits and risks of cancer screening tests, for instance.) The authors note that there has been relatively little research on the use of fact boxes as a tool for informing the general public.
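To make the icon-box idea concrete, here’s a minimal sketch of how such a display could be generated, rendered as plain text rather than graphics. The group sizes, outcome counts, and icons are hypothetical, not taken from the study or from any real fact box:

```python
# Minimal sketch of a text-based "icon box" (all numbers are hypothetical).
# Each person in a group of 50 is drawn as one icon:
#   '+' = benefited, 'x' = harmed, '.' = neither.

def icon_row(benefit: int, harm: int, total: int = 50) -> str:
    """Render one group as a row of per-person icons."""
    return "+" * benefit + "x" * harm + "." * (total - benefit - harm)

# Hypothetical outcomes per 50 people, with and without treatment.
groups = {
    "Treated:  ": {"benefit": 45, "harm": 2},
    "Untreated:": {"benefit": 20, "harm": 10},
}

for label, counts in groups.items():
    print(label, icon_row(counts["benefit"], counts["harm"]))
```

The point of the format is that each icon is a person, so the reader compares raw frequencies between the two groups rather than relative risks or other statistics that laypeople tend to misread.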

Finally, the authors note that a third strategy is to harness the power of fear and emotion, with messages and images that show the risks from the diseases and the consequences of not vaccinating. Although there is evidence that these appeals to fear can work, especially for promoting positive action and as long as those seeing the message fear the outcomes and believe that the action being promoted will avert the threat, there’s also evidence that this, too, can backfire. The authors cite studies on cigarette labeling with “resonant texts and vivid pictures” (often, in my experience, pictures of people with tracheostomies and half their faces eaten away by smoking-caused head and neck cancer, or photos of diseased lungs), which often don’t motivate smokers to quit. They also cite a 2014 study by Nyhan et al., which showed that images of sick children paired with scary messages about the consequences of not vaccinating also do not work.

The authors of this study recruited participants from “diverse departments of the University of Edinburgh, the Suor Orsola Benincasa University of Naples, and the Second University of Naples, resulting in an initial sample of 134 individuals.” The study was conducted in two “waves.” The first wave included the preliminary survey and 134 subjects; roughly 10% dropped out, and 120 completed the second wave. Among those, 47 (39.2%) were men and 73 (60.8%) were women. Mean age was 25.35 years (SD 3.52, range 19–34). Most participants had a Bachelor’s (n = 46, 38.3%) or a Master’s degree (n = 63, 52.5%), while 11 respondents (9.2%) were PhD students. Of course, I can see a couple of problems with the study right here. First, the numbers are relatively small. Second, it’s hard to view this sample as representative. It’s full of young people, and we have no idea how many of them have children, for example. They’re also all from university settings; i.e., they are a “convenience” sample rather than the population that actually needs to be targeted.

All participants completed two questionnaires. The first was a preliminary survey, used in previous studies, to assess baseline beliefs and attitudes towards vaccines. It consisted of eight items that “covered common attitudes from both the pro- (e.g., ‘Getting vaccines is a good way to protect my future child(ren) from disease’) and the anti-vaccination side (e.g., ‘Some vaccines cause autism in healthy children’).” The second questionnaire, also used in previous studies, was a post-manipulation survey that assessed whether and how participants’ beliefs and attitudes towards vaccines changed compared to the baseline measure. This survey was administered twice, immediately after the interventions (Time 1) and then after a seven-day delay to evaluate the longevity and robustness of the observed effects (Time 2). The authors compared the follow-up surveys to baseline to produce what they called a change score and looked for interactions between intervention and change in attitude. A positive change score means more belief in the antivaccine myths being examined; a negative change score means less.
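As I read the design, the change score is simply the post-intervention rating minus the baseline rating on each item. Here’s a minimal sketch of that bookkeeping; the item names, the rating scale, and the numbers are hypothetical stand-ins, not the study’s actual data:

```python
# Minimal sketch of computing attitude "change scores" for one participant.
# Item names, scale, and ratings are hypothetical, not from the study.

from statistics import mean

# Agreement ratings (say, on a 1-7 scale) with antivaccine statements such as
# "Some vaccines cause autism in healthy children."
baseline = {"autism_link": 2, "side_effects": 3, "hesitancy": 2}
time1    = {"autism_link": 3, "side_effects": 4, "hesitancy": 2}  # right after the intervention
time2    = {"autism_link": 4, "side_effects": 5, "hesitancy": 3}  # one week later

def change_scores(post: dict, pre: dict) -> dict:
    """Positive = stronger antivaccine belief than baseline; negative = weaker."""
    return {item: post[item] - pre[item] for item in pre}

print(change_scores(time1, baseline))  # {'autism_link': 1, 'side_effects': 1, 'hesitancy': 0}
print(change_scores(time2, baseline))  # {'autism_link': 2, 'side_effects': 2, 'hesitancy': 1}
print("mean shift at one week:", round(mean(change_scores(time2, baseline).values()), 2))
```

In the study itself, these per-item shifts were then compared across the three intervention groups at both time points, which is where the backfire effects show up.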

Basically, the results of the study can be summed up in one graph, in Figure 3 (click to embiggen):

The first thing you notice is that none of the changes are negative. All of the bars show either no change or a positive change, which is bad: it means more acceptance of the specific negative attitudes towards vaccines tested, such as the belief that vaccines cause autism, the belief that they produce horrible side effects, and vaccine hesitancy itself. Particularly striking is how the fear correction produced the largest increase in fear of vaccine side effects of any intervention. This is most dramatic when the results are broken down by the two time points, with the fear correction producing a dramatic increase in the belief in serious vaccine side effects at one week (click to embiggen):

The authors summarize their observations thusly:

Our study provided further support to the growing literature showing how corrective information may have unexpected and even counter-productive results. Specifically, we found that the myths vs. facts format, at odds with its aims, induced stronger beliefs in the vaccine/autism link and in vaccines side effects over time, lending credit to the literature showing that countering false information in ways that repeat it may further contribute to its dissemination [25]. Also the exposure to fear appeals through images of sick children led to more increased misperceptions about vaccines causing autism. Moreover, this corrective strategy induced the strongest beliefs in vaccines side effects, highlighting the negative consequences of using loss-framed messages and fear appeals to promote preventive health behaviours [45, 38]. Our findings also suggest that no corrective strategy was useful in enhancing vaccination intention. Compared to the other techniques, the usage of fact/icon boxes resulted in less damage but did not bring any effective result.

They further suggest:

Presumably, a golden strategy capable of overcoming all the intricacies of setting people straight, regardless of their basic beliefs and/or temporal shifts, does not exist. Public information campaigns may instead benefit from tailoring different, simultaneous, and frequent interventions to increase the likelihood of corrective messages’ dissemination and acceptance [40]. Ideally, corrective strategies should be directed at the precise factors that may influence vaccination decision-making and impede vaccine uptake, which include, over and beyond strong attitudes against vaccines, social norms pushing individuals to conform to the majority’s behaviour, standards for vaccine uptake in a specific population, and structural barriers to vaccination such as potential financial costs of vaccines and their ease of access. Successful interventions should therefore be targeted to differently “driven” vaccine-hesitant individuals. For instance, when people do not vaccinate because they lack confidence in vaccines, corrective strategies should dispel vaccination myths, or when people do not vaccinate because perceived risks outweigh benefits, interventions should emphasize the social benefit deriving from vaccination and add incentives [41]. However, the inter-relationship of multi-level factors which contribute to vaccine hesitancy seems somewhat difficult to disentangle in order to make such targeted approach successful; indeed, the independent and relative impact of each determinant of vaccination choice is complex and context-specific, varying across time, place, and vaccines [3]. What is clear, though, is the urgent need for appropriately designed, well-executed, and rigorously evaluated interventions to address parental vaccine refusal and hesitancy [53].

The findings of this study aren’t really anything new; they basically reinforce a growing body of evidence that it is much, much harder to change people’s minds about beliefs in which they have an emotional investment. Unfortunately, this study was too small to look at something that would be really interesting to know. As we all know, antivaccine beliefs exist on a spectrum, from mildly vaccine-hesitant to full-blown nutty antivaxers like Kent Heckenlively. It’s incredibly unlikely that any strategy will change the mind of someone like Heckenlively, but it is quite possible that one or a combination of the strategies studied in this report could be effective in persuading parents with much lower levels of fear and loathing of vaccines. We know that information can persuade some people; the problem is that it is a much smaller percentage than is desirable, and the price is that the information will backfire in some.

I like to joke about how every study concludes that “more research is needed,” but if ever there were an area where more research is needed, it is how to persuade the vaccine hesitant of the safety and efficacy of vaccines and, of course, how to counter the message of the hard-core antivaxers that results in that vaccine hesitancy.



from ScienceBlogs http://ift.tt/2vmksR1
