Tricking People Into Changing Their Political Opinions?

Choice blindness, eh?

The experiment relies on a phenomenon known as choice blindness, discovered in 2005 by a team of Swedish researchers. They presented participants with two photos of faces, asked them to choose the one they found more attractive, and then handed them that photo. Using a sleight-of-hand trick borrowed from stage magic, the photos were switched, so participants actually received the face they had not chosen, i.e., the less attractive one. Remarkably, most participants accepted the switched photo as their own choice and proceeded to give arguments for why they had picked that face in the first place. This revealed a striking mismatch between the choices we make and the reasons we give for them: people readily confabulate justifications for a choice they never made. The finding has since been replicated in various domains, including the taste of jam, financial decisions, and eyewitness testimony.

While it is remarkable that people can be fooled in the moment into defending a photo or a jam they did not choose, we wondered whether this kind of false feedback could be used to alter political beliefs in a way that would stand the test of time.

In our experiment, we first gave participants false feedback about their choices, but this time concerning actual political questions (e.g., climate taxes on consumer goods). Participants were then asked to state their views a second time later that same day, and again one week later. The results were striking: participants' responses shifted considerably in the direction of the manipulation. For instance, those who had originally favoured higher taxes were more likely to become undecided or even opposed to them.
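For readers who want the mechanics of the paradigm spelled out, here is a minimal, purely illustrative Python sketch of a false-feedback trial sequence. The statements, the 1-9 rating scale, the 50% manipulation rate, the 30% detection rate, and the fixed attitude shift are all assumptions invented for illustration, not the study's actual materials or analysis model.

```python
import random

# Hypothetical statements; the real study used its own political items.
STATEMENTS = [
    "Raise climate taxes on consumer goods",
    "Increase fuel taxes on air travel",
]

def false_feedback(rating, manipulate):
    """Return the rating shown back to the participant.

    Ratings are on a 1-9 scale (1 = strongly oppose, 9 = strongly favour).
    On manipulated trials the answer is mirrored around the midpoint,
    so a participant who answered 8 is told they answered 2.
    """
    return 10 - rating if manipulate else rating

def simulate_participant(ratings, detect_prob=0.3, shift=2):
    """Crude attitude model (an assumption, not the authors' model):
    manipulations that go undetected pull the later rating toward the
    false feedback by a fixed amount."""
    later = {}
    for statement, rating in ratings.items():
        manipulated = random.random() < 0.5          # half the trials are switched
        feedback = false_feedback(rating, manipulated)
        detected = manipulated and random.random() < detect_prob
        if manipulated and not detected and feedback != rating:
            direction = 1 if feedback > rating else -1
            later[statement] = max(1, min(9, rating + direction * shift))
        else:
            later[statement] = rating
    return later

initial = {s: random.randint(1, 9) for s in STATEMENTS}
print("initial:", initial)
print("one week later:", simulate_participant(initial))
```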

2 comments:

Douglas2/Unknown said...

This fails my "can I get to the actual paper presenting the research in three clicks from the magazine article?" test: the Scientific American article does contain what purports to be that link, but it is broken.

My theory is that if the author of the article you are reading is misrepresenting the conclusions or strength of some research, he/she will neither link directly to the study nor link to any other source that does.

That said, Pärnamets has quite a collection of interesting publications on Choice Blindness, including several on changing political attitudes, from his time as a member of the Swedish team that discovered the concept. And while people do overstate the strength and applicability of their own research when trying to promote it, they can't go too far and still be credible in their field.

I know I frustrate my spouse because she will say "I know you are strongly on the 'X' side of this issue" and I will then proceed to relay all the strengths of the opposing position, where people on the "X" side are most likely to use straw-man arguments, and so on. I guess I've noticed that she absorbs the viewpoints of the strongest-willed people around her, and I don't want to be married to a wife who is bullied into agreeing with her husband's views.

I'd like to think I'd be immune to the subterfuge used in the research, but who knows whether that would be the case. If more people were forced during their education to advocate for both sides of a proposition, I think there would be less polarization, as people would put some effort into finding something other than evil intent in the positions held by those outside their own "tribe". Perhaps, like Krugman, we all think we could pass the ideological Turing test while our opponents would look ridiculous.

http://coyoteblog.com/coyote_blog/2018/07/the-ideological-turing-test-how-to-be-less-wrong.html

Grim said...

It's less important that it's true than that they think that it's true. If they think it's true, they're going to try it. Hold onto your hat.