"And that people are so invested in these lies that they can't recalibrate their opinions."
This goes back to a comment I dropped in another article you wrote about a year back. The book is Superforecasting and the reference in there is to a study on experts and their predictions. Two sets of experts were asked to predict an outcome. O…
"And that people are so invested in these lies that they can't recalibrate their opinions."
This goes back to a comment I dropped in another article you wrote about a year back. The book is Superforecasting and the reference in there is to a study on experts and their predictions. Two sets of experts were asked to predict an outcome. One set of experts publicly proclaimed their predicted outcomes; the other set kept their predictions private.
Then new facts were introduced that made the predicted outcome more or less likely. The experts were periodically asked whether they wanted to revise their predictions, more facts were introduced, the experts were re-queried, and so on. Eventually, there was an actual outcome.
The experts who publicly proclaimed their predictions were more than twice as likely to refuse to change them, even when presented with new facts that undermined the likelihood of their prediction. This held even when they were told the actual outcome was the opposite of what they had predicted; some who had publicly staked themselves to the wrong outcome still refused to concede.
This is what we see with social media. People publicly take a position on issues. Then they refuse to change their minds, even when confronted with new facts that undermine the position they've taken. It's actually even broader than that. People publicly stake themselves to this or that echo chamber. Then, they refuse to believe the echo chamber could have led them astray.
"And that people are so invested in these lies that they can't recalibrate their opinions."
This goes back to a comment I dropped in another article you wrote about a year back. The book is Superforecasting and the reference in there is to a study on experts and their predictions. Two sets of experts were asked to predict an outcome. One set of experts publicly proclaimed their predicted outcomes; the other set kept their predictions private.
Then, new facts were introduced that affected whether the predicted outcome was more likely to be accurate or inaccurate. Experts were periodically asked whether they wanted to revise their predictions, then new facts would be introduced and the experts re-queried, and so on. Eventually, there was an outcome.
The experts who publicly proclaimed their predictions were more than twice as likely to refuse to change their prediction, even when presented with new facts that undermined the likelihood of their prediction. This was true even when they were told that the actual outcome was the opposite of what they predicted -- some who publicly staked themselves to the opposite outcome refused to concede.
This is what we see with social media. People publicly take a position on issues. Then they refuse to change their minds, even when confronted with new facts that undermine the position they've taken. It's actually even broader than that. People publicly stake themselves to this or that echo chamber. Then, they refuse to believe the echo chamber could have led them astray.
"It's the social media, stupid."
Is this the book?
https://www.amazon.com/Superforecasting-Science-Prediction-Philip-Tetlock/dp/0804136718
That's it. Good read. 100% non-political.