Week 4

Computational Sociology

Christopher Barrie

Introduction

  1. Housekeeping
  2. Misinformation and fake news

Introduction: Misinformation and fake news

  • When was the last time you read some misinformation online?

Introduction: Misinformation and fake news

  • Do you think others are more exposed than you?
    • Why?

Introduction: Misinformation and fake news

  • When was the last time you believed some misinformation/fake news online?
    • Do you think it affected you?
      • How?

Why it matters

But what is it?

Often abused terms…

Misinformation: false information, spread knowingly or unknowingly

Disinformation: knowingly false information often spread to advance a particular cause or viewpoint

Fake news: knowingly false information often designed to look like real news

Some examples

Misinformation: the claim that exiting the European Union would deliver £350m a week for the NHS

Some examples

Disinformation: the claim that Russia’s invasion of Ukraine is a result of US treaty non-compliance and NATO aggression

Some examples

Fake news: the claim that Ukraine is organizing a nuclear strike on Russia

So what questions should we be asking?

  1. Do we all get exposed to misinformation/fake news?
  2. What are the effects at the:
    1. Individual level?
    2. Institutional level?

So what questions should we be asking?

  1. Do we all get exposed to misinformation/fake news?

Exposure heterogeneity

Different types of people are more/less likely to be exposed to misinformation and fake news

Different types of people are more/less likely to believe misinformation and fake news

Exposure heterogeneity

In Allen et al. (2020) we see that:

  • Older individuals are more likely to consume fake news

  • 2% consumed more fake news than mainstream news

    • But only 0.7% spent more than one minute per day on it

So is 2% a big number or a small number?

What do we have to consider here?

  1. The (voting) population of the US
  2. The size of the information ecosystem
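
To make the first consideration concrete, here is a minimal back-of-the-envelope sketch in Python. The figure of roughly 230 million eligible US voters is an illustrative assumption, not a number reported in Allen et al. (2020).

    # Back-of-the-envelope: how many people do these percentages represent?
    # NOTE: 230 million eligible US voters is an illustrative assumption,
    # not a figure taken from Allen et al. (2020).
    eligible_voters = 230_000_000
    share_more_fake_than_mainstream = 0.02   # consumed more fake news than mainstream news
    share_over_one_minute = 0.007            # spent more than one minute per day on fake news

    print(f"{eligible_voters * share_more_fake_than_mainstream:,.0f} people")  # ~4,600,000
    print(f"{eligible_voters * share_over_one_minute:,.0f} people")            # ~1,610,000

A "small" percentage of a very large population is still millions of people, which is why both the denominator and the size of the wider information ecosystem matter when judging whether 2% is big or small.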

Exposure heterogeneity

Do these trends generalize?

Exposure heterogeneity

See A. Guess, Nagler, and Tucker (2019)

Exposure heterogeneity

It’s not just age…

But also cognitive traits: lower cognitive reflection is associated with greater susceptibility to fake news. See Pennycook and Rand (2019) for more details.

Exposure heterogeneity

And partisan selective exposure (cf. Week 2) extends to fake news too:

Exposure heterogeneity

Confirmed by other studies too, such as Osmundsen et al. (2021).

A summary so far…

So consumption:

  1. Differs by age
  2. Differs by partisan position
  3. Differs by cognitive traits (reflection)

But what does all this do?

In other words: does consumption of fake news/misinformation have consequences for politics/society/democracy?

One hypothesis:

  • Despite its small overall consumption share, misinformation has outsized effects among a small subpopulation

But what does all this do?

Some evidence from A. M. Guess et al. (2020)

But what does all this do?

Though some experimental evidence from A. M. Guess et al. (2020) suggests an effect on voting but not on other types of political participation (or on trust in media).

But what does all this do?

Though other work by Bail et al. (2019) shows that exposure to misinformation from Internet Research Agency trolls had no effect on important political attitudes and behaviours.

In summary

  • Different types of people are exposed to different types of untrustworthy content
  • The size of this consumption is likely small
    • But what is small after all…?
  • The effects are most obvious for beliefs
    • More work needed on political behavioural effects

A note on computational thinking

This week:

  • We see what’s measurable (and what isn’t) using digital data
    • e.g., with the Allen et al. (2020) article (see the measurement sketch after this list)
  • We see how we can extend and validate theoretical models by using digital trace data
    • e.g., with the article by Vosoughi, Roy, and Aral (2018)
  • We see how we can combine multiple sources of information to approach an approximate solution to a question
    • e.g., with the article by A. M. Guess, Nyhan, and Reifler (2020)
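
As a concrete illustration of the first point, below is a minimal sketch (in Python, using pandas) of the kind of measurement in Allen et al. (2020): computing the share of each person's news diet that is fake news from digital trace data and comparing it across age groups. The data, column names, and values here are hypothetical stand-ins, not the authors' actual data or code.

    import pandas as pd

    # Hypothetical trace data: minutes spent per person per content category.
    trace = pd.DataFrame({
        "person_id": [1, 1, 2, 2, 3, 3],
        "age_group": ["18-29", "18-29", "65+", "65+", "65+", "65+"],
        "category":  ["mainstream", "fake_news"] * 3,
        "minutes":   [120, 0.5, 90, 3, 60, 5],
    })

    # Share of each person's news diet classified as fake news.
    totals = trace.groupby(["person_id", "age_group"])["minutes"].sum()
    fake = (trace[trace["category"] == "fake_news"]
            .groupby(["person_id", "age_group"])["minutes"].sum())
    share = (fake / totals).rename("fake_news_share").reset_index()

    # Compare average shares across age groups (cf. the age differences above).
    print(share.groupby("age_group")["fake_news_share"].mean())

The point is not the toy numbers but the logic: once consumption is observable at the level of individual trace records, questions like "who consumes fake news, and how much?" become directly measurable.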

References

Allen, Jennifer, Baird Howland, Markus Mobius, David Rothschild, and Duncan J. Watts. 2020. “Evaluating the Fake News Problem at the Scale of the Information Ecosystem.” Science Advances 6 (14): eaay3539. https://doi.org/10.1126/sciadv.aay3539.
Bail, Christopher A., Brian Guay, Emily Maloney, Aidan Combs, D. Sunshine Hillygus, Friedolin Merhout, Deen Freelon, and Alexander Volfovsky. 2019. “Assessing the Russian Internet Research Agency's Impact on the Political Attitudes and Behaviors of American Twitter Users in Late 2017.” Proceedings of the National Academy of Sciences 117 (1): 243–50. https://doi.org/10.1073/pnas.1906420116.
Guess, Andrew M., Dominique Lockett, Benjamin Lyons, Jacob M. Montgomery, Brendan Nyhan, and Jason Reifler. 2020. “Fake News May Have Limited Effects on Political Participation Beyond Increasing Beliefs in False Claims.” Harvard Kennedy School Misinformation Review, January. https://doi.org/10.37016/mr-2020-004.
Guess, Andrew M., Brendan Nyhan, and Jason Reifler. 2020. “Exposure to Untrustworthy Websites in the 2016 US Election.” Nature Human Behaviour 4 (5): 472–80. https://doi.org/10.1038/s41562-020-0833-x.
Guess, Andrew, Jonathan Nagler, and Joshua Tucker. 2019. “Less Than You Think: Prevalence and Predictors of Fake News Dissemination on Facebook.” Science Advances 5 (1). https://doi.org/10.1126/sciadv.aau4586.
Osmundsen, Mathias, Alexander Bor, Peter Bjerregaard Vahlstrup, Anja Bechmann, and Michael Bang Petersen. 2021. “Partisan Polarization Is the Primary Psychological Motivation Behind Political Fake News Sharing on Twitter.” American Political Science Review 115 (3): 999–1015. https://doi.org/10.1017/s0003055421000290.
Pennycook, Gordon, and David G. Rand. 2019. “Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning Than by Motivated Reasoning.” Cognition 188 (July): 39–50. https://doi.org/10.1016/j.cognition.2018.06.011.
Vosoughi, Soroush, Deb Roy, and Sinan Aral. 2018. “The Spread of True and False News Online.” Science 359 (6380): 1146–51. https://doi.org/10.1126/science.aap9559.