
Data science, infodemics, and misinformation spreading: Deciphering the role of digital information ecosystems

Data science and infodemics

Sapienza, City, and Ca’ Foscari researchers investigate large-scale social dynamics to analyze and understand online users’ behavior. 

As part of the IRIS Academic Research Group’s ‘Learn’ research stream, Prof Quattrociocchi (Sapienza University), Prof Baronchelli (City University of London), and Prof Zollo (Ca’ Foscari University of Venice) and their teams are developing ad-hoc data-driven models and metrics to investigate the spread of misinformation online and identify trends, topics, and polarization dynamics.  

Recognizing the nature and role of infodemics to improve epidemic management 

The COVID-19 pandemic caused an unprecedented global health crisis that highlighted the complexity of devising a suitable public emergency response and the need for a more efficient exchange between science and policymaking.

The pandemic also underlined the critical role of information diffusion in a disintermediated news cycle. In particular, information dynamics can impact the epidemic process itself and alter the effectiveness of the countermeasures deployed by governments.

As such, the ability to monitor infodemics as they unfold is key to designing effective interventions that anticipate and account for misinformation-driven public reactions. But the research and evidence base to enable this has a long way to go. 

Photo by Gabriella Clare Marino

Shaping the strategic response to misinformation 

Early misinformation research gave much attention to the 'fake news' problem, which framed the strategic response as a competition between true and false information (e.g., fact-checking). Newer research shows that this dichotomized perspective is too limited and risks oversimplifying how misinformation works. These maturing and expanding insights are important to ensure that future misinformation interventions (e.g., behavioral, technological) are appropriately developed and applied.

To illustrate this shift, consider a 2018 study, limited to Twitter, which claimed that fake news traveled faster than real news. The researchers identified 'human psychology' as an important driver but did not explore it further within the study, nor did they examine the platform's peculiarities (its interaction paradigm and community). We now know that many factors affect information spreading on social media platforms. However, other dimensions, such as the impact of information on users' opinions, remain unclear.

Architects and architecture of opinion polarization in social media

One way forward is to study online polarization, a dynamic phenomenon that may foster misinformation spreading. Research has identified technological and cognitive features that can constrain the information ecosystem and potentially propagate misinformation.

On the technical side, the social media business model plays a key role in the information ecosystem, and (mis)information spreading inherits the properties of that specific environment. Cognitively, our attention span is limited, and feed algorithms might further limit our content selection process by suggesting content similar to the types we are usually exposed to. 

Further, users tend to look for (and like) information adhering to their worldviews, ignore content dissenting from their beliefs, and join groups of like-minded individuals around a shared narrative, with the potential to become echo chambers.

We can broadly define echo chambers as environments in which users’ opinions, political leaning, or beliefs about a topic are reinforced due to repeated interactions with peers or sources having similar tendencies and attitudes.  

Together, selective exposure and confirmation bias (i.e., the tendency to seek information adhering to pre-existing opinions) may explain the emergence of echo chambers and polarization on social media platforms. 
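How interaction rules shaped by confirmation bias can produce echo-chamber-like clustering is nicely illustrated by the classic Deffuant–Weisbuch bounded-confidence model. This is not the authors' own model, only a minimal sketch: agents hold opinions in [0, 1] and only influence each other when they already agree within a confidence bound, a stylized form of selective exposure.

```python
import random

def deffuant_step(opinions, epsilon=0.2, mu=0.5):
    """One interaction of the Deffuant-Weisbuch bounded-confidence model.

    Two random agents compare opinions and move closer only if they
    already agree within the confidence bound epsilon -- a stylized
    form of selective exposure and confirmation bias.
    """
    i, j = random.sample(range(len(opinions)), 2)
    if abs(opinions[i] - opinions[j]) < epsilon:
        delta = mu * (opinions[j] - opinions[i])
        opinions[i] += delta
        opinions[j] -= delta
    return opinions

def simulate(n_agents=200, steps=50_000, epsilon=0.2, seed=42):
    """Run the model from uniformly random initial opinions."""
    random.seed(seed)
    opinions = [random.random() for _ in range(n_agents)]
    for _ in range(steps):
        deffuant_step(opinions, epsilon)
    return opinions

if __name__ == "__main__":
    final = simulate()
    # With a narrow confidence bound, opinions collapse into a few
    # separated clusters -- a toy analogue of echo chambers.
    print(sorted({round(o, 1) for o in final}))
```

With a wide confidence bound the population converges to consensus; narrowing it fragments the population into non-interacting opinion clusters, which is the qualitative signature the echo-chamber literature describes.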

To examine these potential interactions, we explored how information influences users' opinions and behavior and how different social media platforms shape the related social dynamics. We found that consuming content online does not necessarily reflect a change in users' attitudes and that different social media platforms produce different polarization dynamics.

Photo by Rodion Kutsaev

Our infodemiology research – Personal choice, Preparedness, Polarization, and Policies 

In a joint effort with experts from the WHO and CDCs, we investigated unresolved issues in the study of infodemics and their relationship with epidemic management, and proposed research directions to enhance preparedness for future health crises. 

To date, we have looked at how infodemics might influence health behaviors. We have explored the interplay between an overabundance of information and vaccine acceptance, finding that the two quantities are largely decoupled.  

We have also spotlighted the social media activity around contemporary – and often contentious – global health events to see what happens. At the UN Conference of the Parties on Climate Change (COP), we examined how the online discussion around climate change evolved and, during the 2021 COP26, found a significant increase in polarization. With future climate action reliant on negotiations at COP27 and onwards, our results highlight the importance of monitoring polarization in the public discourse about climate and how this may impact political action.

We are also looking at the role of social media infrastructure and design, building models to assess the relationship between feed algorithms and the resulting social dynamics, especially opinion polarization, in order to predict how the online environment will evolve.
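To give a feel for this kind of modeling, the sketch below (an illustrative toy, not one of our actual models) compares a neutral feed that serves random content against a personalised feed that serves the content closest to each user's current opinion. All parameters and names are hypothetical.

```python
import random
import statistics

def run_feed(personalised, n_users=200, n_items=500, rounds=300,
             mu=0.1, k=10, seed=7):
    """Toy model of a feed algorithm acting on user opinions in [-1, 1].

    Each round every user consumes one item and moves a fraction mu
    toward its position. A personalised feed serves the item closest
    to the user's opinion out of k random candidates; a neutral feed
    serves a uniformly random item.
    """
    rng = random.Random(seed)
    items = [rng.uniform(-1, 1) for _ in range(n_items)]
    users = [rng.uniform(-1, 1) for _ in range(n_users)]
    for _ in range(rounds):
        for u in range(n_users):
            if personalised:
                candidates = rng.sample(items, k)
                item = min(candidates, key=lambda x: abs(x - users[u]))
            else:
                item = rng.choice(items)
            users[u] += mu * (item - users[u])
    return users

if __name__ == "__main__":
    # Spread of final opinions across users: the neutral feed pulls
    # everyone toward the average, the personalised feed keeps users
    # near their starting positions, preserving opinion spread.
    print(f"neutral feed:      {statistics.pstdev(run_feed(False)):.3f}")
    print(f"personalised feed: {statistics.pstdev(run_feed(True)):.3f}")
```

Even this crude setup reproduces the qualitative effect under study: similarity-based recommendation sustains opinion spread, while unbiased exposure drives opinions toward consensus.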

Recently, the rumors about the acquisition of Twitter by Elon Musk raised concerns about the limits of free speech and the moderation policies implemented by social media platforms. In our latest paper, we explored how different moderation policies impact the behavior of users and their tendency to form groups around shared narratives.

The next envisioned steps are understanding who or what drives the agenda in a multi-platform environment, diving into the interplay between information and adoption of beliefs, and disentangling the very nature of online group polarization. 

Watch this space! 



EU strategy: Media Literacy for Democracy

20 January, 2023
Online (Free tickets)
Taking place as part of the EU strategy for media literacy, Marco Delmastro, a social scientist supporting IRIS Academic’s infodemic research pillar, will join European panellists to discuss the phenomena of mis- and disinformation and the threat they pose to European democracy and well-being.