Research from Pompeu Fabra University (UPF) has confirmed a rise in ideological polarisation and in the sharing of biased or false news on Facebook. The study analysed over 6 million news-related URLs from 1,231 different US media domains shared between 2017 and 2020.
The upward trend in ideological polarisation and the dissemination of false or biased news coincides with changes to the platform that altered how information is presented to users and gave greater weight to certain types of interactions in the ranking algorithm.
Emma Fraxanet, a researcher in computational social sciences in the UPF Department of Engineering and lead author of the study, measured the level of engagement with posts containing those URLs using a combination of metrics, including clicks, shares, likes, comments and other reactions. The team then characterised engagement in relation to user ideology and news-source quality.
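The idea of combining several interaction counts into a single engagement figure can be sketched as follows. The equal weights below are an assumption for illustration; the article does not specify the study's actual formula.

```python
# Illustrative composite engagement score. The per-metric weights are
# hypothetical; the study combines clicks, shares, likes, comments and
# other reactions, but its exact weighting is not given in the article.

def engagement_score(post, weights=None):
    """Weighted sum of a post's interaction counts."""
    if weights is None:
        weights = {"clicks": 1.0, "shares": 1.0, "likes": 1.0,
                   "comments": 1.0, "reactions": 1.0}
    return sum(weights[m] * post.get(m, 0) for m in weights)

post = {"clicks": 120, "shares": 15, "likes": 80, "comments": 10, "reactions": 5}
print(engagement_score(post))  # 230.0
```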
An ideological gap
The analysis observed an ideological gap in the news consumed by conservatives and liberals and tracked its evolution over time. The research examined two significant changes to the platform, implemented in 2018 and 2020, which were revealed in a data leak that sparked public outcry over Facebook’s algorithm.
In both instances, the changes were followed by shifts in engagement patterns, an increase in ideological polarisation, and greater sharing of low-quality news. However, the researchers note that in neither case can a direct causal relationship be established between the platform changes and these shifts, an issue to be explored in future studies.
The researchers describe the engagement patterns as U-shaped: engagement is higher among users with more extreme ideological leanings than among moderates. Because content with higher engagement tended to gain visibility after the 2018 and 2020 changes, this could explain the upward trend in more extreme ideological content. The most biased content also comes from the lowest-quality sources.
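The U-shaped pattern can be illustrated with toy numbers. The ideology scale (-1 = far left, +1 = far right), the user sample and the engagement values below are all invented; only the qualitative shape follows the article.

```python
# Toy illustration of U-shaped engagement: mean engagement is higher at
# the ideological extremes than among moderates. All numbers are
# hypothetical; only the shape of the pattern reflects the study.

users = [
    (-0.9, 50), (-0.8, 45),            # strongly left-leaning, high engagement
    (-0.1, 10), (0.0, 8), (0.1, 12),   # moderates, low engagement
    (0.8, 48), (0.9, 52),              # strongly right-leaning, high engagement
]

def mean_engagement(lo, hi):
    """Mean engagement of users whose ideology score falls in [lo, hi]."""
    vals = [e for ideo, e in users if lo <= ideo <= hi]
    return sum(vals) / len(vals)

extremes = (mean_engagement(-1.0, -0.5) + mean_engagement(0.5, 1.0)) / 2
moderates = mean_engagement(-0.3, 0.3)
print(extremes > moderates)  # True: the curve is higher at both ends
```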
In 2018, the platform decreased the weight attached to likes and increased that of shares and comments in the ranking algorithm to foster more meaningful interactions between family and friends. The study shows that following this change, user ideologies became more polarised and engagement with lower-quality content increased.
In 2020, Facebook decreased the weight of shares and increased that of comments in the ranking algorithm. The platform's reasons for doing so are not altogether clear, although the intention appears to have been to limit toxic or low-quality content. Although both ideological polarisation and engagement with lower-quality content rose again, overall activity on the platform dropped significantly.
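The effect of re-weighting interaction types on a post's ranking score can be sketched as follows. The weight values are invented for illustration; only the direction of each change follows the article (2018: likes down, shares and comments up; 2020: shares down, comments up).

```python
# Hypothetical sketch of how changing per-interaction weights alters a
# post's ranking score. The numeric weights are assumptions, not
# Facebook's actual values.

def ranking_score(post, weights):
    """Weighted sum of a post's interactions under a given weighting."""
    return sum(weights[k] * post[k] for k in weights)

post = {"likes": 100, "shares": 20, "comments": 10}

weights_pre_2018 = {"likes": 1.0, "shares": 1.0, "comments": 1.0}  # assumed
weights_2018 = {"likes": 0.5, "shares": 2.0, "comments": 2.0}      # assumed
weights_2020 = {"likes": 0.5, "shares": 1.0, "comments": 3.0}      # assumed

for era, w in [("pre-2018", weights_pre_2018),
               ("2018 change", weights_2018),
               ("2020 change", weights_2020)]:
    print(era, ranking_score(post, w))
```

Under such a scheme, the same post is ranked differently in each era, which is how re-weighting can shift which content gains visibility.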
The study notes that another reason for such polarisation is user behaviour, with users tending to consume content related to their ideological leanings. The findings suggest that since the platform implemented changes, the differences between the news diets of liberals and conservatives have grown, making it harder to find common ground for democratic debate.
The research was published in the journal EPJ Data Science. Co-authors include Vicenç Gómez, member of the Artificial Intelligence and Machine Learning Research Group, Andreas Kaltenbrunner of UOC and Fabrizio Germano of UPF and BSE.