Social media algorithms are currently designed to show you exactly what you want to see, creating a comfortable loop of validation that often hardens beliefs rather than challenging them.
But new research from the University of Rochester suggests that this “echo chamber” effect is not an inevitable fact of online life — it is a design choice that can be fixed with a simple tweak: adding a little randomness.
In a study published in IEEE Transactions on Affective Computing, an interdisciplinary team of computer scientists and political scientists found that introducing random variations into news feeds significantly weakens the feedback loops that drive polarisation.
“The most dangerous feeds are not the ones that upset us, but the ones that convince us we are always right,” says Adiba Mahbub Proma, a PhD student and the study’s first author.
The randomness cure
The researchers recruited 163 participants to use simulated social media platforms. Some users were given feeds modelled on traditional algorithms that prioritise engagement and agreement, while others saw feeds that incorporated a degree of randomness.
The study clarified that “randomness” does not mean filling a feed with nonsense or irrelevant content. Instead, it involves loosening the strict logic that dictates “show me more of what I already agree with”.
In the experiment, users were periodically exposed to opinions and connections they had not explicitly chosen. The results showed that when the algorithm stopped catering perfectly to a user’s existing biases, the rigidity of their beliefs decreased.
“Across a series of experiments, we find that what people see online does influence their beliefs, often pulling them closer to the views they are repeatedly exposed to,” says Proma. “But when algorithms incorporate more randomisation, this feedback loop weakens. Users are exposed to a broader range of perspectives and become more open to differing views.”
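The paper does not publish its feed-generation code, but the mechanism described above can be sketched as a simple "epsilon-style" mix: rank candidate posts by predicted agreement, then randomly replace a fraction of the top slots with items the ranking would otherwise have filtered out. Everything here — the function name, the `epsilon` parameter, the affinity scores — is illustrative, not the study's actual algorithm:

```python
import random

def build_feed(candidates, affinity, k=10, epsilon=0.3, seed=None):
    """Illustrative sketch: an engagement-style ranking with a
    tunable dose of randomness. `epsilon` is the fraction of the
    top-k slots handed to items outside the agreement-ranked top.
    With epsilon=0 this collapses to a pure echo-chamber feed."""
    rng = random.Random(seed)
    # Engagement-style ranking: most-agreeable posts first.
    ranked = sorted(candidates, key=affinity, reverse=True)
    top, rest = ranked[:k], ranked[k:]
    n_random = int(epsilon * k)
    if rest and n_random:
        # Swap randomly chosen outside items into random feed slots,
        # loosening the "more of what you already agree with" logic.
        swapped_in = rng.sample(rest, min(n_random, len(rest)))
        slots = rng.sample(range(len(top)), len(swapped_in))
        for slot, item in zip(slots, swapped_in):
            top[slot] = item
    return top
```

Exposing `epsilon` as a user-facing setting is one way such a design could add variety while leaving people in control of how much their feed is shaken up.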
A design choice
The team, led by Professor Ehsan Hoque, argues that current recommendation systems steer users into echo chambers because they optimise for engagement, and divisive, agreeable content tends to attract the most of it.
However, the solution does not require eliminating personalisation entirely. The researchers recommend a design shift that introduces variety while still allowing users to maintain control over their feeds.
“If your feed feels too comfortable, that might be by design,” Proma warns. “Seek out voices that challenge you.”