From Thanksgiving dinner conversations to pop culture rants, it’s easy to feel like people with different political ideologies occupy vastly different worlds, especially online. People often blame algorithms—the invisible sets of rules that shape online landscapes from social media to search engines—for sealing us inside digital “filter bubbles” by feeding us content that reinforces our pre-existing worldview.
Algorithms are certainly not free of bias: studies have shown that Facebook ads target specific racial and gender demographics, dating apps select matches based on a user’s past activity, and search engines rank links by what they deem most relevant. But not every algorithm drives political polarization, according to a new study.
A study published today in Nature found that Google’s search engine does not return disproportionately partisan results. Instead, politically polarized Google users tend to isolate themselves by clicking on links to partisan news sites. These results suggest that, at least when it comes to Google searches, it may be easier for people to escape online echo chambers than previously thought, but only if they choose to do so.
Algorithms permeate almost every aspect of our online existence and have the power to shape how we view the world around us. “They have some influence on how we consume information and therefore how we form opinions,” says Katherine Ognyanova, a communication researcher at Rutgers University and co-author of the new study.
But it can sometimes be difficult to quantify how much these programs contribute to political polarization. The algorithm can take into account “who you are, where you are, what device you are searching from, geography, language,” says Ognyanova. “But we don’t know exactly how the algorithm works. It’s a black box.”
Most research on the political polarization caused by algorithms has focused on social media platforms such as Twitter and Facebook rather than search engines. This is because, until recently, it was easier for researchers to obtain useful data from social networking sites through their public APIs. “There is no such tool for search engines,” says Daniel Trielli, an assistant professor of media and democracy at the University of Maryland who was not involved in the study.
But Ognyanova and her co-authors found a way around this problem. Instead of relying on anonymized public data, they had volunteers install a browser extension that logged all of their Google searches, and the links they clicked from those results pages, over several months. The extension acted like the camera traps that photograph animals in a backyard—in this case, it provided snapshots of everything populating each participant’s online landscape.
The researchers collected data from hundreds of Google users over the three months before the 2018 US midterm elections and the nine months before the 2020 US presidential election. They then analyzed what they had gathered by participants’ age and self-reported political orientation, ranked on a scale of one to seven, from strong Democrat to strong Republican. Yotam Shmargad, a sociologist at the University of Arizona who was not a member of the research team, calls the approach of pairing real behavioral data on participants’ search activity with survey information about their political leanings “innovative.”
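To give a rough sense of the kind of aggregation described above, here is a minimal, hypothetical Python sketch—not the authors’ actual code or data. It assumes a table of logged clicks in which each row carries a participant’s self-reported ideology score (one to seven) and a partisanship score for the clicked news domain; the file name and column names are invented for illustration.

```python
# Hypothetical sketch: average the partisanship of clicked links
# within each self-reported ideology group. File and column names
# are placeholders, not the study's real data schema.
import pandas as pd

# Assumed columns: participant_id, ideology (1 = strong Democrat ... 7 = strong Republican),
# domain_partisanship (e.g., -1 = left-leaning ... +1 = right-leaning)
clicks = pd.read_csv("clicks.csv")

# If more partisan users click more partisan results, the mean clicked
# partisanship should shift as the ideology score moves from 1 to 7.
by_ideology = (
    clicks.groupby("ideology")["domain_partisanship"]
    .mean()
    .rename("mean_clicked_partisanship")
)

print(by_ideology)
```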
This type of field data is also extremely valuable from a policy-making standpoint, says University of Pennsylvania cybersecurity researcher Homa Hosseinmardi, who was also not involved in the study. To ensure that search engine giants such as Google, which handles more than 8.5 billion queries per day, are acting with people’s interests in mind, it is not enough to know how an algorithm works. “You need to see how people use the algorithm,” says Hosseinmardi.
While many lawmakers are currently pushing for major tech companies to release anonymized user data publicly, some researchers fear that this will encourage platforms to share misleading, distorted or incomplete information. One notable case occurred when Meta enlisted a group of scholars to study the platform’s relationship to democracy and political polarization but then failed to provide some of the data it had promised to share. “I think it makes more sense to go directly to the user,” says Ronald Robertson, a network scientist at Stanford University and lead author of the new study.
Ultimately, the team found that a quick Google search did not provide users with a selection of news stories based on their political views. “In general, Google doesn’t personalize that much,” says Robertson. “And if the personalization is low, then maybe the algorithm doesn’t change the page that much.” Instead, users with strong biases were more likely to click on biased links that matched their pre-existing worldview.
This does not mean that Google’s algorithm is flawless. The researchers noticed that unreliable or outright misleading news sources still showed up in the results, regardless of whether users had interacted with them. “There are other contexts where Google has done some pretty problematic things,” says Robertson, including the dramatic underrepresentation of women of color in its image search results.
Google did not immediately respond to a request for comment on the new study.
Shmargad notes, however, that the results are not entirely free of bias when examined at a more granular level. “There doesn’t seem to be much algorithmic bias along party lines,” he says, “but there may be some algorithmic bias across age groups.”
Users aged 65 and older were exposed to more right-leaning links in their Google search results than other age groups, regardless of their political affiliation. Because the effect was small, however, and the oldest age group accounted for only about one-fifth of participants, its impact on the overall results washed out in the aggregate analysis.
Still, the findings add to a growing body of research suggesting that the role of algorithms in creating political bubbles may be overstated. “I don’t mind blaming the platforms,” Trielli says. “But it’s a little disconcerting to learn that it’s not just a matter of making sure the platforms behave well. Our personal motivations to filter what we read to fit our political biases remain strong.”
“We also want to be divided,” Trielli adds.
On the positive side, Ognyanova says, “this study shows that it’s not that hard for people to escape their [ideological] bubble.” That may be true. But first they have to want to escape it.