The use of news recommendation algorithms is becoming more widespread, raising concerns that their implementation could lead to a “filter bubble”: a biased, closed information environment tailored to individual preferences. Previous literature presents mixed findings on whether recommendation algorithms contribute to constructing filter bubbles. While one body of literature has confirmed that an algorithmic environment can encourage attitude polarization and a homogeneous information environment, other research has shown contrasting evidence that such an environment can increase incidental exposure to diverse information and cross-cutting viewpoints. Each body of literature offers valid points about recommendation algorithms and their potential to create filter bubbles; however, little work has considered users’ agency in using algorithms and how individuals’ use of algorithms can make a difference in information diversity. Drawing on the possibility that users’ understanding and perceptions of algorithms can influence how they use them, this study explored whether users’ information environments are shaped distinctively by that understanding and those perceptions. We first examined the relationship between the level of algorithm understanding and three frequently discussed algorithm perceptions: convenience, privacy risk, and bias. Convenience perception is defined as individuals’ belief that using a recommendation algorithm service enables them to effortlessly locate the most recent information aligned with their desires and needs, ultimately helping them achieve their goals. Privacy risk perception concerns users’ recognition that the recommendation algorithm may compromise personal information while collecting and analyzing users’ private data.
Bias perception pertains to the awareness that news recommendation algorithms may hinder users’ exposure to a diverse array of social issues, potentially leading to the consumption of information that predominantly emphasizes one-sided perspectives or political viewpoints. We also examined how each of these algorithm perceptions is associated with three predictors of a closed information environment: (1) exposure to attitude-consistent news, (2) news trust (trust in algorithm-recommended news), and (3) news-seeking behavior. An analysis of 1,169 online survey responses revealed that, first, individuals with a higher understanding of news recommendation algorithms perceived greater convenience and bias in algorithms, but algorithm understanding had no significant relationship with privacy risk perception. Second, convenience perception was positively related to algorithm-accepting behavior, which in turn was associated with increased exposure to attitude-consistent news. Convenience perception also showed positive relationships with both news trust and news-seeking behavior. Meanwhile, privacy risk and bias perceptions, neither of which was significantly related to news trust, had negative relationships with attitude-consistent news exposure through lowered algorithm-accepting behavior. Bias perception showed a positive relationship with news-seeking behavior, whereas privacy risk perception was negatively linked to it. These findings confirm the importance of users’ algorithm understanding and perceptions in achieving information diversity in a customized environment. They offer meaningful insights into how users’ perceptions of news recommendation algorithms should be enhanced, and what corrections are needed, to foster a more diverse information environment. Implications for algorithm literacy education are also discussed.