Location
Culp Center Ballroom
Start Date
4-25-2023 9:00 AM
End Date
4-25-2023 11:00 AM
Poster Number
66
Faculty Sponsor’s Department
Sociology & Anthropology
Name of Project's Faculty Sponsor
Candace Bright
Additional Sponsors
Kelly N. Foster, Ph.D., Department of Sociology and Anthropology, College of Arts and Sciences, East Tennessee State University, Johnson City, TN.
Competition Type
Competitive
Type
Poster Presentation
Project's Category
Sociology
Abstract or Artist's Statement
This research was motivated by growing concerns over social media platforms' roles in presenting conspiracy theories as compelling alternative narratives about contemporary social events. Specifically, this research asks whether Facebook's algorithm autonomously recommends groups organized around conspiracy theories about contemporary political and social events to end users who have not previously indicated interest in those topics. Based on coverage of biases in Facebook's content ranking and recommendation algorithms surrounding the 2020 presidential election, it was hypothesized that, over time, Facebook would increasingly recommend conspiratorial content to its users in order to retain them on the platform. Eight test profiles were created under two observation protocols over the course of the fall 2022 semester, and each profile was restricted in how it interacted with the Facebook platform. In creating the first three profiles, only the information necessary for account creation was provided: name, gender, birthdate, phone number, and email address. The first profile only viewed the first twelve recommended groups without joining them; the second viewed and joined the first twelve recommended groups; and the third viewed and joined the first twelve recommended groups and then left five randomly selected groups joined the previous day. A second protocol was designed to expedite the observation process and expand the sample of groups observed per profile. Five profiles were created under this protocol; each followed two Facebook pages prior to observation and varied slightly in its demographic information. These profiles viewed and followed only the first recommended group, then refreshed the recommendation list to generate updated groups in real time. Over the course of this project, Facebook did not recommend to any of the test profiles a single group that principally discussed or promoted conspiracy theories. Instead, Facebook was found to recommend groups that are popular in the user's local area first, but only until the user had indicated enough unique interests to begin steering the recommendations: each profile was initially recommended a geolocation-based set of groups, but within days of observation those groups were no longer chiefly recommended, and user behavior proved heavily influential in determining which groups and topics were recommended. Second, across all profiles, the topics featured among the recommended groups homogenized over the course of observation, to the exclusion of all other topics. Further research should examine whether this homogenization is a function of the profiles' few user inputs or of Facebook's typical behavior more broadly; if the latter, it may contribute to the propensity for individual users to arrive in digital echo chambers.
Follow the Algorithm: Assessing Facebook's Group Recommendation Behavior Regarding Conspiracy Theories and Echo Chambers