Award Abstract # 1217212
III: HCC: Small: Effects of Automated Information Selection and Presentation in Online Information Systems

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: MICHIGAN STATE UNIVERSITY
Initial Amendment Date: August 31, 2012
Latest Amendment Date: July 14, 2017
Award Number: 1217212
Award Instrument: Standard Grant
Program Manager: Maria Zemankova
IIS
 Division of Information & Intelligent Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2012
End Date: August 31, 2017 (Estimated)
Total Intended Award Amount: $486,093.00
Total Awarded Amount to Date: $502,093.00
Funds Obligated to Date: FY 2012 = $486,093.00
FY 2013 = $16,000.00
History of Investigator:
  • Emilee Rader (Principal Investigator)
    ejrader2@wisc.edu
Recipient Sponsored Research Office: Michigan State University
426 AUDITORIUM RD RM 2
EAST LANSING
MI  US  48824-2600
(517)355-5040
Sponsor Congressional District: 07
Primary Place of Performance: Michigan State University
430 Communication Arts & Science
East Lansing
MI  US  48824-1212
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): R28EKN92ZTZ9
Parent UEI: VJKZC4D1JN36
NSF Program(s): Info Integration & Informatics,
HCC-Human-Centered Computing
Primary Program Source: 01001213DB NSF RESEARCH & RELATED ACTIVITIES
01001314DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 7364, 7367, 7923, 9251
Program Element Code(s): 736400, 736700
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Socio-technical systems provide access to ever-increasing quantities of information online. To help people cope with information overload, these systems implement "algorithmic curation": automated selection of what content should be displayed to users, what should be hidden, and how it should be presented. Virtually every Internet user who reads online news, visits social media sites, or uses a search engine has encountered algorithmic curation at some point, probably without even realizing it. In a socio-technical system, user contributions, social relationships and behavior, and features of the technology are interdependent, and determine what the system is used for, how it is used, and how it evolves over time. The goal of this research project is to investigate the relationship between social behavior and algorithmic curation, in order to better predict the effects of this pervasive practice on what we read, contribute, and communicate about online.

This project uses a multi-method approach to identify ways in which social and technical mechanisms influence individual users' information production and consumption, and thereby shape system-level properties of the user population and the corpus of contributions. Lab experiments investigate how social processes, such as obeying social norms and altering communications for an intended audience, are affected by different types of algorithmic curation. Field studies augment the lab experiments, using technology interventions to demonstrate how these changes play out for people in the real world over time, and as algorithms change. At the system level, agent-based models connect individual-level processes with system-level effects of algorithmic curation, and large-scale data collection looks for signs of those effects on real systems.
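As a loose illustration of the agent-based modeling approach described above, the following sketch shows how a simple popularity-ranking curation rule can produce system-level concentration of attention even when every agent and every post starts out identical. All parameters and the ranking rule are hypothetical choices for illustration, not the project's actual model:

```python
import random

def simulate_curation(n_posts=20, n_agents=100, n_rounds=50, top_k=5, seed=42):
    """Toy agent-based model (illustrative only): each round, the 'algorithm'
    ranks posts by accumulated likes and shows every agent only the top_k;
    each agent views one curated post and likes it with a fixed probability,
    feeding back into the next round's ranking."""
    rng = random.Random(seed)
    likes = [0] * n_posts
    for _ in range(n_rounds):
        # algorithmic curation: rank all posts by current like count
        ranked = sorted(range(n_posts), key=lambda p: -likes[p])
        feed = ranked[:top_k]
        for _ in range(n_agents):
            post = rng.choice(feed)   # agents see only the curated feed
            if rng.random() < 0.1:    # identical like propensity for all agents
                likes[post] += 1
    return likes

likes = simulate_curation()
# System-level effect: all likes end up on the handful of posts the
# algorithm ever surfaced, even though no post or agent differs from
# any other -- a feedback loop between curation and user behavior.
top_share = max(likes) / sum(likes)
```

Because posts that are never shown can never be liked, the curated set locks in early; this is the kind of individual-to-system-level connection that agent-based models make visible.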

This project advances the current understanding of forces that shape information access and use in an increasingly connected and automated environment. Results will be used to provide guidance to system designers who create and manipulate algorithms, in the form of design patterns that will support a systematic, generalizable way of planning for effects of algorithmic curation at different scales. The project Web site (http://bitlab.cas.msu.edu/curation) provides more information about the project. Undergraduate and graduate students involved in the project will become better problem solvers and learn to work effectively on collaborative interdisciplinary projects.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Note:  When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).


Emilee Rader "Examining User Surprise as a Symptom of Algorithmic Filtering" International Journal of Human Computer Studies, v.98, 2017, p.72, doi:10.1016/j.ijhcs.2016.10.005

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

To help people cope with information overload, online systems use "algorithmic curation", or automated selection and prioritization of the content that is displayed to users. Internet users who read online news, visit social media sites, or use a search engine encounter algorithmic curation, probably without even realizing it. This project investigated the relationship between social behavior and algorithmic curation, to identify its effects on what people read, contribute, and communicate about online.

We discovered that users of the Facebook News Feed, a system that uses algorithmic curation, form their own theories about how it works just by interacting with it. However, their theories do not always involve a curation algorithm. User beliefs about why they see the posts they do ranged widely, from thinking that the News Feed shows all possible posts from their friends to believing that an algorithm selects posts automatically. Some people believed that the system could infer their preferences from which posts they read and whose pages they visited. Others noticed the News Feed showing them posts "out of order", or showing many posts from certain friends and few posts from others. People disliked missing posts from their friends, and believed that when they did, it was evidence of system intervention.

When we made people aware that they had missed posts, by asking them to visit the pages of specific friends and report whether they saw posts they had not seen before, most reported that they had missed posts from at least one friend. However, how close they felt to specific people had no bearing on whether they noticed missed posts from someone, indicating that the system may not base its content prioritization on an accurate model of relationship closeness. Because Facebook posts present opportunities for feedback important for social support and maintaining social ties, any bias or inaccuracy in the way the algorithm promotes content could affect users' ability to maintain relationships on Facebook. Also, missed posts from close friends were more surprising, even when participants believed that the actions of the system caused the missed posts. Stronger beliefs that missed posts were due to system intervention were associated with more surprise, showing that users expect the algorithm to do a good job of showing them the posts they expect to see.

We learned that the rank at which content is displayed in the News Feed interacts with how far down users scroll to determine which stories they see, making it very difficult to measure these two influences separately. Measuring an algorithm's effects as if it were a filter or gatekeeper that acts before users make choices about what content to consume may lead to mis-estimating its impact on the overall diversity of the information users see. Our results also showed that the algorithm's control over what users see is greatest at the top of the News Feed (the highest ranks), because those are the positions users are most likely to view and that capture their attention. Our model, which quantifies the interdependence between ranks and rank-related user behavior, identifies a source of biased exposure to diverse information that previous literature has not addressed.
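The interdependence between rank and scrolling can be sketched with a toy exposure model (our own illustrative construction, not the project's actual model): the probability that a user sees a post combines where the algorithm places it with a geometric scroll-depth curve, so a rank-only "filter" view overstates exposure for low-ranked content:

```python
def exposure_probability(rank_probs, continue_prob=0.8):
    """P(user sees the post), assuming the algorithm places it at rank r
    with probability rank_probs[r] and the user scrolls past each feed
    position independently with probability continue_prob, so the chance
    of reaching rank r is continue_prob ** r (geometric scroll depth)."""
    return sum(p * continue_prob ** r for r, p in enumerate(rank_probs))

n = 10
always_top = [1.0] + [0.0] * (n - 1)  # post always placed at rank 0
uniform = [1.0 / n] * n               # post placed uniformly across the feed

# A rank-only view treats both placements as equally "shown" (the post is
# always somewhere in the feed), but scroll behavior more than halves the
# exposure of the uniformly placed post.
p_top = exposure_probability(always_top)   # 1.0
p_uniform = exposure_probability(uniform)  # ~0.45
```

The `continue_prob` parameter and the geometric scroll assumption are stand-ins; the point is that exposure is a joint function of ranking and scrolling, not of ranking alone.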

As more systems begin using algorithms to sort and rank information for users, it is increasingly important that users are able to understand how their access to information has been affected. We discovered that blog posts published by Facebook describing how the News Feed works focus mostly on why the algorithm works the way it does, and on the types of "signals", meaning the data the algorithm considers when it calculates the rank for a particular News Feed post. We learned that showing this information to Facebook users made them more aware that an algorithm affects what they see, and helped them make judgments about whether the system is biased. However, it was less effective at helping people understand how the system prioritizes content. People did not gain new understanding that would help them change their behavior with respect to the system, which calls into question whether transparency would be an effective strategy in systems that use algorithmic curation.

Our interdisciplinary work emphasizes the sociotechnical nature of these systems -- the people, information, and algorithms all interact -- and that this combination is difficult to measure and can have unexpected effects. Our work shows that these systems differ from recommender systems in that the feedback loop is intentionally hidden from users, yet it shapes their experience in measurable ways. This project also trained 6 graduate students and 6 undergraduate students, including both computer science students and social science students. Their training included learning how to communicate across disciplines and work together on an interdisciplinary team.


Last Modified: 11/30/2017
Modified by: Emilee J Rader
