Project Context
Problem
Complex interactions at the individual, group, algorithm, platform, and sociotechnical system levels are factors contributing to the spread and belief of misinformation and disinformation.
Rabbit hole behaviors, combined with algorithmic personalization on social media platforms, contribute to echo chambers and filter bubbles.
Solution
Through an eight-week research and design exploration, we examined a potential social media platform design solution for users who are in an information-seeking “rabbit hole.”
Audience
YouTube Platform
UW Professor
Project Type
Graduate School
Constraints
We struggled to find academic literature defining a “rabbit hole” or explaining why this behavior places online platform users at risk of entrenchment in an informational echo chamber through algorithmic exposure to misinformation and disinformation.
Time Frame
8 Weeks
My Role
Product Designer
User Researcher
Project Manager
Team
Three UW students
Project Breakdown
01
Problem Context
Misinformation and disinformation are complex threats with both human and technical roots. Factors contributing to the spread and belief of mis- and disinformation arise from complex interactions at the individual, group, algorithm, platform, and sociotechnical system levels. Such factors include user information-seeking and rabbit hole behaviors on social media, combined with algorithmic threats on social media platforms that contribute to echo chambers and filter bubbles.
Through an eight-week research and design exploration, we examined a potential social media platform design solution for users who are in an information-seeking “rabbit hole.” We struggled to find academic literature defining a “rabbit hole” or explaining why this behavior places online platform users at risk of believing or sharing misinformation and disinformation, whether through entrenchment in an informational echo chamber or through algorithmic exposure to mis- and disinformation.
*Disclaimer: Due to the prevalence of misinformation in our society, this portfolio project is longer than usual in order to provide additional context.*
02
Research Process
Established research on echo chambers, filter bubbles, social media fatigue, nudges, and inoculation theory grounded our approach to the YouTube app’s intervention design. Our human-centered design inquiry asked, “How might we disengage a user from content with which they are currently engrossed?”
To address our design question, we followed the Human-Centered Design Process, which has four key activity phases: 1) identify the user(s) and their specific context(s), 2) identify the user(s)’ needs and requirements, 3) design solutions, and 4) evaluate those solutions against the identified needs and requirements (Zimmerman & Grötzbach, 2007). Given the limited timeline for this project and additional challenges related to COVID-19, we engaged all four phases, but in a restricted manner so that we could complete each.
To our knowledge and surprise, there is little academic literature on the “rabbit hole” phenomenon, although colloquial definitions are in line with ours. For this research, we define the phenomenon of time-consuming, tangential information-seeking experiences from which it is difficult to disengage oneself as “information rabbit holes.”
Users
We chose to design for a user group with similar demographics to ourselves: college-educated, U.S. residents aged 23 – 45 with active social media accounts, regular YouTube usage, and reliable access to personal, internet-enabled devices. Due to the pandemic, we used convenience sampling with friends, family, and classmates who met the criteria above. We conducted virtual semi-structured interviews with nine participants in November 2020.
Stakeholders
Additional stakeholders we considered were YouTube platform operators, content creators, and advertisers. As we had no means of meaningfully interviewing these stakeholders, we assumed the platform’s principal need: to keep users positively engaged with the site so that it can better monetize them via advertising or premium memberships. Content creators want users to watch, like, and subscribe to their videos. Advertisers want to make sure their ads do not run against questionable content and that they engage viewers who will not simply gloss over them. In considering these stakeholders’ needs, we also had to address, to some extent, the potentially conflicting design question, “How might we nudge a user to disengage from the topic they are currently researching without causing them to leave the platform?”
Figure 2: Oxford Dictionary defines a Rabbit Hole as "used to refer to a bizarre, confusing, or nonsensical situation or environment, typically one from which it is difficult to extricate oneself."
03
Synthesis
Our team collaborated virtually to conduct affinity diagramming using the digital whiteboarding software Miro. We identified common themes across participants and significant points of difference, ultimately constructing two rough personas and a user journey map to guide our design decisions. We used the user journey map to visually illustrate the user flow, starting with the initial contact through the nudge and continuing throughout the interaction process.
Using information gathered through user interviews, we identified the start and stop points for our design intervention, possible contexts, and potential paths a user might take. We used the user journey map to determine which prototype screens to build and to estimate when the nudge intervention should occur.
Interview Findings
Our initial interviews found some similarities among participants. The participants were relatively heavy social media users; most engaged with various social platforms five or more times per day, primarily on their mobile phones. Furthermore, we found that extended media consumption sessions were often a product of either general boredom or curiosity about a specific topic of interest. The factors most likely to lead users to disengage from media content were external to the app or platform: time and personal obligations, such as household duties, were more likely to drive users off the app than the content itself.
While the initial interviews found some similarities across interview subjects, we also identified three notable differences. The first was awareness (or lack thereof) of the sociotechnical dangers innate to social media platforms themselves, including awareness of intentionally deceptive content, of their own potential social media addiction, and of algorithm-supported echo chambers or filter bubbles. The second was whether a user had strong preferences for specific platforms or formats for certain content (e.g., written news articles vs. video news). The third was whether the user felt a particular responsibility to consume or share certain content types merely by having a social media account connected to others.
04
Design
We elected to design only for our rabbit hole-unaware persona, on YouTube’s mobile app interface, using visual and textual cues and standard tap or swipe interactions. Our design was a visual overlay that proactively and automatically appeared once a user had been watching videos from a similar content stream for 45 minutes or more consecutively.
Based on initial user research and literature review, our design attempts to use time management and a visual hint of being in a rabbit hole to nudge users out of extended periods of compulsive video watching in hopes of making them more aware of potentially dangerous behavior.
To motivate users to make a particular choice, we first focused on time-on-platform awareness as a means of disengagement, based on interview findings. Our second focus regarding nudge intervention was from a preventative health standpoint, intending to help users build good online media consumption habits across the board.
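The trigger condition described above can be illustrated with a minimal sketch. This is a hypothetical model of the logic, not YouTube internals: the event structure, topic labels, and threshold handling are all illustrative assumptions based on our design (a nudge after 45 consecutive minutes on a similar content stream).

```python
from dataclasses import dataclass

# Illustrative sketch of the nudge trigger: the overlay appears once a
# user's most recent consecutive run of same-topic watching totals 45
# minutes or more. Names and structures here are assumptions, not a
# real platform API.

TRIGGER_MINUTES = 45.0

@dataclass
class WatchEvent:
    topic: str        # coarse content-stream label for the video
    minutes: float    # minutes spent watching this video

def should_nudge(events: list[WatchEvent]) -> bool:
    """Return True if the most recent consecutive run of similar-topic
    watching meets or exceeds TRIGGER_MINUTES."""
    if not events:
        return False
    run_topic = events[-1].topic
    total = 0.0
    for event in reversed(events):
        if event.topic != run_topic:
            break  # the consecutive run of similar content ended here
        total += event.minutes
    return total >= TRIGGER_MINUTES
```

In practice, the “similar content stream” judgment would come from the platform’s own content classification; the sketch simply assumes a topic label per video.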
Design Framing
We envisioned nudges as a dynamic and continually evolving interaction with the potential to adjust to user behavior. Within specific design guidelines, self-learning nudges would allow the platform to address multiple factors contributing to information rabbit holes and radicalization more broadly, such as user-specific behavior patterns, specific content creators, and problematic content (existing, emerging, and old).
We initially prioritized the notion of user “freedom of choice” (FOC), which stresses a user’s ability to choose and embodies the classic liberal view of democracy (Bozdag, 2015). Bozdag describes this as freedom for the user to walk away from “entrapment” by an algorithm, giving the user the choice or agency to decide the best course of action once a nudge has been triggered.
We chose to extend the nudge’s visual design beyond text based on research supporting the efficacy of corrections and interventions that are both visual and textual (Birdsell & Groarke, 1996). We initially used design elements to emphasize time, drawing on our interview findings and on literature pointing toward self-learning and content-aware nudges (von der Weth et al., 2019).
Design Decisions
Given our assignment to design specifically for an online media platform, class timing, and our limited ability to explore haptic feedback due to COVID-19 safety precautions requiring us to work remotely, we chose a narrow design scope.
We explored variations of the interaction so that it matched YouTube’s visual style. YouTube’s current graphic design schema relies entirely on rectangles and hard angles, so we chose to create visual contrast by making the nudge curved. At the time of this project, YouTube used a pale blue in banners and other in-app content notifications about COVID, U.S. presidential election information, and vaccines, so we opted to mimic this palette.
Wordsmithing
Our nudge intentionally has a mild tone; we purposely wanted to avoid agitating users by evoking the rabbit hole concept by name. Surprisingly, there is a notable gap in the literature regarding the term: a significant portion of the literature references the phrase without defining it. Out of caution, we avoided explicitly using the words “rabbit hole,” as the term may be too emotionally loaded, which could in turn affect users’ willingness to engage with the nudge (Nabi, 2003).
We considered different variations of language tone to maintain neutrality. In one draft, we asked, “It looks like you have been watching similar videos for X minutes today/this week. Would you like to try something new?” and gave the user options to continue or “try something new.” Another option was “You’ve spent X minutes watching Y videos, would you like to take a break?” with the user able to respond with A) “Keep going,” B) “Take a break,” or C) “Remind me later.” Our design went in the latter direction because its language tied to our interview finding that users value their time.
Prototype Design
The nudge interaction itself requires a tap; based on user interview feedback, a swipe was too easy to plow through. Tapping requires the user to slow down and be slightly more deliberate. We considered where people rest their thumbs when watching videos and positioned the nudge so that they would have to move their thumbs, ensuring the tap would not be too effortless. Given our limited time and virtual meeting requirements, we did not test other types of interactions.
We wanted to stake out an initial starting point for making our design content-aware and user-dependent; however, we did not flesh out all of the content categories or behavior categories that would trigger the nudge due to time constraints. We recognize there will be many gray areas. Satirical content, for example, will be challenging to identify and categorize accordingly. Additionally, the question of appropriate nudge timing could be a separate study in itself.
In the end, various factors narrowed and influenced the nudge intervention’s design, including current YouTube features and graphic elements, content language, interaction types, user options, and YouTube ad policies. Our final design included a 15-second unskippable advertisement to increase immersion during remote user testing and to reflect YouTube’s ad policy. We introduced the 15-second advertisement countdown to mimic, in a controlled environment, the emotional state of boredom before the video would play and the nudge intervention occurred. Boredom was an emotional state many of our participants identified with when continuously watching videos on their own. The ad required the participant to sit through the countdown timer before their video continued playing, which triggered the nudge intervention.
05
Deliverable
At the start of our design exploration, cursory internet searches did not unearth any similar in-platform features. However, in the very late stages of the design process, we discovered a similar tool was launched in 2018 as part of Google’s Digital Wellbeing campaign. The Take A Break (“TAB”) feature nudges users about their time on the platform, functioning as an alarm.
Rather than distancing ourselves entirely from the TAB tool, we chose to iterate on our design and incorporate some of its features and playful visual elements. Hence, we proposed a nudge intervention on YouTube that is notably different from the existing TAB feature YouTube offers. Our proposed nudge intervention approaches digital well-being as a function of both time-on-platform and good information hygiene and education, while YouTube’s existing tool focuses primarily on time. I used the design software Figma to create an interactive prototype for the second round of interviews.
von der Weth's Nudge Design Guidelines
There are limited empirically tested approaches to designing nudge solutions to combat mis- and disinformation. We chose to rely on six guidelines specifically proposed to combat algorithmic threats on social media with effective nudges developed by von der Weth et al. (2019).
Their work was the most comprehensive summary of nudge design and efficacy we found in the literature.
- Content-aware – Nudges should only be displayed when necessary and should be tailored appropriately to content.
- User-dependent – Nudges should trigger based on individual user behavior.
- Self-learning – As a consequence of the first two guidelines, a nudging engine should adapt to users’ behaviors over time.
- Proactive – Nudges should be triggered before potential harm rather than after.
- In-situ – Nudges should be integrated into users’ day-to-day social media usage, since research has shown interventions are most effective when given at the right time and place.
- Transparent – Users must comprehend why a nudge was triggered in order to adjust their social media behavior in the future. This is critical to establishing trust, unlike the hidden algorithms that currently power most online social media platforms.
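As a minimal sketch of how the user-dependent and self-learning guidelines could be operationalized, the example below adapts a per-user trigger threshold based on how each user responded to past nudges. Everything here (class names, thresholds, update rules) is an illustrative assumption, not part of von der Weth et al.'s proposal or any real platform.

```python
# Hypothetical self-learning nudge engine: each user's trigger threshold
# (minutes of similar-content watching before a nudge) adapts to how
# they responded to previous nudges. All values are illustrative.

DEFAULT_THRESHOLD_MIN = 45.0

class NudgeEngine:
    def __init__(self):
        self.thresholds = {}  # per-user trigger threshold, in minutes

    def threshold(self, user_id: str) -> float:
        """Current trigger threshold for this user (user-dependent)."""
        return self.thresholds.get(user_id, DEFAULT_THRESHOLD_MIN)

    def record_response(self, user_id: str, took_break: bool) -> None:
        """Self-learning update: if the user took the suggested break,
        nudge slightly earlier next time; if they dismissed it, back
        off to avoid nudge fatigue. Bounds keep the threshold sane."""
        current = self.threshold(user_id)
        if took_break:
            new = max(15.0, current * 0.9)   # receptive user: nudge sooner
        else:
            new = min(120.0, current * 1.2)  # dismissive user: back off
        self.thresholds[user_id] = new
```

A transparent implementation would also surface the current threshold and the reason for each nudge to the user, per the final guideline.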
Comparison
When we compare YouTube’s existing TAB tool and our proposed nudge solution, the YouTube tool meets only two of von der Weth et al.’s guidelines, while ours would meet all six. One flaw of the TAB tool is the exclusion of information-overload risks from its digital wellbeing definition. The tool is also opt-in only, making it ineffective when users are unaware of their consumption habits. Furthermore, the TAB tool does not expose users to new information streams and therefore does not meet the needs of an unaware population, our target audience.
06
Outcome
Our initial human-centered design inquiry asked, “How might we disengage a user from content with which they are currently engrossed?” which we later modified to be “How might we empower users to make decisions that preserve their digital and social well-being when they are engaged in information rabbit hole behaviors online?”
Future researchers, designers, and developers furthering this topic can find significant opportunities within our research findings, which indicate that nudges should warn more directly and educate users about rabbit hole content. Through the development and initial testing of a prototype, we found that self-awareness cultivated through nudged behavior may be an effective method for promoting disengagement within this context.
While awareness of time-on-platform can be a sufficient motivator for users to change behavior, subtle hints at informational risks were not effective. To allow users to make informed decisions about their information hygiene, more direct warnings and education about a user’s proximity to algorithmic threats are needed.
Recommendations
Our design project was initially scoped and planned to go through all of the phases of the human-centered design process in a short amount of time. As such, more in-depth qualitative research across a broader demographic is warranted to identify users’ corresponding needs, motivations, and platform usage contexts.
Our recommendations for future work begin with more thorough generative research on these topics:
- Rabbit Holes:
- We were surprised by the lack of literature about the phenomenon of social media rabbit holes. Our research used dictionary definitions and second-round user interviews to indicate some standard features: an endless aspect, loss of control (how much is unclear), a need to keep going, and a tangentiality or unexpectedness in user experience. Further work is needed to define the term, examine its theoretical connections, and understand user attitudes toward it.
- Engaging with Content at the Periphery of Filter Bubbles:
- We recommend more generative research on why and how users might be diverted toward the outer edges of their social media filter bubbles. We intended to focus our design inquiry on helping people disengage from content once within a rabbit hole, not explicitly on how to get them to move on to other content. However, this latter design question had unavoidable overlap with ours when trying to satisfy YouTube’s, content creators’, and advertisers’ need to keep users engaged on the platform.
Future Considerations
Depending on the outcomes of broader and more in-depth generative research, future design directions may change considerably. If they do not, future design considerations should incorporate six main adjustments:
- The nudge and YouTube’s Digital Wellbeing landing page should be modified to explicitly call out and educate users about the phenomenon of social media rabbit holes and the information dangers they present.
- The design should be adapted to consider both personas’ needs, rather than just one, considering how those needs might change over time. Mis- and disinformation will be an ongoing threat, so designers must design with consistent, long-term feature adoption in mind. User needs will likely change as they become familiar with or tired of the feature, and users may also have varying requirements depending on usage context or specific content.
- A more comprehensive range of interactions should be considered beyond our initial prototype design, such as complete integration within the YouTube video rectangle (versus an overlay) or a full-screen overlay.
- Designers should consider other sensory inputs, such as sound. Other research suggests haptics may be particularly effective (Okeke et al., 2018).
- The “try something else” means of nudging a user out of the rabbit hole should be re-designed given how many users were confused by or skeptical of this option.
- To better adhere to the content-aware, user-dependent, and self-learning guidelines proposed by von der Weth et al. (2019), the design’s content and context triggers must be considered and mapped out more precisely.
In Retrospect...
I appreciated being grounded by the misinformation literature during the design process, which heavily scoped and influenced our design elements and visuals. Given more time, I would have liked to explore ways to improve the nudge’s design through text hierarchy, color composition, and visual cues. Additionally, I value the positive collaborative remote-team environment that we created during a stressful time in the world.