
One (Adult) Size Does Not Fit All: The Importance of Development in Digital Design and Utilization

Published on Jul 17, 2024

The growing concern about the appropriateness of online spaces for children has sparked efforts to connect features of developmental ages with digital features (Kidron and Rudkin 2017). Children regularly access digital spaces, but unlike the physical spaces children inhabit (e.g., playgrounds, schools, cars), there are limited regulations or designs focused on children’s safety. Instead, efforts to protect and support youth online rely on youth themselves and the important adults in their lives to navigate platforms, sites, and ages of use. The most popular online spaces largely prohibit use until age 13, and then treat youth no differently than adult users.

We challenge the one-(adult)-size-fits-all-users approach and, by focusing on the second half of childhood, describe how common design techniques often undermine youths’ developing cognitive and socioemotional skills, for those over and under 13 years. We argue that digital spaces should be designed for wellbeing and that default features should scaffold and support development. Rather than place all the burden on individuals (e.g., children, families, teachers) to navigate use, we draw attention to the mismatch between design and development and indicate ways technology companies could shift their practices to support youth wellbeing.

We believe that digital spaces, like physical spaces, can be safe and supportive of exploration, learning, and play when children’s needs are centered in the design process. Parks, for example, are designed to provide many ways for people of different ages and abilities to interact, play, or be alone. Though some features may seem prescriptive, they are not always used in the way intended. For instance, many people will sit on swings and move back and forth. But some will use the seat as a doll bed and others as a platform to bump into friends on neighboring swings. Parks are designed to be safe for all ages while also offering differing ways to socialize, explore, play, and relax. Though adults may help children learn to use these spaces, their design is inclusive of children’s needs and diverse uses. Digital spaces should have these same key characteristics – of purposeful design for safety, play, creativity, exploration, and connection.

Children’s Development and Media: Policy and the Attention Economy

In the U.S., the Children’s Online Privacy Protection Act (COPPA) is the only federal child protection regulation, limiting data collected about users under 13 years. Several states have proposed or passed new legislation (e.g., California’s Age Appropriate Design Code Act), but the requirements of such legislation and how it will be enforced are unclear. Due to COPPA, popular online platforms have established age-gates, requiring youth to be 13 years or older to use them. However, major developmental differences exist from 13 to 18 and from adolescence to adulthood. Further, many children under 13 still engage with these sites. With this in mind, we focus on how development intersects with design and use, providing suggestions for how to tweak digital design to better support youth’s wellbeing. Rather than focus on youth’s healthful or risky uses, we argue for a park-like model of design and access that ensures diverse users can explore, play, connect, and create.

Designed for Dollars

Currently, the most popular digital spaces for youth make their money from the time and attention users spend online. Platforms are available for free because they employ advertising-driven business models, with profits coming from selling advertisements and users’ data. Attention drives the business. Companies collect more data when users spend more time on their platform, and those data are used to target advertising and provide content that will keep users online longer.

Companies prohibited from collecting data about children under 13 years often bypass this rule by not directly asking about age. However, sites clearly recognize users as children because they tailor ads to younger audiences (Radesky et al. 2020). Rather than using these same mechanisms to tailor more age-appropriate online experiences, many platforms capitalize on children’s developmental needs, opportunities available to them online, and lack of strong government regulation.

In this chapter, we focus on youth 10-18 years. We start by highlighting the key developmental capacities and challenges of these ages and then move to design features that connect to developmental processes. In each section, we propose design tweaks to promote wellbeing and help empower youth in their digital use. As we draw attention to the connection between developmental capacity and platform design, the bulk of our recommendations are at the platform—rather than the user—level, offering guidance to designers and insights for educators, youth, and caregivers.

Developmental Timing

The end of middle childhood through adolescence (10-18 years) is a critical period with tremendous growth in physical (including puberty), cognitive, and social-emotional skills, as well as increased media use (NASEM 2019). This period includes extensive brain development (e.g., better hemisphere coordination), with additional neural pathways being formed from the diverse experiences youth have in new environments (e.g., educational settings) and social interactions (e.g., peers, romantic partners) (Mah and Ford-Jones 2012). This greater functionality is coupled with new social responsibilities (e.g., chores, babysitting) and increasing independence. It is also when children typically get their first cellphone and engage more with media unsupervised (Pew 2020).

As children move from childhood through adolescence, peer relationships take on increasing importance, and digital technologies provide opportunities for different forms of social connection (Allen et al. 2014; Ehrenreich et al. 2020). This age range involves more frequent peer interactions, investment in peer feedback, and reliance on peers for social and emotional support (Brechwald and Prinstein 2011). Digital media connects well to these needs, offering ways to stay in touch, make more friends, experience validation from peers, and facilitate self-disclosure (Allen et al. 2014; Yau and Reich 2019).

With changes in cognition and sense of self, older children begin to consider themselves in relation to others. As part of this, they have a tendency to assume everyone is watching or thinking about what they are doing, creating a sense of an imaginary audience concerned with youth’s appearance and behavior (Elkind and Bowen 1979). Digital media can intensify the self-consciousness characteristic of imagined audiences (Yau and Reich 2019), as youth are unsure of who views their posts/profiles. Further, as youth get better at understanding other people’s thinking and perspectives (Hall et al. 2021), they can better conceive of themselves in relation to others (Van der Graaff et al. 2014). As such, social comparisons increase, including concerns that others are watching and judging (Buunk and Gibbons 2007). Thus, identity development (e.g., exploration, presentation) is a key undertaking at these ages (NASEM 2019) and a highly prevalent component of online activities, from avatar personalization to image sharing (Granic, Morita and Scholten 2020; Odgers, Schueller and Ito 2020). It is also why youth are more vulnerable than younger children or adults when making social comparisons online to other users, advertisements, and influencers (Jones 2001; Kleemans et al. 2018).

The transition through adolescence also involves increased risk-taking. Neuroscientists attribute this to dual developing brain systems. The socioemotional system is sensitive to rewards, increasing motivation to pursue exciting, rewarding, and new experiences. The cognitive control system is responsible for stopping impulsive behavior. These systems develop at different rates, with the socioemotional system activating earlier than the cognitive control system (Shulman et al. 2016; Steinberg 2010). Thus, youth are increasingly interested in risky activities that feel rewarding and are unlikely to stop doing them because of their lagging cognitive control (Shulman et al. 2016). Some developmental psychologists liken this dual-systems model to driving a car that is all gas and weak brakes.

Late middle childhood into adolescence involves the continued development of executive functioning (EF), cognitive processes that help youth regulate their time, attention, and impulses (Friedman and Miyake 2017). EF develops primarily in the prefrontal cortex and undergoes significant change throughout development, not reaching full maturity until around age 25. Studies find that many digital activities in late childhood and early adolescence (e.g., video-game playing, texting) rely on EF skills, particularly controlling impulses and keeping information readily accessible in working memory (Ricker and Richert 2021; Rideout et al. 2022).

As children progress through adolescence, they become increasingly better at recognizing their and others’ emotions and regulating the intensity of their feelings (Holodynski and Friedlmeier 2005). However, these emotion regulation skills are influenced by changing hormones through puberty (Dahl and Gunnar 2009; Silk et al. 2009). Thus, youth may display inconsistent emotion regulation as hormones fluctuate and the cognitive skills for managing complex, multifaceted feelings are still developing. This is why adolescence is a critical period for the development of emotion regulation skills (Silvers 2022). These changing emotion recognition, regulation, and processing skills likely contribute to how youth respond to content, interactions, and experiences online, as well as how they select what media to use and when (Hoffner and Lee 2015; Yau and Reich 2019).

Youth make tremendous gains in cognitive, physical, and social-emotional skills between ages 10-18. But development is uneven, with some changes more rapid and others more gradual. As such, the skills of a 10-year-old are quite different from those of a 15-year-old, even when using the same device or app. Similarly, a 13-year-old, who is age-eligible to use social media, will be affected differently by that use than an 18-year-old, who has more developmental capacity for using social media. With this in mind, we highlight some of the most commonplace online activities and indicate how they connect to youth’s developmental capacities.

Digital Activities of 10-18-year-olds

Youth’s daily use of media is increasing (Ofcom 2022; Rideout et al. 2022), with TikTok, Snapchat, Instagram, YouTube, and Netflix being the most popular platforms (Ofcom 2022; Pew 2020). Even with age restrictions, social media use has increased for children younger than 13 years and continues to grow for adolescents (Ofcom 2022; Rideout et al. 2022).

Young people use digital platforms in a variety of beneficial ways. The recent United Nations Convention on the Rights of the Child (General Comment No. 25, 2021) highlighted how access to the internet is increasingly considered part of children’s rights, enumerating several key purposes of technology including access to information, freedom of expression, freedom to form communities and identities, and access to education and play in digital spaces. Popular digital platforms have unique affordances and youth use them to meet developmental needs for social connection, social skill building, autonomy, and identity exploration (Stockdale and Coyne 2020; Throuvala, Griffiths and Kuss 2019). However, both the design of platforms and the age of the users have implications for how and when young people use technologies (Ofcom 2022).

Children differ from adults, and specific design features may interact with youth’s developmental susceptibilities to make some behaviors more likely than others. For example, a young person may go on YouTube to learn how to make something, planning to watch only one video. However, the pushed content from autoplay, coupled with immature EF skills, makes it more difficult for them than for an adult to disengage. Because of their still-developing cognitive skills, even youth older than 13 years are more susceptible to such design features.

Digital Design and Development

Digital platforms collect information about users for a variety of reasons, but the main goal is to increase users’ time and engagement online. When features manipulate users into doing something that benefits the business rather than the user, they are referred to as dark patterns in user experience design (Gray et al. 2018). Examples include push notifications, banner notifications, autoplay, infinite scroll, and badges that alert users to new information, messages, likes, comments, follow requests, and more (5Rights Foundation 2021). Such uses of datafication, Artificial Intelligence (AI), and engagement tactics interact differentially with children’s developmental needs and capacities.

Some design features are used in service of youth’s needs, introducing opportunities for connection and creativity. For instance, in the YouTube how-to search example above, the child might be recommended videos that promote their learning or help identify other users with aligned interests. Other design features may introduce risk. We believe that youth can reap benefits from, disrupt, and co-opt digital spaces while also being subject to the extractive business model underlying technology platforms. Though not an exhaustive list, we highlight design features of popular platforms, how they connect to developmental capacities, and how exploitative features can be reimagined and rebuilt as something better. Given the expansiveness of online spaces and the limits of this chapter, we focus on social media and streaming, omitting other spaces (e.g., gaming, blogging, and texting).

Feeds, Nudges, and Pushes: Attention Economy and Development

With notable exceptions (e.g., Vero, Mastodon), digital spaces, especially social media and streaming platforms, are owned and operated by for-profit companies that make money from advertising. As such, there is a financial incentive for companies to want users to share information, stay on platforms longer, and produce content that will help keep others online longer too. This business goal tends to prioritize profit over protection, treating all users as impersonal revenue sources (Mascheroni and Holloway 2019), and is often in conflict with the wellbeing of children.

Companies use various tactics to keep users online, and young people are vulnerable to such design features for several reasons. First, youth are highly motivated by social connections and peer acceptance. This makes it difficult to refrain from checking to see who just posted, what message they received, etc. Second, they are actively shaping their identity and testing how others respond to their presentations of self. They want validation from peers, resulting in frequent checking for comments, likes, and reposts. Third, social comparisons and needs for affiliation can result in feelings of missing out if not checking others’ posts often or responding quickly. Fourth, still-developing EF makes it difficult for youth to control impulses such as clicking on notifications or stopping autoplay or infinite scroll. These developmental susceptibilities interact with design features, making regulating use difficult. Behavioral psychology is used to manipulate young users into returning to apps they had not intended to open, or into persisting in use beyond the intended time or purpose (e.g., wanting to snap a friend or find a video).

We see some of these developmental vulnerabilities in practice when children and teens struggle to accurately estimate their digital time use (Wade et al. 2021) and report difficulties pulling themselves away from devices, due to the constant influx of likes, alerts, notifications, and autoplay features (James, Weinstein and Mendoza 2021). Youth who spend excessive amounts of time with screen media are more likely to report poor psychological and social functioning (Przybylski, Orben and Weinstein 2020), though the nature of media experiences might matter more than overall time. Youth report wanting to feel more in control of their digital experience and wanting help to spend more time offline (Kidron and Rudkin 2017). Instead, various developmental immaturities (e.g., immature self-regulation, high importance of peer acceptance) are often exploited to capture and sustain attention.

Developmental differences and digital design

Specific design features may be more detrimental to younger than older users. Many platforms allow friends/followers to like posts and tally the number of likes received. Research consistently finds these features hold strong and changing importance for young (12-18 years) users (e.g., tweens are more concerned with low acceptance/likes than teens; Yau and Reich 2019), with many viewing them as evidence (or lack thereof) of support from friends, popularity, and identity validation (Jong and Drummond 2016; Timeo, Riva and Paladino 2020). Studies also find that receiving few likes can be emotionally distressing to young users, especially subgroups with experiences of victimization or mental health problems (Lee et al. 2020; Radovic et al. 2017). Further, youth view likes as validation by friends, which, over time, is associated with higher dependence on likes for validation (Meeus, Beullens and Eggermont 2019). Receiving likes activates regions of the brain associated with reward processing, imitation, attention, and cognition (Sherman et al. 2016). Thus, older children and adolescents, who are highly motivated by peers, still forming their sense of self, and neurologically immature, interpret visual digital peer feedback differently than adults and younger children. Though numerous likes and comments can promote positive feelings (Sherman et al. 2016; Yau and Reich 2019), low numbers and negative feedback can be quite detrimental (Crone and Konijn 2018; Lee et al. 2020).

Youth risk-taking increases in social contexts with friends, when there are heightened emotions or an immediate reward (Shulman et al. 2016). These risk-taking conditions magnify in digital spaces with likes, comments, DMs, and when emotionally-charged content is prioritized in content feeds. When social media platforms are not designed for youth, but connect strongly to their developmental needs (e.g., social connections, identity presentation) and age-specific vulnerabilities (e.g., risk-taking behaviors, low inhibitory control), these spaces can be detrimental to wellbeing. Our attention to dark patterns of design is not to minimize the importance of connection, creation, and expression social media can provide for young people or trivialize their creativity and agency in these spaces, but to acknowledge the ways in which these platforms, as a type of digital playground, should consider younger users too and design with their wellbeing in mind.

Design Tweaks

  • Limited or No Notifications:

    • No or minimal push notifications should be the default, requiring youth to actively choose to open an app rather than quickly clicking on notifications that launch it.

    • Time-use notifications could scaffold self-regulation and encourage more mindful engagement with platforms (e.g., “You’ve watched 10 suggested videos. Want to search for something specific instead?”); see the sketch after this list.

  • Turn off Auto-play, Infinite Scroll and Non-Age-Specific Suggested Content:

    • Youth should be able to select suggested content from static thumbnails or easily remove auto-play features completely. Content should not simply begin playing when the user is idle or scrolling.

    • Disabling notifications, pushes, autoplay, and infinite scroll completely should be easier.

  • End Tallies of Likes and Comments:

    • Public presentation of peer validation via tallies should be turned off and only viewable to the user, if wanted. Though some sites have the option to hide like counts, it is often hard to do (e.g., requiring clicking through numerous menus) and youth may not think of it or want to when their friends have not.
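
To ground these tweaks, here is a minimal TypeScript sketch of developmentally-aware session defaults and the time-use nudge quoted above. All names here (SessionConfig, YOUTH_DEFAULTS, maybeNudge, the nudge threshold) are our hypothetical illustrations, not any platform’s actual API.

```typescript
// Hypothetical sketch: notification, autoplay, scroll, and tally defaults
// for a youth account, all off unless deliberately changed.
interface SessionConfig {
  pushNotifications: boolean; // off: the user chooses to open the app
  autoplay: boolean;          // off: content never starts playing on its own
  infiniteScroll: boolean;    // off: feeds end and new pages must be requested
  publicLikeTallies: boolean; // off: counts visible only to the account owner
}

const YOUTH_DEFAULTS: SessionConfig = {
  pushNotifications: false,
  autoplay: false,
  infiniteScroll: false,
  publicLikeTallies: false,
};

// Nudge toward deliberate use after a run of suggested (not searched-for) videos.
function maybeNudge(suggestedVideosWatched: number): string | null {
  const NUDGE_THRESHOLD = 10; // an assumed cutoff; tuning would need study
  return suggestedVideosWatched >= NUDGE_THRESHOLD
    ? "You've watched 10 suggested videos. Want to search for something specific instead?"
    : null;
}
```

The design choice worth noting is that every attention-capturing feature defaults to off, so persistence of use reflects a decision the young user actually made rather than one made for them.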

Supports for Youth

Instruction in critical thinking about persuasive design benefits youth, caregivers, and educators, and should be offered in educational settings and through youth-led interventions. These discussions should go beyond “screen time.”

Privacy and Age Verification

Privacy features, and the lack thereof, intersect with developmentally-appropriate youth desires to take more risks, meet new people, try out ways to present one’s self, and disclose personal information. Risk-taking is an essential part of development, as youth are learning to be more independent and exploring their capacities and identities (Steinberg 2010). Unfortunately, youth’s social and risk-taking needs can be exploited by current digital privacy practices. For young users, social media broadly lacks significant, user-friendly, and desirable privacy protections, which introduces both short-term (e.g., grooming, harassment) and long-term (e.g., profiling, discrimination) risks (Livingstone, Stoilova and Nandagiri 2018).

Ten-to-18-year-olds’ investment in their peer relationships and social connections can help in finding affinity groups and making friends – but could also lead to friending predators or sharing sensitive information about oneself online (Notten and Nikken 2016). With young users treated no differently than adults, privacy is an issue when anyone can find youth online or initiate and maintain contact through direct messaging or comments. The ability to connect with others is important, but better parameters around privacy and access are needed, instead of relying on youth and their caregivers to manage privacy settings.

Privacy on platforms

The default setting for many platforms popular among 10-18-year-olds, including TikTok, Instagram, and Snapchat, is for accounts to be public, with the ability to opt out and make them private. Recently, and largely in response to youth’s workarounds (e.g., multiple accounts for different connections), platforms have included more features allowing users to decide who sees which posts. Although youth can choose to allow only close friends to see certain content, there are enticements to making posts public to larger audiences, such as desires for more followers, likes, comments, and streaks. Such feedback is particularly attractive given 10-18-year-olds’ previously mentioned highly sensitive reward system and desire for peer connection (5Rights Foundation 2021).

In response to concerns, some platforms have created age-appropriate settings for younger users. TikTok recently announced stronger default privacy settings for users 13-17 years and maturity scores for videos to reduce exposure to inappropriate content. Discord, Instagram, and Snapchat created Safety Centers and/or Parent Hubs where users and families can learn more about privacy options. However, the burden remains on users and their families to protect privacy online. Further, privacy options focus more on person-to-person privacy management (e.g., show to close friends) than on person-to-corporation privacy controls (e.g., do not personalize ads), which are often difficult to locate.

Age verification

Because of COPPA, platforms use an age-gate of 13 years or older, requiring users simply to click an age-verification box or enter a birthdate. However, children are often motivated to lie about their age, especially when most of their peers are already using the platform.
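
To make concrete how thin this check is, here is a minimal sketch (our illustration, not any platform’s real code) of the logic behind a typical birthdate age-gate; nothing verifies the claimed date.

```typescript
// A typical age-gate: the platform checks only arithmetic on whatever
// birthdate the user types in, so entering an earlier year is all it takes.
function passesAgeGate(claimedBirthdate: Date, minimumAge: number = 13): boolean {
  const now = new Date();
  let age = now.getFullYear() - claimedBirthdate.getFullYear();
  const birthdayPassed =
    now.getMonth() > claimedBirthdate.getMonth() ||
    (now.getMonth() === claimedBirthdate.getMonth() &&
      now.getDate() >= claimedBirthdate.getDate());
  if (!birthdayPassed) age -= 1;
  return age >= minimumAge; // entirely dependent on the user's honesty
}
```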

In theory, the 13-year-old age-gate exists to protect children from negative experiences and increase privacy online. In practice, age-gates allow platforms to treat everyone as a homogeneous group with the same (lack of) protections and policies. Further, companies have little incentive to verify users’ ages. A quick search on TikTok, Instagram, or Snapchat of “fifth grade” or “9 years” will generate many public accounts of children whose profile descriptions and pictures make clear that they are younger than 13 years.

Open privacy settings

As mentioned earlier, many platforms default to open privacy settings, making accounts and profiles public (i.e., anyone can view them or send messages). Privacy-protecting options are often difficult to find, and in-app pop-ups’ visuals and wording discourage strong privacy settings, which may help explain why most youth accounts remain public (Livingstone, Davidson and Bryce 2017). From a developmental standpoint, 10-18-year-olds want to connect with other youth and feel positive emotions from larger numbers of likes, which contributes not only to open accounts but also to requests like “follows for follows.” Open profiles and the ability for anyone to direct message (DM) introduce risks of more ad targeting and potential victimization. Especially concerning are the ways in which this default publicness can lead to the unwanted sharing of private information, such as “being outed by the machine” (Cho 2018, p. 3187).

Open privacy settings facilitate online harassment and are especially risky for girls, people of color, youth with disabilities, and sexually minoritized youth (Anti-Defamation League 2022; Eldridge et al. 2021). Sexual harassment and image-based sexual assault are more common with open privacy networks (Ringrose, Regehr and Whitehead 2021). Multiple design decisions might facilitate image-based sexual assault, including (1) gamification of and incentives for open privacy settings, (2) the ease of quickly adding unknown others, and (3) the lack of effective reporting methods. These design features leave young people to manage their own harassment online, and many do not report victimization for fear of being punished by losing their device or platform access (Ringrose et al. 2021).

Complicated privacy agreements

Privacy policies are difficult to understand, and many privacy agreements allow users to agree with one click, without seeing the actual policy. There are major developmental differences in understanding of abstract concepts (Duthie et al. 2008), and though privacy policies are difficult even for adults, the sophisticated vocabulary and legal concepts are especially challenging for children. A recent analysis of 1700 edtech/digital products found that 75% were written above the average US adult’s reading level (Lee 2018). As such, youth likely do not understand what they are agreeing to and cannot conceptualize how their data are being extracted and sold. When children do understand, they are often left with the choice to either have their data extracted or exile themselves from the places popular with peers.
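
To illustrate the readability point, here is a rough TypeScript sketch that scores text with the standard Flesch-Kincaid grade-level formula. The syllable counter is a crude heuristic, and the sample sentence is our invention written in the register of typical policies, not a quote from any real agreement.

```typescript
// Crude syllable estimate: count runs of vowels. Real analyses use
// dictionary-based counters, but this is close enough to illustrate.
function countSyllables(word: string): number {
  const runs = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, runs ? runs.length : 0);
}

// Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
function fleschKincaidGrade(text: string): number {
  const sentences = text.split(/[.!?]+/).filter((s) => s.trim().length > 0);
  const words = text.split(/\s+/).filter((w) => w.length > 0);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  return 0.39 * (words.length / sentences.length) +
         11.8 * (syllables / words.length) - 15.59;
}

// A sentence in the register of many real policies scores far above the
// middle-school reading level usually recommended for general audiences.
const sample =
  "We may disclose aggregated or de-identified information to affiliates, " +
  "service providers, and advertising partners for analytics purposes.";
console.log(fleschKincaidGrade(sample).toFixed(1)); // well above adult average
```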

Design Tweaks

  • Privacy as Default:

    • Privacy-protecting settings should be the default and easily accessible, including private accounts, restricted DM partners, and disabled location tracking, tagging, personalized ads, and contact syncing (see the sketch after this list).

  • Kid-Friendly Privacy Policies:

    • Make privacy policies and data use transparent and child-friendly. Pop-up text or videos could ensure children understand the implications of choices (e.g., “This means anyone can see your posts and send you messages. Is that what you want?”).

  • Real ages:

    • Age-gates are not effective. Actual birthdates, image recognition, and AI verification processes could more accurately identify minor users.

    • Monitoring by regulators could help identify underage users and hold companies accountable.
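
A minimal sketch of privacy-as-default plus a child-friendly confirmation follows, again with invented names (PrivacySettings, MINOR_DEFAULTS, confirmChange) rather than any real platform’s setting keys; the first confirmation message is the example quoted above.

```typescript
// Hypothetical privacy-protecting defaults for a minor's account.
interface PrivacySettings {
  accountPublic: boolean;
  directMessagesFrom: "approvedFriends" | "anyone";
  locationTracking: boolean;
  personalizedAds: boolean;
  contactSyncing: boolean;
}

const MINOR_DEFAULTS: PrivacySettings = {
  accountPublic: false, // private unless deliberately changed
  directMessagesFrom: "approvedFriends",
  locationTracking: false,
  personalizedAds: false,
  contactSyncing: false,
};

// Plain-language consequence text shown before a setting is loosened,
// so the child understands what the change means before confirming it.
function confirmChange(setting: keyof PrivacySettings): string {
  const consequences: Record<keyof PrivacySettings, string> = {
    accountPublic:
      "This means anyone can see your posts and send you messages. Is that what you want?",
    directMessagesFrom: "This means strangers can message you directly.",
    locationTracking: "This means the app will know and store where you are.",
    personalizedAds: "This means ads will be picked using your personal data.",
    contactSyncing: "This means the app can read everyone in your contacts.",
  };
  return consequences[setting];
}
```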

Supports for Youth

Resources for learning and systems of reporting need to be in place so youth can understand their choices and be supported in reporting unwanted digital experiences (e.g., cyberflashing, discriminatory pushed content).

Algorithms and Developmental Vulnerabilities

Children’s media experiences are deeply and, to the user, invisibly filled with algorithmically-selected content. These algorithms are informed by data collected (illegally for children under 13) by the platform and combined with vast other pools of data pulled from devices, ultimately manifesting as curated social feeds without a lens for wellbeing and overrun with biased content and targeted ads. Young users are tracked and profiled using cookies, location tracking, device fingerprinting, and metadata to power algorithmic decision-making and sell to third parties, and research shows that most minors do not understand these processes (Livingstone et al. 2017). Given the importance and prevalence of social media in youth’s lives, we consider how a driving feature of most online platforms, artificial intelligence (AI), shapes youth’s online experiences.

Algorithm-curated feeds

By capturing likes, time on site, clicks, and one’s friends’ likes, digital platforms use AI to recommend other accounts to follow and push content at users. These algorithmically-generated recommendations can help youth identify people and content they want, but are designed without safety parameters around wellbeing or sensitivity to content. Several news outlets have found that algorithms increasingly push more extreme, misleading, and outright false content to users (5Rights Foundation 2021; Morris 2018). Such content keeps youth on apps longer, and youth often discuss having trouble turning away from feed content they do not want to see (Barry et al. 2021; Headstream 2022). Try searching something benign on YouTube, like “Do sharks lay eggs?”, and you will get pages of shark attack videos. Pushing extreme and often false content to youth is problematic, especially for younger users who are still forming their critical thinking skills, have less content knowledge, are more physiologically aroused by stimuli, and are actively forming their identity. Though some notable work with youth has helped to support understanding of algorithms and the biases that underlie them (e.g., Erete et al. 2021; Lee et al. 2022; Pybus, Coté and Blanke 2015), commercial platforms are not transparent about what data are collected or how they are used.
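
As a toy illustration of the dynamic described here (ours; real recommender systems are far more complex), consider how ranking the same two videos changes when the objective is engagement versus wellbeing. All field names and the flagging signal are assumptions for the sketch.

```typescript
// Two candidate videos described by engagement signals and a hypothetical
// youth-feedback flag (e.g., "this made me feel worse" reports).
interface Candidate {
  title: string;
  predictedWatchSeconds: number; // derived from clicks, likes, time on site
  arousal: number;               // 0..1; sensational/extreme content scores high
  youthFlagged: boolean;
}

// Engagement-only objective: arousing content rises because it holds attention.
const engagementScore = (c: Candidate): number =>
  c.predictedWatchSeconds * (1 + c.arousal);

// Wellbeing-aware objective: same data, but flagged content is demoted.
const wellbeingScore = (c: Candidate): number =>
  c.youthFlagged ? 0 : c.predictedWatchSeconds;

const feed: Candidate[] = [
  { title: "SHARK ATTACK caught on camera", predictedWatchSeconds: 90, arousal: 0.9, youthFlagged: true },
  { title: "Which sharks lay eggs?", predictedWatchSeconds: 60, arousal: 0.1, youthFlagged: false },
];

// Ranking the same feed under each objective makes the difference visible:
// the attack video tops the engagement ranking, the answer tops the other.
console.log([...feed].sort((a, b) => engagementScore(b) - engagementScore(a)).map((c) => c.title));
console.log([...feed].sort((a, b) => wellbeingScore(b) - wellbeingScore(a)).map((c) => c.title));
```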

Algorithms and bias

Researchers have argued that AI-pushed content plays a significant role in promoting certain types of identity performances while silencing others (Simpson and Semaan 2021). Studies have found that TikTok’s algorithm suppresses and oppresses the identities of its growing LGBTQ+ user population through algorithmic and human moderation (Bacchi 2020; Botella 2019). Without careful design, algorithmic moderation could prevent marginalized youth from safely or easily finding others who share their identities and lifestyles. This is especially concerning when research underscores the importance of accepting and like-minded communities for marginalized youth (Rofofsky et al. 2016).

Scholars have described how algorithms, including those that power social feeds, reproduce the biases of the people involved in creating them, which means that racism and sexism are designed into our everyday technology uses (Noble 2018). Youth’s experience of harmful content likely depends on aspects of their identity including gender, race, ethnicity, immigration status, neurodiversity, and sexuality (Ito et al. 2023).

Given that developmental processes and needs occur at different times, vulnerability to pushed content varies with users’ developmental capacities. For instance, both a prepubescent 10-year-old whose identity exploration is just beginning and an 18-year-old who feels more confident in her identity as a woman will likely be less affected by pushed images of thinness, dieting, and fitness than a 13-year-old in the midst of puberty and identity development. However, the design of digital spaces rarely considers age, and when it does, it treats maturation only as a linear progression.

Targeted ads

Advertising that is based on users’ data (e.g., personal information, click patterns, network members’ interests) often exploits individuals’ personal and psychological vulnerabilities (Gak, Olojo, and Salehi 2022; Radesky et al. 2020). For instance, targeted weight-loss ads reinforce low self-esteem and deepen pre-existing anxieties around food and exercise among users who have eating disorders (Gak et al. 2022). Sadly, targeted advertising is core to mainstream social media companies’ business models, as every click on an ad generates profit for the company.

Ample research has documented the risks of advertising directly to children (e.g., tobacco, alcohol), with newer work considering the risks of tailored ads (Lapierre et al. 2017; Radesky et al. 2020) to populations that are highly influenced by peers and still developing critical thinking skills. Middle childhood and adolescence are when youth’s values, interests, and actions are increasingly shaped by outside voices and opinions. Without recognizing children’s age and inherent vulnerabilities, targeted ads can promote unrealistic standards to users who are easily swayed, especially when they have significant vulnerabilities (e.g., low self-esteem, disordered eating, high impulsivity, or depressive affect). Such targeted algorithmic advertising can be especially problematic when reifying heteronormative, ableist, gender-conforming, and white-privileged norms and values.

Design Tweaks

  • Youth-shaped Algorithms:

    • Involving diverse youth in algorithm design and audit can help identify pro-wellbeing and harmful (e.g., racist, ableist) content.

    • Sites should solicit feedback from youth to shape algorithms to minimize harm (e.g., flagging content that made them feel worse) and prioritize uplifting material.

    • Give youth the choice to opt in to data-driven algorithms, deciding whether and how their data are collected and used.

  • More Inclusive and Child-Appropriate Platform Moderation:

    • Moderation should be age- and diversity-sensitive and should include and compensate experts and community members, including youth.

  • New Business Models:

    • Platforms should work with policymakers and child experts to find new business models and potentially ban targeted ads for children and teens when browsing and playing online.

    • A public option, much like PBS or Mozilla, could enable connectivity without reliance on attention for revenue.

Supports for Youth

Teach children that algorithms are not neutral through digital literacy education in physical (e.g., schools, libraries) and online spaces (e.g., YouTube, TikTok).

Conclusions

Connectivity and online spaces are important for young people, providing ways to explore, learn, communicate, and simply geek out (Odgers, Schueller and Ito 2020). For marginalized youth and those with niche interests, digital spaces provide invaluable opportunities to find support, express oneself, and play with different identity presentations. From sharing images of one’s day on Instagram to recording TikTok dances with friends to learning how to do something from YouTube, these spaces connect well to the developmental needs of youth and provide opportunities to cultivate greater skills. However, design tweaks could make online spaces healthier for the developing child.

Currently, ways to promote positive digital experiences are largely dependent on youth, parents, and educators to select platforms, enter ages (13+), select and manage privacy settings (if any), agree to complicated user terms, and hunt for ways to disable unwanted features. Though developmentally-appropriate design legislation is underway, how it will be evaluated or enforced is unknown. Until then, users are reliant on designers and owners of these platforms to consider users’ ages, capacities, and the consequences of use. The current one-(adult)-size-fits-all framework that prioritizes revenue is not adequate for supporting youth’s wellbeing in digital spaces. When considering development from middle childhood through adolescence, it is clear that some digital media features will be more impactful, enjoyable, or challenging, and these responses may vary across different identity groups, ages, and youth experiences.

With limited space, we tried to highlight what is developing from 10-18 years and how these developmental needs and capacities interact with ubiquitous features in digital spaces. We suggest design tweaks that could support youth wellbeing and encourage regulatory bodies to create policies that prioritize children over corporations. No parks are designed only for adults, particularly heterosexual, affluent, white, able-bodied adults. Instead, they are designed for a diversity of users and uses. We believe digital spaces should similarly be designed for a range of ages and intersectional identities. Though youth and their important adults can try to select safe spaces and features, it takes buy-in from companies and their engineering teams to improve the human experience online, and policies to hold them accountable.


References

5Rights Foundation. 2021. Pathways: How Digital Design Puts Children at Risk. London, UK: 5Rights Foundation.

Allen, Kelly A., Tracii Ryan, DeLeon L. Gray, Dennis M. McInerney, and Lea Waters. 2014. "Social Media Use and Social Connectedness in Adolescents: The Positives and the Potential Pitfalls." The Australian Educational and Developmental Psychologist 31(1):18-31.

Anti-Defamation League. 2022. Online Hate and Harassment Report: The American Experience 2020. New York, NY: Anti-Defamation League.

Bacchi, Umberto. 2020. "TikTok Apologises for censoring LGBT+ Content." Reuters, September 22.

Barry, Rob, Georgia Wells, John West, Joanna Stern, and Jason French. 2021. "How TikTok Serves Up Sex and Drug Videos to Minors." Wall Street Journal, September 8.

Botella, Elena. 2019. "TikTok Admits It Suppressed Videos by Disabled, Queer, and Fat Creators." Slate, December 4.

Brechwald, Whitney A., and Mitchell J. Prinstein. 2011. "Beyond Homophily: A Decade of Advances in Understanding Peer Influence Processes." Journal of Research in Adolescence 21(1):166-79.

Buunk, Abraham P., and Frederick X. Gibbons. 2007. "Social Comparison: The End of a Theory and the Emergence of a Field." Organizational Behavior and Human Decision Processes 102(1):3-21.

Cho, Alexander. 2018. "Default Publicness: Queer Youth of Color, Social Media, and Being Outed by the Machine." New Media & Society 20(9):3183-200.

Crone, Eveline A., and Elly A. Konijn. 2018. "Media Use and Brain Development During Adolescence." Nature Communications 9:588.

Dahl, Ronald E., and Megan R. Gunnar. 2009. "Heightened Stress Responsiveness and Emotional Reactivity During Pubertal Maturation: Implications for Psychopathology." Development and Psychopathology 21:1-6.

Duthie, Jill Kathleen, Marilyn A. Nippold, Jesse L. Billow, and Tracy Mansfield. 2008. "Mental Imagery of Concrete Proverbs: A Developmental Study of Children, Adolescents, and Adults." Applied Psycholinguistics 29(1):151-73.

Ehrenreich, Samuel E., Diana J. Meter, Kurt J. Beron, Kaitlyn Burnell, and Marion K. Underwood. 2020. "How Adolescents Use Text Messaging Through their High School Years." Journal of Research on Adolescence 30(2):521-40.

Eldridge, Morgan A., Michelle L. Kilpatrick Demaray, Jonathan D. Emmons, and Logan N. Riffle. 2021. "Cyberbullying and Cybervictimization among Youth with Disabilities." Pp. 255-81 in Child and Adolescent Online Risk Exposure, edited by M. F. Wright and L. B. Schiamberg. Cambridge, MA: Elsevier - Academic Press.

Elkind, David, and Robert Bowen. 1979. "Imaginary Audience Behavior in Children and Adolescents." Developmental Psychology 15(1):38–44.

Erete, Sheena, Karla Thomas, Denise Nacu, Jessa Dickinson, Naomi Thompson, and Nichole Pinkard. 2021. "Applying a Transformative Justice Approach to Encourage the Participation of Black and Latina Girls in Computing." ACM Trans. Comput. Educ. 21(4):Article 27.

Friedman, Naomi P., and Akira Miyake. 2017. "Unity and Diversity of Executive Functions: Individual Differences as a Window on Cognitive Structure." Cortex 86:186-204.

Gak, Liza, Seyi Olojo, and Niloufar Salehi. 2022. "The Distressing Ads That Persist: Uncovering The Harms of Targeted Weight-Loss Ads Among Users with Histories of Disordered Eating." Proceedings of the ACM on Human-Computer Interaction 6(CSCW2):1-23. doi:10.1145/3555102

Granic, Isabela, Hiromitsu Morita, and Hanneke Scholten. 2020. "Beyond Screen Time: Identity Development in the Digital Age." Psychological Inquiry 31(3):195-223.

Gray, Colin M., Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs. 2018. "The Dark (Patterns) Side of UX Design." Paper 534 in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, edited by R. Mandryk, M. Hancock, M. Perry, and A. Cox. New York, NY: ACM.

Hall, Helen K., Prudence M. R. Millear, Mathew J. Summers, and Benjamin Isbel. 2021. "Longitudinal Research on Perspective Taking in Adolescence: A Systematic Review." Adolescent Research Review 6:125–50.

Headstream. 2022. "Why This Matters." https://www.digitaldelta.io/why-this-matters

Hoffner, Cynthia A., and Sangmi Lee. 2015. "Mobile Phone Use, Emotion Regulation, and Well-being." CyberPsychology, Behavior, and Social Networking 18(7):411-16.

Holodynski, Manfred, and Wolfgang Friedlmeier (Eds.). 2005. Development of Emotions and Their Regulation. Berlin, Germany: Springer.

Ito, Mizuko, Remy Cross, Karthik Dinakar, and Candice L. Odgers. 2023. Algorithmic Rights and Protections for Children. Cambridge, MA: MIT Press.

James, Carrie, Emily Weinstein, and Kelly Mendoza. 2021. Teaching Digital Citizens in Today's World: Research and Insights Behind the Common Sense K–12 Digital Citizenship Curriculum. San Francisco, CA: Common Sense Media.

Jones, Diane Carlson. 2001. "Social Comparison and Body Image: Attractiveness Comparisons to Models and Peers among Adolescent Girls and Boys." Sex Roles 45:645-64.

Jong, Stephanie T. and Murray J. N. Drummond. 2016. "Hurry Up and ‘Like’ Me: Immediate Feedback on Social Networking Sites and the Impact on Adolescent Girls." Asia-Pacific Journal of Health, Sport and Physical Education 7(3):251-67.

Kidron, Beeban, and Angharad Rudkin. 2017. "Digital Childhood: Addressing Childhood Development Milestones in the Digital Environment." London, UK: 5Rights Foundation.

Kleemans, Mariska, Serena Daalmans, Ilana Carbaat, and Doeschka Anschütz. 2018. "Picture Perfect: The Direct Effect of Manipulated Instagram Photos on Body Image in Adolescent Girls." Media Psychology 21(1):93-110.

Lapierre, Matthew A., Frances Fleming-Milici, Esther Rozendaal, Anna R. McAlister, and Jessica Castonguay. 2017. "The Effects of Advertising on Children and Adolescents." Pediatrics 140(2):S152-S56.

Lee, Clifford H., Nimah Gobir, Alex Gurn, and Elisabeth Soep. 2022. "In the Black Mirror: Youth Investigations into Artificial Intelligence." ACM Trans. Comput. Educ. 22(3):Article 25.

Lee, Hae Yeon, Jeremy P. Jamieson, Harry T. Reis, Christopher G. Beevers, Robert A. Josephs, Michael C. Mullarkey, Joseph M. O’Brien, and David S. Yeager. 2020. "Getting Fewer ‘Likes’ Than Others on Social Media Elicits Emotional Distress Among Victimized Adolescents." Child Development 91(6):2141-59.

Lee, Irene. 2018. "It’s Not You; Privacy Policies Are Difficult to Read." Common Sense Media, July 17.

Livingstone, Sonia, Julia Davidson, and Joanne Bryce. 2017. "Children’s Online Activities, Risks and Safety: A Literature Review by the UKCCIS Evidence Group." UK Council for Internet Safety.

Livingstone, Sonia, Mariya Stoilova, and Rishita Nandagiri. 2018. Children’s Data and Privacy Online: Growing Up in a Digital Age. LSE Media and Communication.

Mah, V. Kandice, and E. Lee Ford-Jones. 2012. "Spotlight on Middle Childhood: Rejuvenating the ‘Forgotten Years’." Paediatrics & Child Health 17(2):81–83.

Mascheroni, Giovanna, and Donell Holloway. 2019. "The Quantified Child: Discourses and Practices of Dataveillance in Different Life Stages." Pp. 354-65 in The Routledge Handbook of Digital Literacies in Early Childhood, edited by O Erstad, R Flewitt, B Kümmerling-Meibauer, and ÍP Pereira. London: Routledge.

Meeus, Anneleen, Kathleen Beullens, and Steven Eggermont. 2019. "Like Me (Please?): Connecting Online Self-presentation to Pre- and Early Adolescents’ Self-esteem." New Media & Society 21(11-12):2386-403.

Morris, David Z. 2018. "How YouTube Pushes Viewers Towards Extremism." Fortune, March 11. https://fortune.com/2018/03/11/youtube-extreme-content/

NASEM. 2019. The Promise of Adolescence: Realizing Opportunity for All Youth. Washington DC: National Academies Press.

Noble, Safiya Umoja. 2018. Algorithms of Oppression. New York: NYU Press.

Notten, Natascha, and Peter Nikken. 2016. "Boys and Girls Taking Risks Online: A Gendered Perspective on Social Context and Adolescents’ Risky Online Behavior." New Media & Society 18(6):966-88.

Odgers, Candice L., Stephen M. Schueller, and Mizuko Ito. 2020. "Screen Time, Social Media Use, and Adolescent Development." Annual Review of Developmental Psychology 2(1):485-502.

Ofcom. 2022. Children’s Media Lives 2022. London, UK: Ofcom.

Pew. 2020. Parenting Children in the Age of Screens. Washington, D.C.: Pew Research Center.

Przybylski, Andrew K., Amy Orben, and Netta Weinstein. 2020. "How Much Is Too Much? Examining the Relationship between Digital Screen Engagement and Psychosocial Functioning in a Confirmatory Cohort Study." Journal of the American Academy of Child & Adolescent Psychiatry 59(9):1080-88.

Pybus, Jennifer, Mark Coté, and Tobias Blanke. 2015. "Hacking the Social Life of Big Data." Big Data & Society July-Dec:1-10.

Radesky, Jenny, Yolanda Reid Chassiakos, Nusheen Ameenuddin, Dipesh Navsaria, and Council on Communication and Media (CCM). 2020. "Digital Advertising to Children." Pediatrics 146(1):e20201681.

Radovic, Ana, Theresa Gmelin, Bradley D. Stein, and Elizabeth Miller. 2017. "Depressed Adolescents' Positive and Negative Use of Social Media." Journal of Adolescence 55:5-15.

Ricker, Ashley A., and Rebekah A. Richert. 2021. "Digital Gaming and Metacognition in Middle Childhood." Computers in Human Behavior 115.

Rideout, Victoria, Alanna Peebles, Supreet Mann, and Michael B. Robb. 2022. Common Sense Census: Media Use by Tweens and Teens, 2021. San Francisco, CA: Common Sense.

Ringrose, Jessica, Kaitlyn Regehr, and Sophie Whitehead. 2021. "Teen Girls’ Experiences Negotiating the Ubiquitous Dick Pic: Sexual Double Standards and the Normalization of Image Based Sexual Harassment." Sex Roles 85:558-76.

Rofofsky, Matthew, Anupama Kalyanam, Allison Berwald, and Aruna Krishnakumar. 2016. "LGBTQ Adolescents." In Handbook of Child and Adolescent Group Therapy, edited by C Haen and S Aronson. New York, NY: Routledge.

Sherman, Lauren E., Ashley A. Payton, Leanna M. Hernandez, Patricia M. Greenfield, and Mirella Dapretto. 2016. "The Power of the Like in Adolescence: Effects of Peer Influence on Neural and Behavioral Responses to Social Media." Psychological Science 27(7):1027-35.

Shulman, Elizabeth P., Ashley R. Smith, Karol Silva, Grace Icenogle, Natasha Duell, Jason Chein, and Laurence Steinberg. 2016. "The Dual Systems Model: Review, Reappraisal, and Reaffirmation." Developmental Cognitive Neuroscience 17:103-17.

Silk, Jennifer S., Greg J. Siegle, Diana J. Whalen, Laura J. Ostapenko, Cecile D. Ladouceur, and Ronald E. Dahl. 2009. "Pubertal Changes in Emotional Information Processing: Pupillary, Behavioral, and Subjective Evidence During Emotional Word Identification." Development and Psychopathology 21(1):7-26.

Silvers, Jennifer A. 2022. "Adolescence as a Pivotal Period for Emotion Regulation Development." Current Opinion in Psychology 44:258-63.

Simpson, Ellen, and Bryan Semaan. 2021. "For You, or For ‘You’? Everyday LGBTQ+ Encounters with TikTok." Proceedings of the ACM on Human-Computer Interaction 4(CSCW3):Article 252.

Steinberg, Laurence. 2010. "A Social Neuroscience Perspective on Adolescent Risk-taking." In Biosocial Theories of Crime, edited by S. Henry, K. M. Beaver, and A. Walsh. London: Routledge.

Stockdale, Laura A., and Sarah M. Coyne. 2020. "Bored and Online: Reasons for Using Social Media, Problematic Social Networking Site Use, and Behavioral Outcomes Across the Transition From Adolescence to Emerging Adulthood." Journal of Adolescence 79:173-83.

Throuvala, Melina A., Mark D. Griffiths, and Daria J. Kuss. 2019. "Motivational Processes and Dysfunctional Mechanisms of Social Media Use among Adolescents: A Qualitative Focus Group Study." Computers in Human Behavior 93:164-75.

Timeo, Susanna, Paolo Riva, and Maria Paola Paladino. 2020. "Being Liked or Not Being Liked: A Study on Social-Media Exclusion in a Preadolescent Population." Journal of Adolescence 80:173-81.

Van der Graaff, Jolien, Susan Branje, Minet De Wied, Skylar Hawk, Pol Van Lier, and Wim Meeus. 2014. "Perspective Taking and Empathic Concern in Adolescence: Gender Differences in Developmental Changes." Developmental Psychology 50(3):881-88.

Wade, Natasha E., Joseph M. Ortigara, Ryan M. Sullivan, Rachel L. Tomko, Florence J. Breslin, Fiona C. Baker, Bernard F. Fuemmeler, Katia Delrahim Howlett, Krista M. Lisdahl, Andrew T. Marshall, Michael J. Mason, Michael C. Neale, Lindsay M. Squeglia, Dana L. Wolff-Hughes, Susan F. Tapert, Kara S. Bagot, and ABCD Novel Technologies Workgroup. 2021. "Passive Sensing of Preteens’ Smartphone Use: An Adolescent Brain Cognitive Development (ABCD) Cohort Substudy." JMIR Mental Health 8(10).

Yau, Joanna C., and Stephanie M. Reich. 2019. “‘It’s Just a Lot of Work’: Adolescents' Self-presentation Norms and Practices on Facebook and Instagram.” Journal of Research in Adolescence 29(1):196-209.
