
Early Adolescents’ Perspectives on Digital Privacy

Published on Jun 29, 2021

Abstract

Children today are growing up in a highly connected world in which many online behaviors are traceable. Adolescents are under constant surveillance by peers, corporations, and potential maleficents. This creates a need for youth to consider their digital privacy at both a broader distal level and an interpersonal level. Using a survey of youth aged 11-14 years, we collected information about youth privacy practices. We found that young adolescents' privacy behaviors, beliefs, and practices are multifaceted and complex, reflecting the trade-offs they must make between social connectedness and protecting their privacy.

Key Findings

  • Adolescents reported more proximal (e.g., not sharing passwords with known others) than distal (e.g., restricting data collection from websites and apps) privacy-protecting behaviors.

  • Adolescents endorsed prescriptive messages about risks associated with potential online predators more strongly than messages about risks from corporations and data collection practices.

  • While younger adolescents (11- and 12-year-olds) reported fewer total privacy-protecting behaviors than older adolescents (13- and 14-year-olds), they engaged more frequently in privacy-protecting behaviors related to their social relationships.

  • Proximal privacy-protecting behaviors relaxed with age, whereas distal privacy-protecting behaviors increased with age.

  • White adolescents in our sample were less likely to be concerned with corporate data collection and reported fewer privacy-protecting behaviors related to corporations compared to non-white adolescents.

  • Adolescents who appeared to have internalized prescriptive cultural messages about online privacy risks related to predators and unknown peers engaged in more privacy behaviors; however, they often held contradictory beliefs, preferences, and practices related to corporate surveillance.

  • Our findings suggest that issues of digital privacy, especially corporate surveillance, are complex for early adolescents to understand and emotionally navigate.

A vast array of information communication technologies (ICTs) permeates public-private boundaries in home and school environments (Livingstone, 2005; Taylor & Rooney, 2016), creating a perfect storm in early adolescence, when burgeoning needs for autonomy, exploration, self-expression, and peer connectedness make youth easy targets for ‘dataveillance’ (Smith & Shade, 2018). Dataveillance refers to the exploitation or commercialization of children’s play or social networking data as a capitalist commodity. The Federal Trade Commission restricts internet companies from collecting personal information on those younger than 13 through the 1998 Children’s Online Privacy Protection Act (COPPA) based on the belief that children and early adolescents do not have the same knowledge, experience, and self-regulatory capacities as adults to fully consent to and understand the consequences of their interactions with ICTs and should therefore be protected (Costello et al., 2016). Interventions to protect children and young adolescents should be closely informed by the knowledge, perspectives, and tendencies of youth themselves, and yet the majority of psychological research on privacy in digital media contexts has focused on adults, older adolescents, or emerging adults (e.g., Agosto & Abbas, 2017; Wisniewski, 2018). Thus, we know little about perspectives on privacy during a sensitive period of identity and autonomy development when youth begin to use social media for the first time (Shin et al., 2012; Shin & Kang, 2016; Youn, 2009).

To fill this gap, we surveyed early adolescents in two U.S. public middle schools about their privacy knowledge, preferences, and practices. The survey asked students about their beliefs and attitudes toward prescriptive messages about privacy (e.g., apps are selling your information to advertisers, digital ink is permanent), their preferences in negotiations of privacy trade-offs (e.g., prioritizing self-disclosure for peer connection and convenience over protection of personal information), and their privacy protection behaviors (e.g., turning location sharing off, keeping passwords from others). Our goal was to shed light on how early adolescents view digital privacy and how their privacy behaviors may be shaped by circumstances unique to this developmental period.

Why is privacy an issue with young adolescents?

Digital technologies blur public-private boundaries of the past, making data from within the family home more accessible to corporations, and allowing for easier transfer and collection of data about children's lives in novel ways that are challenging for families to regulate (Smith & Shade, 2018; Taylor & Rooney, 2016). Nowadays, children are wirelessly connected to others, including databases, from a very young age through smart toys, in-home personal assistants (e.g., Alexa, Google Assistant), and games and apps (e.g., YouTube Kids) (Holloway & Green, 2016). In the transition from childhood to adolescence, social media such as Snapchat or Instagram become increasingly compelling for youth to communicate with peers. The challenge of regulating the rapid proliferation of social media marketed at younger adolescents (e.g., TikTok, Amino, Discord, Whisper, Kik), plus the lack of legitimate protections baked into the Internet of Things (IoT), creates new kinds of privacy risks for youth growing up in the digital age (Smith & Shade, 2018). During the COVID-19 pandemic, the transition to remote or hybrid learning has also meant that many children and their families are compelled to use e-learning software on personal or home devices where privacy settings vary widely, and data sharing practices remain opaque to families (Harwell, 2020; Maalsen & Dowling, 2020; Teräs et al., 2020).

One concern about privacy is that from an early age, children are being treated as algorithmic assemblages and as a result of these “datafication” practices, their “complexities, potentialities, and opportunities are becoming restricted” (Lupton & Williamson, 2017, p.787) through multifaceted corporate surveillance. The term algorithmic assemblages refers to the reduction of children’s online activities into data points to improve algorithms’ ability to predict purchasing or viewing behaviors -- in short, how corporations develop datafied representations of children. As Haggerty and Ericson (2000) describe, these algorithmic assemblages occur when data representations of children can be constructed from “a series of discrete flows” which are “reassembled” to represent a person (p. 605) in ways that are financially valuable for corporations and data analytics firms. Children come to be treated as combinations of data points because of the various datafication and surveillance practices baked into the IoT toys, smart devices, and social media they use, reducing their digital actions to abstract demographic information and preferences for a litany of different products and online services (Rooney, 2012; Smith & Shade, 2018). As children and young adolescents increasingly engage with content on an ever-expanding variety of devices, media industries are iterating on ways to predict their behaviors and target them with personalized marketing messages based on these algorithmic reassemblages (Marx & Steeves, 2010; Regan & Steeves, 2019). Zuboff (2019) argues that human autonomy will be diminished under what she calls “surveillance capitalism,” where corporations exploit private experience for profit and trade on human behavioral futures.

Although notable legislation such as COPPA in the U.S. is meant to deter children under 13 years from creating different forms of social media accounts (e.g., Facebook, Instagram, YouTube), there is no evidence that age limits work (children can lie about their age online) and their data are often still collected via parentally controlled accounts or apps (Federal Trade Commission, 1998-2020; Smith & Shade, 2018). Moreover, social media use in the U.S. tends to begin in middle school (Anderson & Jiang, 2018; Rideout & Robb, 2019) and yet after age 13, adolescents are no longer protected under COPPA, leaving them potentially vulnerable in an essentially unregulated, commercial, digital media environment. The cutoff of protections by COPPA at age 13 assumes that adolescents have the capacity and maturity to act in their own interests with regard to privacy on social media, but we know little about whether that assumption is warranted (Costello et al., 2016).

In early adolescence, youth are still developing both 'cold' (e.g., logical reasoning, analytical skills) and 'hot' (e.g., emotional, impulsive, experiential) psychosocial systems of information processing (Costello et al., 2016). Steinberg et al. (2009) showed that there were significant differences in how early and later adolescents handled decision-making tasks related to psychosocial maturity (e.g., involving risk-perception, sensation-seeking, impulsivity, future orientation), but not for tasks that tested their cognitive capacities (e.g., logical reasoning ability, information processing). The early adolescent brain is more susceptible to social and emotional factors (e.g., peer pressure, romantic attachment) and the capacity for behavioral self-regulation is still incomplete (Albert & Steinberg, 2011; Steinberg et al., 2009). Even older adolescents may still show deficits compared to adults in terms of their social and emotional maturity (Cohen et al., 2016; Costello et al., 2016), especially for decision-making and reasoning about incentives (Galvan et al., 2006; Somerville et al., 2011), threats/risks (Rudolph et al., 2017), and peers (Chein et al., 2011).

The psychosocial factors (e.g., impulsivity, peer pressure, future-orientation) that influence judgment and decision-making in early adolescence may impact youth's ability to take the time to activate 'cold' analytical decision-making skills for split-second digital privacy decisions (e.g., posting a sexy selfie, sharing photos of drug and alcohol use, revealing a location). Young adolescents are more likely to be motivated by 'hot' experiential desires such as social connectedness, peer pressure, and self/identity presentation (Juvonen & Murdock, 1995; Juvonen, 2007; Knifsend & Juvonen, 2013) compared to older adolescents, and less able to accurately weigh the risks versus rewards of using social media in the context of these social and emotional pressures, in addition to the immediate gratification of quantified social feedback (i.e., Facebook/Instagram likes, Snapstreaks). Therefore, young adolescents may require additional protections (e.g., more time-delay for reflection before posting content, waiving privacy protections, or sharing location) to continue developing their cognitive-control skills on social media in a way that does not compromise their needs for self-expression and identity development, so as to help bridge the protections gap of COPPA after the age of 13.

Technopanics and alarmist narratives transmitted through parents, teachers, and popular media have not been helpful in dealing with this issue because they tend to pathologize adolescents’ use of social media (Agosto & Abbas, 2017; Marwick, 2008). Livingstone (2008; 2014) argues that technopanics can result in adolescents learning prescriptive messages (e.g., don’t talk to strangers online, don’t disclose personal information) without changing their behavior, partly because they do not see social media in the same terms as adults in their lives – where their main goals are not generally to meet strangers or disclose intimate personal information, but to expand their social networks and build relationships (Livingstone, 2008; 2014). A great deal of the research focusing on youth’s safety on social media is grounded in adults’ prescriptive views of youths’ attitudes and behaviors—what adults think youth should be doing online, as opposed to an informed view of what youth are actually doing online and why. As a result, adolescents’ use of social media is framed as poor and risky (Livingstone, 2008; Marwick & boyd, 2014) leading to solutions narrowly aimed at reducing teens’ online disclosures (Wisniewski, 2018). To address the lack of youth perspectives on digital privacy, we designed a survey to tap into privacy knowledge, preferences, and practices that may be unique to early adolescence.

What are adolescents’ perspectives on privacy?

A developmental lens is necessary for understanding early adolescents’ perspectives and behaviors related to digital privacy. Developmental needs for peer intimacy, connection, and identity exploration and formation introduce unique circumstances during this period of the lifespan. Evidence suggests that although adolescents are able to understand and reason about some risks similarly to adults, they exhibit greater sensitivity to peer social rewards in risk-taking scenarios (Albert et al., 2013; Smith et al., 2014). Younger adolescents are more concerned with social connectedness and superficial self/identity presentation than older teens and adults (Juvonen & Murdock, 1995; Juvonen, 2007; Knifsend & Juvonen, 2013), which could lead to greater relinquishing of security in certain arenas to gain social validation and belonging, for example, disclosing publicly to participate in online communities and accrue large numbers of likes, comments, and followers (Yau & Reich, 2019). In short, adolescents may be dealing with privacy tradeoffs differently than adults as they negotiate incentives particular to this developmental period.

Proximal versus distal privacy is likely to be an important distinction in adolescents’ perspectives on privacy. Adolescents’ proximal, person-to-peer privacy management involves needs for intimacy, affiliation, exploration, and information control that is situated in everyday relationships, peer groups, and families (Peter & Valkenburg, 2011; Robinson, 2016). For example, adolescents’ attitudes about online privacy and safety are often shaped by the discomfort that they feel with unintended audiences seeing their personal information, yet most feel pressure to share their personal information with friends to stay socially connected (Agosto & Abbas, 2017; Shin & Kang, 2016), reflecting their desires to “be in public without always being public” (Marwick & boyd, 2014: p.1052). Young adolescents must learn to balance their desires for social connectivity with desires to restrict personal information from unintended audiences (e.g., parents, teachers, predators).

Whereas adolescents’ proximal privacy management is influenced by their needs for identity development and social connectedness, distal privacy management involves developing a more complex understanding of how corporations and data brokers collect and trade on personal information. Adolescents’ person-to-corporation privacy management could be challenging for tweens who are just beginning to think abstractly and understand issues at the level of society. For example, Shin et al. (2012) found that 9- to 12-year-olds tend to overestimate their understanding and invulnerability, perceiving themselves as more competent and knowledgeable in using ICTs than their parents, while being more willing to disclose personal information for marketing purposes. Adolescents could also be less disturbed by abstract invasions of privacy from governments, criminals, or corporations compared to the more immediate risks of nosy parents or peers (Marwick & boyd, 2014; Tufekci, 2008). Research with older adolescents and emerging adults found that they expressed little concern about the future use of their personal data while also showing limited knowledge of the business practices involved in using such information for commercial purposes (Lapenta & Jorgensen, 2015; Montgomery et al., 2017).

In the current study, we distinguished between proximal and distal privacy by examining adolescents’ perspectives on privacy with regard to their social relationships separately from their perspectives on privacy with regard to corporations and potential criminals. That is, we asked adolescents about their beliefs, attitudes, and behaviors related to protecting oneself in social networks, from corporate surveillance, and from predators. We also differentiated adolescents’ preferences in their digital privacy negotiations in terms of their willingness to trade off some security for rewards via corporate surveillance (e.g., trading personal information for convenience) versus peer relationships (e.g., trading personal information for more followers).

How do adolescents’ beliefs, attitudes, and preferences translate to behaviors?

Adolescents generally show less concern about privacy than adults (Gasser & Palfrey, 2008; Moscardelli & Liston-Heyes, 2004), especially in sharing personal information on social media (Jones et al., 2009), and this may be because they have more know-how to protect their privacy. Miltgen and Peyrat-Guillard (2014) found a reverse privacy paradox in adolescence where lower privacy concerns were associated with greater use of protective strategies for personal data, in contrast to adulthood where those with higher privacy concerns engaged in fewer privacy behaviors because they had less knowledge of online privacy strategies (Blank et al., 2014). Adolescents are ‘digital natives’ (Baumann, 2010), and perhaps more self-confident internet users who take a greater degree of personal responsibility for managing their proximal online privacy, often due to their technological literacy (Livingstone, 2008; Wisniewski, 2018).

Indeed, over the course of adolescence, youth increasingly express more positive attitudes towards data management and responsible social media use, and become more confident in their ability to prevent privacy violations (Miltgen & Peyrat-Guillard, 2014). Although the likelihood of providing certain types of personal information (e.g., photos of oneself or friends, school names, hometown, screen names from other social media) online increases with age (Lenhart & Madden, 2007; Steeves & Webster, 2008), younger adolescents are more likely to adopt restrictive privacy settings for their social media profiles than older teens (Caverlee & Webb, 2008; Livingstone, 2008). Studies with adolescents aged 14-18 years have shown they often engage in privacy-protecting behaviors; however, these are mostly aimed at broadly controlling how information about oneself is revealed to others (proximal privacy) (Livingstone, 2006, 2008). They use false names and pseudonyms when creating social media accounts (Miltgen & Peyrat-Guillard, 2014), restrict profile access (boyd & Hargittai, 2010), delete tags and photos from friends, and limit friend requests and social connections (boyd, 2014; Marwick & boyd, 2014).

Although adolescents may be competent in proximal privacy practices to manage their reputations and social relationships, they may be less attuned to distal privacy practices for restricting their information from corporations or criminals. Adolescents may display another privacy paradox in terms of how they resolve tension between corporate/criminal privacy concerns and motivations for impression management and social connectedness (Utz & Krämer, 2009). In other words, adolescents may report being concerned about corporations and criminals but, when using technology, are more motivated to risk digital disclosure to meet their priorities for peer belonging and identity validation. To contribute a more nuanced picture of how early adolescents’ knowledge and preferences account for their privacy-protecting behaviors, we distinguished between privacy protection behaviors related to corporations and criminals (e.g., paying for apps to avoid ads, using strong passwords) and those related to social relations (e.g., letting a friend of a friend who I have not met follow me on social media).

Current Study

We administered a survey to adolescents, 11-14 years old, to answer three main research questions about their privacy knowledge, preferences, and behaviors in proximal and distal contexts.

Research Question 1: What kinds of digital privacy-protecting behaviors do early adolescents report vis-a-vis social networks versus corporate surveillance and potential criminals?

Research Question 2: What are early adolescents’ beliefs, attitudes, and preferences for digital privacy vis-a-vis social networks versus corporate surveillance and potential criminals?

Research Question 3: Do adolescents’ beliefs, attitudes, and preferences predict their digital privacy-protecting behaviors after accounting for demographic characteristics such as age, ethnicity, and gender?

Demographic considerations: Given that extant research finds differences in privacy behaviors between younger and older adolescents (e.g., Shin et al., 2012; Rideout & Robb, 2019), but little has explored heterogeneity of practices within early adolescence, we consider how our findings vary across the middle school grades. Further, some have found differences in privacy preferences between males and females (Youn & Hall, 2008; Tifferet, 2019). Thus, we consider how gender may relate to these preferences, attitudes, beliefs, and practices. Finally, research on online activities, surveillance, and privacy often finds different beliefs and practices between white and non-white users (Shelley et al., 2004; Madden, 2017). We suspect that whiteness and its corresponding systemic privilege will be demonstrated by white students having less concern about corporate surveillance while having greater utilization of proximal privacy-protecting behaviors than non-white students. However, given the heterogeneity of non-white samples, we do not have hypotheses about attitudes, beliefs, and behaviors between different non-white racial or ethnic groups.

Method

Youth aged 11-14 years in the Southeastern United States were asked to complete an online privacy survey during their regular school day. Two separate middle schools serving grades 6 through 8 participated in the survey in May and June of 2019. Participating schools were part of a larger school district serving approximately 14,000 students from Pre-Kindergarten through grade 12 in a blend of rural, suburban, and urban settings covering 726 square miles. The school district utilized a one-to-one model of instructional technology integration, wherein students in the participating district were assigned a personal laptop to use throughout the academic school year. Surveys were administered through these laptops using Qualtrics.

At the time of the survey, Author Three was serving as a technology coach for the two participating middle schools. Author Three collaborated with school leadership on multiple technology initiatives throughout the year, addressing a variety of instructional and technical needs related to technology integration. Specifically, school leaders were interested in digital literacy, including student privacy practices.

Participants

Schools were selected based on convenience and interest in digital privacy habits. Two middle schools, as described above, were offered a chance to participate. Both schools expressed interest in learning more about youth digital privacy behaviors, in service of digital citizenship initiatives. Middle-schoolers (N=414), ages 11-14 years, completed the survey during class time. Participating students were 54% male (n=224), and most identified as White (52%), multi-ethnic (12.3%), Latino/a (8.9%), Black (8.5%), or Asian/Asian American (8.0%). Seventy percent of youth reported using social media. Within age groups, 64% of students under 13 years old identified as social media users, compared with 76% of students 13 years old or older.

Table 1. Student characteristics

                                        N     %
Gender
  Male                                  224   54
  Female                                190   46
Age
  11                                    38    9.2
  12                                    164   39.6
  13                                    155   37.4
  14                                    55    13.4
  Non-real value (e.g., 38)             2     0.4
Race
  White                                 216   52.2
  Multi-ethnic                          51    12.3
  Black or African American             35    8.5
  Latino/a                              37    8.9
  Asian/Asian American                  33    8.0
  Native American or Pacific Islander   2     0.5
  Prefer Not to Say                     18    4.3
  Other                                 22    5.3

Measures

The privacy survey was created by the authors, with assistance from central office administrators in the school district and school leadership at each site. School technology leaders (classroom teachers, principals, and librarians) also gave feedback that led to revisions to the content and length of the survey.

The anonymous survey asked youth about their gender, age, race, ethnicity, and the specific social media websites or apps they use. Youth were asked about the frequency of privacy protecting behaviors, and their attitudes and preferences around protecting privacy on social media, websites, and devices. Of the 14 behaviors asked, 8 focused on explicit behaviors that protect against hacking or access by unknown others such as criminals or corporations (e.g., use of strong passwords, using a VPN), and 6 questions asked about behaviors related to privacy in social relationships (e.g., not friending unknown others). Of the 7 attitude questions, 3 asked about attitudes and beliefs around corporate surveillance (e.g., Apps are selling your information to advertisers), 2 asked about future-oriented thinking (e.g., Having a positive reputation online is important for getting a job in the future), and 2 asked about risk from predators (e.g., You can never be really sure who you are talking to online). Of the 10 items addressing digital privacy preferences, 4 questions asked about corporate surveillance (e.g., I like seeing ads for things I like online), 2 questions focused on peer social relationships (e.g., I like it when a friend tags me in a positive post), and an additional 4 questions asked about peer relationship preferences that had potential risk (e.g., I like it when people like my posts, even if I don’t know them). Table 3 contains sample questions and response formats for each of these subscales. It is important to note that questions about privacy preferences were last on the survey and consequently had lower response rates due to time limits. Only 295 out of the 414 students completed these privacy preference items before the end of class. Comparisons between those with and without preference data found no differences in age, gender, race, ethnicity, or social media activities. A comparison of demographics between the two groups can be found in Table 2.

Table 2: Comparison demographics for preferences analytic sample (n=295) and non-answering (n=119).

           Preferences sample   Non-preferences sample
Girl       45%                  47%
White      50%                  57%
Age
  11       9%                   11%
  12       39%                  41%
  13       37%                  39%
  14       15%                  9%

Total scores and subscale scores were calculated for each of the attitudes/beliefs, preferences, and practices described above. To account for missing items, mean scores were calculated based on the number of items answered. For example, youth were asked how often they engaged in 14 different privacy protecting behaviors. If a participant only responded to 10 of those 14 behaviors, the mean number of privacy behaviors was calculated with a denominator of 10, rather than 14. Although the analysis did not require a minimum number of answered items for a scale score, all students whose average scores were included in the analyses had answered at least three items. For interpretability, several privacy in social relationships items were reverse coded (noted in Table 3) so that higher scores indicate more privacy protection. Total privacy behaviors consisted of privacy behaviors across social relationships and in relation to corporations and criminals.
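To make this scoring procedure concrete, the sketch below shows one way the subscale means could be computed in Python with pandas. It is a minimal sketch under stated assumptions: the item column names are hypothetical stand-ins for the survey's actual variables, and the flipped items correspond to the (R) items in Table 3.

```python
import pandas as pd

# Hypothetical column names for the 6-item Privacy in Social Relationships subscale;
# the four (R) items from Table 3 are flipped so higher = more privacy protection.
SOCIAL_ITEMS = ["soc_share_info", "soc_only_met_irl", "soc_friend_of_friend",
                "soc_unknown_followers", "soc_ask_take_down", "soc_share_passwords"]
REVERSE_ITEMS = ["soc_share_info", "soc_friend_of_friend",
                 "soc_unknown_followers", "soc_share_passwords"]

def score_subscale(df, items, reverse, scale_min=1, scale_max=4):
    """Mean of the items a respondent actually answered, after reverse coding."""
    responses = df[items].copy()
    # Reverse code on the 1-4 frequency scale: 1 becomes 4, 2 becomes 3, etc.
    responses[reverse] = (scale_min + scale_max) - responses[reverse]
    # skipna=True divides by the number of answered items, so a student who
    # answered 10 of 14 behavior items gets a mean with a denominator of 10.
    return responses.mean(axis=1, skipna=True)

# Example usage on a cleaned survey data frame (hypothetical file name):
# df = pd.read_csv("privacy_survey_clean.csv")
# df["privacy_social"] = score_subscale(df, SOCIAL_ITEMS, REVERSE_ITEMS)
```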

Table 3: Privacy Survey

Privacy-Protecting Behaviors
Response format: Frequency (1 = Never, 2 = Sometimes, 3 = Most of the time, 4 = Always)

Privacy Against Corporations and Criminals (8 items; M = 2.20, SD = 0.55, Range: 1-4)
  ● Read privacy policies for apps/websites
  ● Use strong passwords
  ● Clear my browser history
  ● Turn location sharing off
  ● Use a VPN
  ● Keep my social media accounts private
  ● Block push notifications
  ● Pay for apps to avoid ads

Privacy in Social Relationships (6 items; M = 3.03, SD = 0.50, Range: 1-4)
  ● Share personal information (R)
  ● Let only people I have met in person follow me on social media
  ● Let a friend of a friend, who I have not met, follow me on social media (R)
  ● Let people I do not know follow me on social networking sites (R)
  ● Ask people to take down photos of me/posts about me that I do not like
  ● Share my passwords (R)

Attitudes and Beliefs about Privacy
Response format: Likert (1 = Strongly Disagree to 5 = Strongly Agree)

Corporate Surveillance (3 items; M = 3.08, SD = 1.05, Range: 1-5)
  ● Companies use information about what you do online to try and sell you things
  ● Your phone is listening to what you say, so companies can target ads at you
  ● Apps are selling your information to advertisers

Future-Oriented Thinking (2 items; M = 3.53, SD = 1.00, Range: 1-5)
  ● Having a positive reputation online is important for getting a job in the future
  ● Digital ink is permanent (What is posted online never goes away)

Potential Predators (2 items; M = 3.82, SD = 1.00, Range: 1-5)
  ● You can never be really sure who you are talking to online
  ● It is important to keep all your accounts private

Preferences about Privacy and Sharing
Response format: Likert (1 = Strongly Disagree to 5 = Strongly Agree)

Peer Social Relationships (2 items; M = 3.00, SD = 0.98, Range: 1-5)
  ● I like it when a friend tags me in a positive post
  ● Posting on social media is important for my friendships

Peer Relationships with Introduction of Risk (4 items; M = 2.71, SD = 0.93, Range: 1-5)
  ● I like it when social media tags my location
  ● I like it when people like my posts, even if I don’t know them
  ● It’s okay to allow people to follow me, even if I don’t know them
  ● Having more followers makes me feel good about myself

Corporate Surveillance (4 items; M = 2.91, SD = 0.91, Range: 1-5)
  ● I like it when apps suggest people I should follow or connect with
  ● I like seeing ads for things I like online
  ● I like it when I can log in with Google or Facebook
  ● I like it when a website or app already knows who my friends are

Note: (R) indicates reverse coding; higher scores on subscales mean higher privacy protection

Analytic Plan

To address research questions 1 and 2, we describe the frequency of specific attitudes, preferences, and practices around privacy within social relationships, in relation to potential criminals such as hackers and phishers, and in connection to corporations or other business or data-brokering interests.

To address research question 3, we used ordinary least squares (OLS) regressions to determine if attitudes, beliefs, and preferences about privacy predicted privacy-protecting behaviors globally and by domain (e.g., beliefs that corporations are collecting information predict behaviors that limit corporations’ ability to collect data). The regression models that included preferences as predictors utilized a smaller sample (n=295). Preferences questions were located at the end of the survey, and schools reported that some students ran out of time during their advisory period, leading to a subsample (n=295) of students who completed the entire survey. Including a dummy variable for missing preference data in the regression analyses revealed no significant differences for outcomes of interest (e.g., total privacy behaviors, privacy-protecting behaviors in social relationships, and privacy-protecting behaviors against corporations and criminals). T-tests revealed no significant differences by age, gender, or ethnicity between the sample with preferences data (n=295) and the sample with missing preferences data. To determine how demographic characteristics relate to privacy attitudes, beliefs, preferences, and practices, we included gender, age, and ethnicity as covariates in the research question 3 analyses.

For all analyses, grade, gender, and race/ethnicity were included as covariates. Grade was coded as 6th, 7th, or 8th grade. Gender was dichotomized as male or female since we were limited in our ability to ask about participants who identified as non-binary, based on schools’ feedback on our survey instrument. We did add a third option to our gender question, ‘prefer not to say’, but only 19 students selected this option. Given that the majority of students were white and that whiteness is likely associated with different privacy preferences, race/ethnicity was dichotomized as white or non-white.
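As an illustration of this modeling step, the sketch below fits the analogue of Model 2 in Table 4 (total privacy behaviors regressed on the attitude and preference subscales plus age and the dichotomized gender and race indicators) with statsmodels. It is a sketch under assumed column names, not the authors' actual code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cleaned data set with one row per student; column names are
# illustrative stand-ins for the subscale scores described in Table 3.
df = pd.read_csv("privacy_survey_clean.csv")

# Dichotomize demographics as described in the analytic plan.
df["girl"] = (df["gender"] == "female").astype(int)
df["white"] = (df["race_ethnicity"] == "white").astype(int)

# Analogue of Model 2 in Table 4: total privacy-protecting behaviors predicted
# from attitudes, preferences, and demographic covariates (preference subsample).
model = smf.ols(
    "total_privacy_behaviors ~ age + girl + white"
    " + att_corporate + att_future + att_predators"
    " + pref_corporate + pref_peer + pref_peer_risk",
    data=df.dropna(subset=["pref_corporate", "pref_peer", "pref_peer_risk"]),
).fit()
print(model.summary())  # coefficients, standard errors, and R-squared as in Table 4
```

Under this setup, Models 1, 3, and 5 would simply omit the preference terms, and the domain-specific behavior subscales would be swapped in as outcomes for Models 3-6.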

For survey data cleaning, we first distinguished between different types of incomplete nominal and ordinal data (i.e., ‘select all that apply’ questions on privacy practices, preferences, beliefs, or age, gender, and ethnicity) in our initial sample (n=669). Adolescents who had only completed demographics questions (i.e., age, gender, and ethnicity) without finishing the rest of the survey were removed from our analyses, as were responses that contained gibberish or nonsensical content for self-report questions (e.g., social media used). Additionally, respondents who demonstrated satisficing (e.g., straightlining) in their responses or those with extremely fast response times were also removed from our analyses. This left us with a final sample size of 414.
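A minimal sketch of this cleaning pipeline in pandas appears below. The column names, the straightlining rule, and the speed cutoff are all illustrative assumptions, since the exact criteria are not reported, and the manual screening of gibberish open-text responses is omitted.

```python
import pandas as pd

raw = pd.read_csv("privacy_survey_raw.csv")   # hypothetical export; initial n = 669

# Survey item columns (hypothetical prefixes for behavior, attitude, preference items).
content_cols = [c for c in raw.columns if c.startswith(("beh_", "att_", "pref_"))]

# 1. Drop respondents who completed only the demographics questions.
answered_content = raw[content_cols].notna().any(axis=1)

# 2. Flag straightlining: the same response to every answered content item.
straightlined = raw[content_cols].nunique(axis=1, dropna=True) <= 1

# 3. Flag implausibly fast completions (the 60-second cutoff is illustrative).
too_fast = raw["duration_seconds"] < 60

clean = raw[answered_content & ~straightlined & ~too_fast].copy()
print(len(clean))   # the final analytic sample reported in the paper was 414
```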

Results

RQ1: Behaviors. In order to better understand the kinds of privacy-protecting behaviors that early adolescents engage in, we looked at the total number of behaviors, the frequency of these behaviors, and specific behaviors promoting privacy for social relationships and with corporations and criminals. On average, youth engaged in 9 discrete privacy behaviors at least some of the time (M=8.73, SD=2.63, N=414), ranging from 3 to 14 behaviors. See Figure 1 for the percentage of youth engaging in each privacy behavior. In looking at patterns of behaviors, youth reported an average of 5 out of 8 possible privacy behaviors that protect against hacking or corporate interests (M=4.71, SD=1.66, Range: 1-8) and an average of 4 out of 6 possible behaviors that involve social relationships (M=4.02, SD=1.42, Range: 1-6). In considering differences in privacy behaviors across demographic characteristics, white youth reported significantly fewer total privacy behaviors (M=8.43, SD=2.68, n=216) than non-white peers (M=9.05, SD=2.54, n=198; t=2.39, p=0.02). Specifically, white youth (M=4.52, SD=1.66) reported significantly fewer privacy-protecting behaviors against corporations and criminals than their non-white peers (M=4.92, SD=1.65; t(412)=2.43, p=0.02). Youth also varied in the average number of privacy behaviors reported across age groups (F(3, 408)=4.24, p<0.01), with younger students reporting fewer total privacy-protecting behaviors (11 years, M=7.58; 12 years, M=8.50; 13 years, M=9.10; 14 years, M=9.01). Youth did not vary by age in the number of privacy behaviors against corporations or criminals. However, younger students reported fewer privacy behaviors in social relationships compared to older students (11 years, M=3.47; 12 years, M=3.87; 13 years, M=4.25; 14 years, M=4.18; F(3, 408)=4.16, p<0.01).

Figure 1. Frequencies of youth-reported privacy behaviors (n=414) (Behaviors)

Note: Values represent a response of sometimes, most of the time, or always.
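The between-group comparisons reported above (the white versus non-white t-tests and the one-way ANOVAs across age groups) could be reproduced along the following lines with SciPy; the data frame and column names are assumptions carried over from the earlier sketches.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("privacy_survey_clean.csv")  # hypothetical cleaned file

# Independent-samples t-test: total privacy behaviors, white vs. non-white youth.
white = df.loc[df["white"] == 1, "total_privacy_behaviors"].dropna()
nonwhite = df.loc[df["white"] == 0, "total_privacy_behaviors"].dropna()
t_stat, p_val = stats.ttest_ind(white, nonwhite)   # reported as t = 2.39, p = 0.02

# One-way ANOVA across the four age groups (assumes implausible ages were removed).
groups = [g["total_privacy_behaviors"].dropna()
          for _, g in df[df["age"].between(11, 14)].groupby("age")]
f_stat, p_anova = stats.f_oneway(*groups)          # reported as F(3, 408) = 4.24, p < 0.01
```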

In terms of frequency, youth reported engaging in privacy-protecting behavior from corporations and criminals sometimes to most of the time (M=2.20, SD=0.55). Interestingly, youth more frequently engaged in behaviors that protect privacy in social relationships, reporting a mean of 3.03 (SD=0.50), which equates to most of the time to always, as shown in Table 3. Average frequency of privacy in social relationships scores varied with age (F(3, 408)=2.81, p=0.04), with younger adolescents reporting more frequent privacy-protecting behaviors in social relationships than older adolescents (11 years, M=3.17; 12 years, M=3.07; 13 years, M=3.00; 14 years, M=2.91).

In looking at specific types of behaviors, 56% of youth reported sharing their location with someone when using a smartphone. The most common location-sharing partners were parents (22%), friends (19%), and siblings (17%). Interestingly, 79% reported turning off location sharing on apps. While 88% of youth reported using strong passwords, almost all (99%) also reported sharing their password with others.

RQ2: Beliefs, Attitudes, and Preferences. To better understand adolescents’ attitudes and beliefs about privacy (RQ2), we asked youth to rate their agreement with several well-known prescriptive messages surrounding digital privacy (e.g., You can never be really sure who you are talking to online; digital ink is permanent). These messages centered on corporate surveillance, protection from predators, and online self-presentation for future-oriented thinking, as described in Table 3. Frequencies of agreement with each prescriptive message are presented in Figure 2.

Figure 2. Frequencies of agreement with prescriptive messaging around privacy (Attitudes)

Note: Values represent a response of agree or strongly agree

Agreement with these prescriptive privacy messages was rather low. For instance, only 30% of youth believed that “Apps are selling your information to advertisers.” This is especially interesting considering that 58% of youth agreed that “Companies use information about what you do online to try and sell you things.” The highest rate of agreement was with messages that focused on contact with strangers, which we labeled protection against potential predators (e.g., It is important to keep all your accounts private). Subscale scores for attitudes around corporate surveillance, future thinking, and protection from predators (see Table 3) suggest that youth, on average, express stronger agreement with messaging around protection against predators (M=3.82, SD=1.00) than messaging around corporate surveillance (M=3.08, SD=1.05) and future-oriented thinking (M=3.53, SD=1.00).

Attitudes about these messages differed across youth, with girls agreeing more with future-oriented items (M=3.72, SD=0.89) than boys (M=3.38, SD=1.05; t=-3.47, p<0.01). White youth reported more agreement with future oriented items (M=3.66, SD=0.94) than non-white peers (M=3.40, SD=1.03; t=-2.66, p<0.01). Privacy attitudes around protection against predators differed by gender as well, with girls agreeing more with statements promoting privacy protection against predators (M=3.96, SD=0.93) compared with boys (M=3.71, SD=1.04; t=-2.51, p=0.02).

Of the 295 students who completed preference items, privacy-protecting preferences were conceptually organized around exclusively peer social relationships (e.g., I like it when a friend tags me in a positive post), peer relationships with potential risks (e.g., I like it when people like my posts, even if I don’t know them), and corporate surveillance (e.g., I like seeing ads for things I like), as previously described in Table 3. On average, youth were less likely to agree with preferences that introduced risk into their social relationships (M=2.71, SD=0.93; e.g., It’s okay to allow people to follow me, even if I don’t know them), where connecting with others may allow for connecting with potential predators, than with situations facilitating peer social relationships (F(21, 273)=14.36, p<0.001; M=3.00, SD=0.97; e.g., Posting on social media is important for my friendships). Additionally, youth were less likely to agree with preferences for corporate surveillance statements (M=2.91, SD=0.91; e.g., I like it when apps suggest people I should follow or connect with) than with situations facilitating peer social relationships (F(21, 273)=8.07, p<0.001; M=3.00, SD=0.97). The frequency of youth digital privacy preferences is reported in Figure 3, suggesting that popular youth preferences span the categories of peer social relationships (e.g., I like it when a friend tags me in a positive post) and corporate surveillance (e.g., I like it when I can log in with Google or Facebook). Youth’s preferences around corporations and peer relationships did not significantly differ by gender or ethnicity. However, girls expressed less preference for privacy in peer relationships (M=2.58, SD=0.88) than boys (M=2.81, SD=0.96; t=2.12, p=0.04).

Figure 3. Frequency of agreement with specific preferences around digital privacy (Preferences)

Note: Values represent a response of agree or strongly agree

RQ3: Predicting Behaviors from Attitudes, Beliefs, and Preferences. Next, we considered how youth’s beliefs, attitudes, and preferences, alongside sociodemographic factors, predicted their privacy-protecting behaviors.

Total Privacy Behaviors. Privacy-protecting behaviors involve negotiation with social relationships, potential predators, and corporations. Using multivariate OLS analyses to predict total privacy-protecting behaviors based on beliefs/attitudes (Model 1) and preferences (Model 2), as shown in Table 4, we found that youth who more strongly endorsed beliefs about protecting against predators online (β=0.13, p<0.001) engaged in more privacy-protecting behaviors in general, with the model accounting for 13.0% of the variance in behavior (Model 1). Including youth’s preferences accounted for 27.4% of the variance in global privacy behaviors (Model 2). Attitudes around potential predators (β=0.15, p<0.001) (e.g., You can never be really sure who you are talking to), less preference for peer relationships with potential risks (β=-0.15, p<0.001) (e.g., I like it when people like my posts, even if I don’t know them; reverse scored), and more preference for corporate surveillance (β=0.07, p<0.05) (e.g., I like it when I can log in with Google or Facebook) were significant predictors of total privacy-protecting behaviors, holding other predictors constant.

Privacy in Social Relationships. Youth engaged in different types of privacy behaviors based on whether privacy was for social relationships or to protect against corporations and potential criminals. Using adolescent attitudes around privacy and student demographics as predictors (Model 3), we found that youth who more strongly endorsed beliefs about protecting against predators online (β=0.15, p<0.001) engaged in more privacy-protecting behaviors in social relationships, with the model accounting for 15.6% of the variance in behavior (Model 3). By incorporating preferences around privacy (Model 4), along with socio-demographic characteristics, the percentage of explained variance in privacy behaviors vis-a-vis social relationships (e.g., I let only people I have met in person follow me online) increased to 30%. Youth’s beliefs and attitudes related to potential predators online (β=0.11, p<0.001) and less preference for peer relationships with potential risk (β=-0.22, p<0.001) (Model 4) were both significant predictors of privacy behaviors vis-a-vis social relationships, as shown in Table 4.

Table 4. Regression predicting global and specific privacy behaviors

                                           Global Privacy        Privacy in Social      Privacy Against
                                           Behavior              Relationships          Corporations and Criminals
                                           (1)        (2)        (3)        (4)         (5)        (6)
Age                                        0.02       0.03       -0.08**    -0.08**     0.10***    0.11***
                                           (0.02)     (0.02)     (0.02)     (0.03)      (0.03)     (0.03)
Girl                                       0.09*      0.05       0.10*      0.05        0.08       0.04
                                           (0.04)     (0.04)     (0.05)     (0.05)      (0.05)     (0.06)
White                                      -0.03      -0.08      0.04       -0.02       -0.06      -0.12*
                                           (0.04)     (0.04)     (0.05)     (0.05)      (0.05)     (0.06)
Corporate Surveillance Attitudes           0.00       -0.00      0.00       0.00        0.00       -0.01
                                           (0.02)     (0.02)     (0.03)     (0.03)      (0.03)     (0.03)
Future-Oriented Thinking Attitudes         0.01       0.01       0.01       0.05        -0.00      -0.02
                                           (0.03)     (0.03)     (0.03)     (0.03)      (0.03)     (0.04)
Potential Predators Attitudes              0.13***    0.15***    0.15***    0.11***     0.13***    0.18***
                                           (0.02)     (0.03)     (0.03)     (0.03)      (0.03)     (0.04)
Corporate Surveillance Preferences                    0.07*                 0.04                   0.09*
                                                      (0.03)                (0.04)                 (0.04)
Peer Relationship Preferences                         0.02                  0.01                   0.03
                                                      (0.03)                (0.04)                 (0.04)
Peer Relationships with Risk Preferences              -0.15***              -0.22***               -0.09*
                                                      (0.03)                (0.04)                 (0.04)
Constant                                   1.70***    2.17***    3.31***    4.07***     0.44       0.75
                                           (0.28)     (0.33)     (0.33)     (0.40)      (0.38)     (0.45)
Observations                               401        293        401        293         401        293
R-squared                                  0.130      0.274      0.156      0.300       0.088      0.190

Note: * p<0.05, ** p<0.01, *** p<0.001; Standard error in parentheses. The outcome variables: global privacy behavior, privacy in social relationships, and privacy against corporations and criminals are based on numbers of different types of behaviors endorsed.

Privacy Against Corporations & Criminals. Youth’s privacy attitudes and student characteristics (Model 5) account for 8.8% of the variance in privacy against corporations and criminals, as shown in Table 4. Incorporating youth preferences (Model 6), alongside attitudes and student characteristics, accounted for 19.0 % of the variance in behaviors related to privacy against corporations and criminals, as shown in Table 4. Attitudes about protecting against potential predators were predictive of engaging in behaviors that protect against predators such as “Use strong passwords” and “Keep social media accounts private” (β=0.18, p<0.001), while higher preference for peer relationships with risk was associated with less privacy-protecting behavior against corporations and criminals (β=-0.09 p<0.05). Interestingly, more expressed preference for corporate surveillance (e.g. I like it when I can log in with Google or Facebook) was associated with more frequent privacy-protecting behavior against corporations and criminals (β=0.08, p<0.05).

Differences by Student Characteristics. To understand the degree to which early adolescents differentiate their privacy behaviors against corporations and criminals from their privacy in social relationships, and informed by what we know about development, demographic characteristics were used as predictors of privacy behaviors globally and across domains. Specifically, youths’ age, gender, and ethnicity were used as covariates. Results, shown in Table 4, indicate that age was a significant predictor of privacy behavior in social relationships and privacy behavior against corporations and criminals, though in different directions. Older students tended to engage in fewer privacy-protecting behaviors in social relationships (β=-0.08, p<0.01) compared to younger adolescents. In contrast, behavior that protects against corporations and criminals was more frequent (β=0.11, p<0.001) among older tweens. Gender was not a significant predictor of privacy behaviors when controlling for attitudes and preferences, while race was related to behaviors against corporations and criminals. Notably, white youth (β=-0.12, p<0.05) reported fewer privacy-protecting behaviors against corporations and criminals, such as turning location sharing off or keeping accounts private, compared with their non-white peers.

Discussion

Our survey asked early adolescents to report on their privacy behaviors, their endorsement of prescriptive privacy messages, and their privacy management preferences to understand how desires for social connectivity and awareness of corporate surveillance and datafication practices shape their privacy management strategies. Results revealed that early adolescents’ understandings and motivations related to digital privacy are complex and multifaceted, as their behaviors reflect tradeoffs between privacy-protective strategies against corporations or potential predators, and disclosures to maintain social connectedness. On average, adolescents reported more privacy protecting behaviors vis-a-vis social relationships than corporations, and endorsed prescriptive messages about privacy risks associated with predators more strongly than risks associated with corporate surveillance or future reputation.

They were more likely to engage in privacy protecting behaviors related to social relationships when they endorsed prescriptive predator messages and when they did not prefer risky peer relations over security; however, they were more likely to engage in privacy protecting behaviors vis-a-vis corporations and criminals when they endorsed predator messages and preferred the convenience of corporate surveillance over security. Age was an important factor in our findings; 11- and 12-year-olds in our sample were less likely to engage in privacy protection behaviors related to corporations and criminals but more likely to engage in privacy protection behaviors related to social relationships compared to 13- and 14-year-olds. This finding aligns with other research which has observed that when young adolescents initiate social media use, they have smaller online social networks that become more expansive as they age (Antheunis et al., 2016; Valkenburg et al., 2006).

The observed differences between white and non-white young adolescents’ privacy beliefs, attitudes, and practices may speak to manifestations of white privilege related to surveillance. Some research suggests that corporate surveillance specifically is used in racialized ways, creating a digital manifestation of traditional surveillance tactics to perpetuate racism and anti-blackness in the United States (Browne, 2015). Thus, it is perhaps unsurprising that white youth were less concerned with information sharing with corporations and reported fewer privacy-protecting behaviors related to corporate data collection in comparison to non-white youth.

Our results highlight the importance of distinguishing between distal and proximal forms of privacy, as well as between different forms of proximal privacy. Research finds that the effort to present oneself in a favorable light on social media is greater among older adolescents and young adults (Dhir et al., 2016; Yau & Reich, 2019). Thus, it is not surprising that the more proximal types of privacy related to social relationships loosen as adolescents get older, while protective privacy behaviors regarding corporations and criminals may become more stringent. Early adolescents in our study also distinguished between different kinds of proximal privacy in peer relationships: they were less likely to endorse privacy preferences that introduced risk into their social relationships, yet they endorsed looser privacy preferences that facilitate peer social connectedness and relationships. These findings highlight how early adolescents’ privacy preferences for different kinds of peer relationships cannot be easily grouped together, as only preferences for risky peer relationships predicted early adolescents’ privacy behaviors. Communication Privacy Management theory (CPM) (Palen & Dourish, 2003) can help us to understand how early adolescents learn that privacy is continuously negotiated and centered in peer interactions, where tensions and ‘boundary turbulence’ arise as youth learn to navigate the complex boundaries of privacy, risk, and social connectedness/disclosure (Dourish & Anderson, 2006; Palen & Dourish, 2003) with regard to risky and safer/known peer relationships. Vertesi et al. (2016), building on CPM theory, highlight the complicated intersection of boundaries to be navigated in the current digital age, where youth are just learning to balance their interpersonal relationships with individuals in their extended personal, professional, and consumer networks, while on the other hand “weighing a conflicting moral imperative to safeguard and protect their personal data and information disclosures” in networked publics (p. 487). Our results directly speak to the tensions early adolescents experience, as they must weigh looser privacy preferences to facilitate greater peer connectivity while not introducing too much risk to their social relationships.

Although early adolescents reported engaging in a wide variety of privacy-protecting behaviors, both with regard to corporations and potential predators and in social relationships, there were contradictions in their self-reported behaviors, beliefs, attitudes, and preferences. For example, adolescents who preferred corporate surveillance (i.e., agreed with statements such as, “I like seeing ads for things I like online”) reported more privacy-protecting behaviors vis-a-vis corporations. Perhaps this is another kind of privacy paradox reflecting how adolescents who are more involved in technology and know how to clear browser histories and block push notifications both appreciate the conveniences of, and are concerned about, corporate surveillance. Similar to the paradox described in the introduction, adolescents may be motivated by convenience, so much so that they are willing to give certain information to corporations for that benefit. In addition, early adolescents’ beliefs and attitudes related to prescriptive messages about privacy protection from corporations were also contradictory. They largely agreed that corporations use your information to sell you things but tended not to agree that apps sell personal information to advertisers. These contradictions in beliefs about prescriptive privacy messages could be indicative of early adolescents’ limited understanding of the full scope of corporate surveillance tactics in their use of social media (Smith & Shade, 2018). Perhaps online spaces are viewed differently from apps that are installed on smartphones. As early adolescents grow up in networked publics, they must learn the details of corporate surveillance to inform their strategies for optimizing tradeoffs for social connectedness and identity that work at a network level, where individual one-to-one strategies for interpersonal privacy management become insufficient.

Marwick and boyd (2014) conceptualize a networked model of privacy, where achieving privacy for youth means developing “an understanding of and influence in shaping the context in which information is interpreted” (p. 1063), emphasizing the importance of controlling information flows on social media: individuals can no longer entirely maintain one-to-one control over personal information. Instead, these choices and practices are networked, determined through a “combination of audience, technical mechanisms, and social norms,” making privacy negotiation a continuous and ongoing process (p. 1062). Early adolescents are at a developmental moment when they are just beginning to learn to balance networked conceptualizations of privacy with interpersonal relationships that are situated in their daily lives. Importantly, their preference for some aspects of corporate surveillance, and protective strategies against other datafication practices, may represent the turbulence or tensions that youth feel in weighing information disclosure and willing participation in corporate datafication practices against identity needs for autonomy, exploration, and social connectedness.

Social scientists have long theorized that the concept of privacy cuts across cultures, yet the ways in which privacy manifests itself culturally, how people actually practice privacy, are deeply contextual (Altman, 1977). Current U.S. legal models put in place to protect young children and early adolescents (e.g., COPPA) often conceptualize privacy in simplistic and individualistic models of human behavior (Cohen, 2012; Marwick & boyd, 2014), focusing on the age of 13 as an arbitrary cutoff for protections, whereas privacy management in the current digital age occurs across vast networked publics and is highly culturally specific. Privacy management is both “contextual and relationally-accountable” in nature (Vertesi et al., 2016), and privacy is negotiated in social relationships which are embedded in culture, neither a fully interpersonal nor fully networked process, but a combination thereof. Therefore, understanding early adolescents’ needs for social connectedness and identity exploration can help to explain the tradeoffs adolescents make, disclosing personal information and becoming vulnerable in order to grow friend networks, gain connections, and express themselves. Rethinking our implementation of age-gated protection measures such as COPPA for early adolescents will require a culturally situated approach that understands youth’s social and identity needs in the context of social media, instead of paternalistic protection measures that do not take youth’s developmental needs into account past the age of 13.

It is also important to note that understanding potential risks in digital spaces can be cognitively challenging for youth (Stoilova et al., 2019). Conceptualizing risk from other people, such as a hacker who might break into an account or steal private information, or a sinister man pretending to be an 8th-grade girl to gain one’s trust, is easier than understanding that sites and applications are extracting data about social connections, geographic locations, and online activities. As such, youth’s reasoning about risk, and the behaviors they enact to mitigate those risks, are likely to differ when protecting against predators and criminals versus corporations.

Limitations

Because little work on digital privacy has focused on early adolescence as a developmental context, this research represents a first step in exploring adolescents’ perspectives. Scale items were created to measure important distinctions between different types of privacy; however, some of these items did not hang together well as a single construct. For example, adolescents responded very differently to the two items meant to capture their privacy-protecting preferences for peer social relationships. Our data accounted for more variability in adolescents’ privacy-protecting behaviors in the context of social relationships than in the contexts of corporations and criminals, so future work should focus on better understanding early adolescents’ perspectives and behaviors related to distal forms of privacy. In addition, our survey study was cross-sectional and therefore cannot identify causal relationships among attitudes, preferences, and behaviors. Future studies should use longitudinal designs to see how these factors relate over time; because longitudinal data alone are not sufficient for establishing causality, experimental methods will also be necessary for studying early adolescents’ privacy attitudes, preferences, and behaviors. Further, we surveyed and compared three grades (6th, 7th, and 8th) at a single point in time; future work should follow students over time to explore how privacy practices change as they mature from 6th to 8th grade. This is especially important as youth become more proficient social media users (Lenhart et al., 2011; Madden et al., 2013), their social networks expand in size, their needs for identity formation and intimacy increase, and federal protections like COPPA fall away. Another important limitation is that we asked youth what they do to protect their privacy but did not observe their actual behavior.

Our sample is composed of 6th through 8th graders at two large, east-coast United States middle schools where more than 50% of students identified as white and/or male. These demographics limit our ability to generalize the findings more broadly to other U.S. samples or internationally. Though the findings shed some light on white students’ preferences, they do not offer insights into the heterogeneity of practices among non-white youth. Due to time constraints, some students were not able to complete the privacy preference questions located at the end of the survey; those who did likely represent students who were more conscientious, moved through the survey more quickly, or had a teacher who allowed more time for survey completion during class. Notwithstanding these limitations, our study contributes to a limited body of work documenting early adolescents’ digital privacy beliefs, preferences, and behaviors.

Conclusion

Children today are growing up in a highly connected world in which behaviors, in domains ranging from social relationships to consumerism, are traceable. This constant surveillance by peers, corporations, and potential maleficents makes it necessary for youth to consider their digital privacy at both a broader distal level and an interpersonal level. To date, the vast majority of research has not included youths’ voices and perspectives on privacy or considered how privacy attitudes and preferences relate to privacy-protecting practices. This study represents one step toward a better understanding of how the current generation of early adolescents conceptualizes digital privacy and how their privacy attitudes, beliefs, and endorsement of prescriptive privacy messages predict their privacy behaviors. Grounded in early adolescents’ perspectives, our work casts light on the perfect storm adolescents face in the current digital age as they weigh tradeoffs among social connectedness, autonomy, identity exploration, and risky disclosure decisions.

Authors

Nicholas D. Santer is a doctoral student in developmental psychology at the University of California, Santa Cruz. His current research is focused on adolescents’ and emerging adults’ lived experiences with communication technologies and social media, focusing on the ways in which youth construct stories of the self across diverse polymedia landscapes.

Adriana M. Manago is an assistant professor of psychology at the University of California, Santa Cruz. Research in her Culture and Technology lab examines communication technologies, cultural change, and social and identity development from adolescence through the transition to adulthood in diverse cultural communities, including a Maya community in southern Mexico.

Allison Starks is a doctoral student in the School of Education at the University of California, Irvine. Her research examines the affordances and limitations of technology for positive child outcomes across contexts, with a special emphasis on school-level technology integration.

Stephanie M. Reich is a Professor in the School of Education with additional appointments in Informatics and Psychological Science, Director of the Development in Social Context (DISC) Lab, and core faculty in the Connected Learning Lab at the University of California, Irvine. Her research focuses on understanding and improving the social context of children’s lives through exploring and intervening with the direct and indirect influences on the child, specifically through family, digital, and school environments.

References

Agosto, D. E., & Abbas, J. (2017). “Don’t be dumb—that’s the rule I try to live by”: A closer look at older teens’ online privacy and safety attitudes. New Media & Society, 19(3), 347-365.

Albert, D., & Steinberg, L. (2011). Judgment and decision making in adolescence. Journal of Research on Adolescence, 21(1), 211-224.

Albert, D., Chein, J., & Steinberg, L. (2013). The teenage brain. Current Directions in Psychological Science 22(2), 114-120.

Anderson, M., & Jiang, J. (2018). Teens, social media & technology. Pew Internet and American Life Project. https://www.pewresearch.org/internet/2018/05/31/teens-social-media-technology-2018/

Antheunis, M. L., Schouten, A. P., & Krahmer, E. (2016). The role of social networking sites in early adolescents’ social lives. Journal of Early Adolescence, 36(3), 348-371.

Baumann, M. (2010). Pew report: Digital natives get personal. Information Today, 27(10), 18.

Blank, G., Bolsover, G., & Dubois, E. (2014). A new privacy paradox. In Proceedings of the Annual Meeting of the American Sociological Association 2014 (pp. 1-34).

boyd, d. (2014). It's complicated: The social lives of networked teens. Yale University Press.

boyd, d., & Hargittai, E. (2010). Facebook privacy settings: Who cares? First Monday, 15(8).

Browne, S. (2015). Dark matters: On the surveillance of blackness. Duke University Press.

Caverlee, J., & Webb, S. (2008). A large-scale study of MySpace: observations and implications for online social networks. In Proceedings of the Second International Conference on Weblogs and Social Media (ICWSM).

Chein, J., Albert, D., O’Brien, L., Uckert, K., & Steinberg, L. (2011). Peers increase adolescent risk taking by enhancing activity in the brain’s reward circuitry. Developmental Science, 14(2), F1-F10.

Cohen, J. E. (2012). Configuring the networked self: Law, code, and the play of everyday practice. Yale University Press.

Cohen, A. O., Breiner, K., Steinberg, L., Bonnie, R. J., Scott, E. S., Taylor-Thompson, K., ... & Silverman, M. R. (2016). When is an adolescent an adult? Assessing cognitive control in emotional and nonemotional contexts. Psychological Science, 27(4), 549-562.

Costello, C. R., McNiel, D. E., & Binder, R. L. (2016). Adolescents and social media: Privacy, brain development, and the law. The Journal of the American Academy of Psychiatry and the Law, 44(3), 313-321.

Dhir, A., Pallesen, S., Torsheim, T., & Andreassen, C. S. (2016). Do age and gender differences exist in selfie-related behaviours? Computers in Human Behavior, 63, 549-555.

Dourish, P., & Anderson, K. (2006). Collective information practice: Exploring privacy and security as social and cultural phenomena. Human-Computer Interaction, 21(3), 319-342.

Federal Trade Commission. (1998). Children's Online Privacy Protection Rule (COPPA). https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule.

Gasser, U., & Palfrey, J. (2008). Born digital: Understanding the first generation of digital natives. Perseus Books Group.

Galvan, A., Hare, T. A., Parra, C. E., Penn, J., Voss, H., Glover, G., & Casey, B. J. (2006). Earlier development of the accumbens relative to orbitofrontal cortex might underlie risk-taking behavior in adolescents. Journal of Neuroscience, 26(25), 6885-6892.

Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605-622.

Harwell, D. (2020, April 1). Mass school closures in the wake of the coronavirus are driving a new wave of student surveillance. Washington Post. https://www.washingtonpost.com/technology/2020/04/01/online-proctoring-college-exams-coronavirus/.

Holloway, D., & Green, L. (2016). The Internet of toys. Communication Research and Practice, 2(4), 506-519.

Juvonen, J. (2007). Reforming middle schools: Focus on continuity, social connectedness, and engagement. Educational Psychologist, 42(4), 197-208.

Juvonen, J., & Murdock, T. B. (1995). Grade‐level differences in the social value of effort: Implications for self‐presentation tactics of early adolescents. Child Development, 66(6), 1694-1705.

Knifsend, C. A., & Juvonen, J. (2013). The role of social identity complexity in inter‐group attitudes among young adolescents. Social Development, 22(3), 623-640.

Lapenta, G. H., & Jørgensen, R. F. (2015). Youth, privacy and online media: Framing the right to privacy in public policy-making. First Monday, 20(3).

Lenhart, A., & Madden, M. (2007). Teens, privacy & online social networks: How teens manage their online identities and personal information in the age of MySpace. Pew Internet & American Life Project.

Lenhart, A., Madden, M., Smith, A., Purcell, K., Zickuhr, K., & Rainie, L. (2011). Teens, kindness and cruelty on social network sites: How American teens navigate the new world of "digital citizenship". Pew Internet & American Life Project.

Livingstone, S. (2005). Mediating the public/private boundary at home: children's use of the Internet for privacy and participation. Journal of Media Practice, 6(1).

Livingstone, S. (2006). Children's privacy online: Experimenting with boundaries within and beyond the family. In R. Kraut, M. Brynin, & S. Kiesler (Eds.), Computers, phones, and the internet: Domesticating information technology (Human technology interaction series, pp. 145-167). Oxford University Press.

Livingstone, S. (2008). Taking risky opportunities in youthful content creation: teenagers' use of social networking sites for intimacy, privacy and self-expression. New Media & Society, 10(3), 393-411.

Livingstone, S. (2014). Developing social media literacy: How children learn to interpret risky opportunities on social network sites. Communications, 39(3), 283-303.

Lupton, D., & Williamson, B. (2017). The datafied child: The dataveillance of children and implications for their rights. New Media & Society, 19(5), 780-794.

Madden, M. (2017). Privacy, security, and digital inequality. Data & Society.

Madden, M., Lenhart, A., Cortesi, S., Gasser, U., Duggan, M., Smith, A., & Beaton, M. (2013). Teens, social media, and privacy. Pew Research Center, 21(1055), 2-86.

Marwick, A. E. (2008). To catch a predator? The MySpace moral panic. First Monday, 13(6).

Marwick, A. E., & boyd, d. (2014). Networked privacy: How teenagers negotiate context in social media. New Media & Society, 16(7), 1051-1067.

Marx, G., & Steeves, V. (2010). From the beginning: Children as subjects and agents of surveillance. Surveillance & Society, 7(3/4), 192-230.

Miltgen, C. L., & Peyrat-Guillard, D. (2014). Cultural and generational influences on privacy concerns: a qualitative study in seven European countries. European Journal of Information Systems, 23(2), 103-125.

Montgomery, K. C., Chester, J., & Milosevic, T. (2017). Children’s privacy in the big data era: Research opportunities. Pediatrics, 140(Supplement 2), S117-S121.

Moscardelli, D. M., & Liston-Heyes, C. (2004). Teens surfing the net: How do they learn to protect their privacy? Journal of Business & Economics Research (JBER), 2(9).

Palen, L., & Dourish, P. (2003). Unpacking "privacy" for a networked world. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 129-136).

Peter, J., & Valkenburg, P. M. (2011). Adolescents’ online privacy: Toward a developmental perspective. In Privacy online (pp. 221-234). Springer, Berlin, Heidelberg.

Regan, P. M., & Steeves, V. (2019). Education, privacy, and big data algorithms: Taking the persons out of personalized learning. First Monday, 24(11).

Rideout, V. J., & Robb, M. B. (2019). The common sense census: Media use by tweens and teens. Common Sense Media.

Robinson, S. C. (2016). iDisclose: Applications of Privacy Management Theory to Children, Adolescents and Emerging Adults. In Youth 2.0: Social Media and Adolescence (pp. 139-157). Springer, Cham.

Rooney, T. (2012). Childhood spaces in a changing world: Exploring the intersection between children and new surveillance technologies. Global Studies of Childhood, 2(4), 331-342.

Rudolph, M. D., Miranda-Domínguez, O., Cohen, A. O., Breiner, K., Steinberg, L., Bonnie, R. J., ... & Richeson, J. A. (2017). At risk of being risky: the relationship between “brain age” under emotional states and risk preference. Developmental Cognitive Neuroscience, 24, 93-106.

Shelley, M., Thrane, L., Shulman, S., Lang, E., Beisser, S., Larson, T., & Mutiti, J. (2004). Digital citizenship: Parameters of the digital divide. Social Science Computer Review, 22(2), 256-269.

Shin, W., & Kang, H. (2016). Adolescents' privacy concerns and information disclosure online: The role of parents and the Internet. Computers in Human Behavior, 54, 114-123.

Shin, W., Huh, J., & Faber, R. J. (2012). Tweens' online privacy risks and the role of parental mediation. Journal of Broadcasting & Electronic Media, 56(4), 632-649.

Smith, A. R., Chein, J., & Steinberg, L. (2014). Peers increase adolescent risk taking even when the probabilities of negative outcomes are known. Developmental Psychology, 50(5), 1564.

Smith, K. L., & Shade, L. R. (2018). Children’s digital playgrounds as data assemblages: Problematics of privacy, personalization, and promotional culture. Big Data & Society, 5(2), 2053951718805214.

Somerville, L. H., Hare, T., & Casey, B. J. (2011). Frontostriatal maturation predicts cognitive control failure to appetitive cues in adolescents. Journal of Cognitive Neuroscience, 23(9), 2123-2134.

Steeves, V., & Webster, C. (2008). Closing the barn door: The effect of parental supervision on Canadian children's online privacy. Bulletin of Science, Technology & Society, 28(1), 4-19.

Steeves, V., & Regan, P. (2014). Young people online and the social value of privacy. Journal of Information, Communication and Ethics in Society.

Steinberg, L., Cauffman, E., Woolard, J., Graham, S., & Banich, M. (2009). Are adolescents less mature than adults? Minors' access to abortion, the juvenile death penalty, and the alleged APA "flip-flop." American Psychologist, 64(7), 583.

Stoilova, M., Nandagiri, R., & Livingstone, S. (2019). Children’s understanding of personal data and privacy online–a systematic evidence mapping. Information, Communication & Society, 1-19.

Taylor, E., & Rooney, T. (2016). Surveillance futures: Social and ethical implications of new technologies for children and young people. Routledge.

Teräs, M., Suoranta, J., Teräs, H., & Curcher, M. (2020). Post-Covid-19 education and education technology ‘solutionism’: A seller’s market. Postdigital Science and Education, 2(3), 863-878.

Tifferet, S. (2019). Gender differences in privacy tendencies on social network sites: a meta-analysis. Computers in Human Behavior, 93, 1-12.

Tufekci, Z. (2008). Can you see me now? Audience and disclosure regulation in online social network sites. Bulletin of Science, Technology & Society, 28(1), 20-36.

Utz, S., & Krämer, N. C. (2009). The privacy paradox on social network sites revisited: The role of individual characteristics and group norms. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 3(2).

Valkenburg, P. M., Peter, J., & Schouten, A. P. (2006). Friend networking sites and their relationship to adolescents' well-being and social self-esteem. CyberPsychology & Behavior, 9(5), 584-590.

Vertesi, J., Kaye, J., Jarosewski, S. N., Khovanskaya, V. D., & Song, J. (2016). Data narratives: Uncovering tensions in personal data management. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 478-490). doi:10.1145/2818048.2820017

Wisniewski, P. (2018). The privacy paradox of adolescent online safety: A matter of risk prevention or risk resilience?. IEEE Security & Privacy, 16(2), 86-90.

Yau, J. C., & Reich, S. M. (2019). “It's Just a Lot of Work”: Adolescents’ self‐presentation norms and practices on Facebook and Instagram. Journal of Research on Adolescence, 29(1), 196-209.

Youn, S. (2009). Determinants of online privacy concern and its influence on privacy protection behaviors among young adolescents. Journal of Consumer Affairs, 43(3), 389-418.

Youn, S., & Hall, K. (2008). Gender and online privacy among teens: Risk perception, privacy concerns, and protection behaviors. Cyberpsychology & Behavior, 11(6), 763-765.

Zuboff, S. (2019). The age of surveillance capitalism. Profile Books.
