When considering the rights of children and the obligations corporations and regulators have to them in the algorithmic environment, the role and needs of parents are often neglected. Media regulations are traditionally framed as advocating for children, yet this framing fails to consider the needs of parents playing the role of familial gatekeeper. Using my perspective as a media scholar, a former television S&P executive, and a parent, I highlight some of the issues and implications parents face in carrying the responsibility for children’s digital media. I approach the argument from the angle of digital parental controls, then move on to discuss the positioning of good parents and the ideology behind their responsibilities. I offer a vantage point from the parent’s perspective to evaluate the design and affordances of parental controls. Finally, I call for parents’ and caregivers’ voices to be better amplified in the industry and regulatory arena.
When it comes to protecting children, we need to also consider parents.
An ideology of guilt-driven responsibility and protectionism has saturated the expectations of parents within the digital environment. Tech and media providers contribute to this ideology by offering parental controls, even when those digital tools offer little in terms of actual control.
Described as “tools” for good parents, parental controls have shifted the burden of “protecting children” first from the government to the digital media industry and finally to the parent at home navigating kids’ content.
Parents need advocates in a broader regulatory arena to voice their concerns, as they are often so busy managing their own kids' media that they do not have time to push for a major course correction in digital rights for stakeholders.
Parental guilt. As a former broadcast television and children's content standards and practices (S&P) censor, and as a parent of two young kids, it took entering academia for me to upgrade this term to what I now call responsibilization. When it comes to digital media, according to the social rules governing public scripts, I must not only provide my child with access and the skills to manage content but am also held responsible for protecting them from it.
When I turn on a Disney+ program for my six-year-old to buy myself an hour’s time to do work, I take note of the “outdated cultural depictions” and “contains tobacco use” disclaimers appearing on the app. As a former television censor, I think about the negotiations and decision-making that had to happen in order to label this content and provide this type of warning over classic content carrying antiquated and racist portrayals. As a parent in a pandemic, hoping to enable my child to pick their own programming, unable to hover in the room to better explain these depictions, I wish I could take these choices off the menu altogether. Finally, as a media researcher, I recognize that the norms of the platform and the affordances of the technology allow me, the parent, very little customizable utility.
When considering the rights of children and the obligations corporations and regulators have to them in the algorithmic environment, I entreat you not to forget about the parents. Media regulations such as COPPA and the Children’s Television Act are traditionally framed as advocating for children, with the expressed intent to protect society’s most “vulnerable subjects” (Sefton-Green, 2006, p. 282). Yet this framing fails to consider the needs of parents playing the role of familial gatekeeper, where good parents are positioned as the first line of defense protecting children from modern media.
Parents face limitations and hesitations in what they can control in the digital realm and in how they might fashion digital platforms, apps, and parental controls to best work within their household. Technology carries constraints as well, and when it comes to streaming and social media, there is no one-size-fits-all way to customize features for each individual. But in a self-governed, data-rich environment, platforms and industry providers possess the power not only to better protect children but to ease the burden on parents. Parents need advocates in a broader regulatory arena to voice their concerns, as they are often so busy managing their own kids' media that they do not have time to push for a major course correction in digital rights for stakeholders.
In this piece, I utilize my perspective as a media scholar, a former television S&P executive, and a parent to point to some of the issues and implications parents face in carrying the responsibility for children’s digital media. I specifically approach my argument from the angle of digital parental controls in the United States, which have evolved from network television program practices and self-regulation. I first discuss the positioning of good parents and the ideology behind their responsibilities. Then, I offer a vantage point from the parent’s perspective, evaluating the design and affordances of parental controls. Finally, I call for parents’ and caregivers’ voices to be better amplified in the industry and regulatory arena.
Long before the 2020 pandemic, parents navigated their media responsibilities as part of the pull-yourself-up-by-your-bootstraps American ideology. The success of our children has been governed through a political rationale that interpellates individual families as in charge of their own destiny (Willett, 2015; Cowan, 1983; Pugh, 2009). These practices, however, are not new. Meredith Bak (2020) discusses parents’ preoccupation with using new media toys (then in the form of a zoetrope!) to educate youth in the nineteenth century. In Daniel Thomas Cook’s account, the project of designing a malleable child through consumer culture in America’s 1800s centered on “maternal responsibility,” what he calls “the moral project” (2020, p. 4). Part of the contemporary role of the caregiver is to provide technological opportunities that will ensure a child’s success—from preschool edutainment and private daycare, to coding classes and SAT-test prep, and everything in between (Ito, 2012; Hoover et al., 2012; Livingstone & Blum-Ross, 2020). Mothers and fathers do not balk at this responsibility because it feels common-sense, inherent to the job of being a good parent. But parents cannot simply provide the technology; they must also protect their children from it. Parents furnish and then must regulate the child’s digital realm. The responsibilization of parenting is a term that describes not only the guilt but also the flood of advice and assumed duties that accompany our surge in digital technology. Responsibilization is an ideology of parenting protectionism that has saturated the tech and media industry, government regulation, and public discourse. In the world of digital parenting that we’ve come to accept as normal, the parent is the gatekeeper, the censor, and the protector of children amidst an onslaught of capitalism and self-regulation.
Within the media ecosystem, good parents have been constructed as part of the dominant discourse heard throughout society, pointing out what a parent should be doing when it comes to monitoring children’s digital environments. Alicia Blum-Ross and Sonia Livingstone, whose extensive research delved into digital parenting practices in the UK, documented “confessions” of “laziness” and “sentiments of guilt” in the parents they sat down with, writing: “time and again we heard parents of young children struggle to balance the convenience of screen time with their worries about being a ‘good’ parent” (2018, p. 183). Beyond television, caregivers have also been held accountable in media discourse as being in charge of children’s online activities, responsible for promoting educational and learning opportunities, which often favors middle- and upper-middle-class families and their media habits (Clark, 2013; Lareau, 2011). Media research has shown that the gendered and classed hierarchies associated with good parenting position parents at fault if they aren’t monitoring the shows their children are watching and the games they are playing, if they fail to set up parental controls, or if they overindulge their children in the consumer media marketplace (Steiner & Bronstein, 2016; Clark, 2013; Willett, 2015; Seiter, 1995).
Regulators, media providers, even organizations set up to help protect and educate parents lean on the ideal of the good parent. The international non-profit Family Online Safety Institute (FOSI), boasting members from across the government and tech sectors from Amazon to Verizon, created a free downloadable book for parents seeking guidance on technology in the home entitled “How to be a Good Digital Parent” (2020). Our media and parenting culture has grown accustomed to the idea that a good parent is one who is righteously vigilant in watching over children’s media consumption and digital experiences. The term “parental control,” and the utility offered by many apps and platforms, may give families a sense of empowerment through a suite of technology affordances. But in reality, these tools alleviate pressure on the digital provider, shifting industry self-regulation to the home, aimed squarely at those parents aspiring to be good.
Rarely would a parent or caregiver describe the digital realms our families operate in, particularly the parental controls offered by technology companies, as an oppressive constraint placed on a family. If anything, our culture tells us it is a family’s path to freedom, a choose-your-own-adventure. We have become so accustomed to the discussion and industry-created buzz surrounding the affordances of the digital (e.g., parents have the “tools they need to make wise decisions about what is right for themselves and their families” (Netflix, 2018)) that it practically seems absurd to consider families oppressed. Yet whether parental controls are used or ignored, we must recognize that the “tools” offered represent the transfer of the regulatory burden from government to industry, and finally to the parent at home navigating kids’ content.
When evaluating the tools offered to parents, we need to consider what is missing. Digital streamers are quick to point out the offerings and personalization their platforms and upgrades provide parents. Netflix claims its algorithmic technology helps its members be “better informed, and more in control, of what they and their families choose to watch and enjoy on Netflix” (Hastings, 2018). However, beyond a PIN code and a baseline maturity setting for my kids’ viewing profiles, when navigating various digital parental controls I observe how little control I can actually command as a parent.
Given what we know about Netflix’s use of algorithmic personalization based on our metadata (Tryon, 2015; Seaver, 2019) and its practice of tagging kids’ content internally (Grothaus, 2018), the control I can exercise as a parent is just that: limited. I have no power to instruct a streaming platform to remove outdated cultural depictions, or stories about fire, ghosts, or use of the term “shut up.” As a good parent, I’ve bought into the belief that I need to do my due diligence to protect my children from various depictions and references. As a subscriber, I notice that while streamers have told me how powerful their algorithms and data might be, I have very little power to filter content in the streaming environment. As a former censor studying media culture, I wonder how parents might handle these responsibilities while operating in the shadows of opaque offerings. If digital content providers are relegating self-regulation to parents, shouldn’t parents be offered more tools to do so?
Presently, there are too few safeguards or regulations surrounding platform governance to protect parents or ease their burden. The present reality of digital parenting is one where schoolteachers assign videos via Seesaw platforms on iPads for kindergartners. The old adage to “just turn it off” won’t cut it. Traditional remedies such as v-chip ratings also will not suffice. We live in a radically different media environment than we did in the early days (the ’90s!) of the v-chip, when regulators impelled traditional television broadcasters to create standards and blocking functionality across linear TV programming. The digital environment has enveloped the child-rearing experience. Its global but opaque nature has clouded more traditional pathways of protecting the end user through industry-wide regulation. Platforms and providers, however, need not wait for top-down regulation to better serve parents; they just need to pay better attention to how their data and affordances can best help families. And families need better advocates.
When I recently suggested at an international conference on social media governance that parents would have to "take to the streets" to push back and demand better offerings and services in the form of industry self-regulation and government guidelines, I was met with sympathetic chortles and snorts. It is laughable to imagine mothers taking to picket lines for this matter of contention amidst the many issues we are all facing as American and global citizens in this moment. But that is my point. It is laughable, not because it is unimaginable, but because the guilt that technology and parental controls have created can barely be a priority when families as a whole are not prioritized within the intensified, contradictory market-logic that is parenting within the digital age in America.
When thinking of the children, we also need to consider the parents and encourage research into civic-minded justice for families through the lens of domestic media practices. If parents are being guilted into manning the controls surrounding kids’ content, I argue that we must advocate for better controls. We should demand that our lawmakers and the tech industry marshal and cultivate data toward personalized tools for parents. To protect children, we must start by protecting parents. Advocacy for parents should reflect their wide and varied needs and become a focus across the government, technology, and media sectors, working to promote increased transparency and accountability. To advocate for parents means we must recognize the limitations of technology and parental controls and work to lessen the guilt and burden of responsibility weighing on parents in the digital domestic arena.
Maureen Mauk is a Doctoral Candidate in Media and Cultural Studies in the University of Wisconsin’s communication department, studying the intertwined relationship between parents, policy, and industry as it relates to television history and the current platformized media landscape. She carries a decade of experience serving in Los Angeles as a Television Standards & Practices executive and has been published in several journals, including the Journal of Cinema and Media Studies and Learning, Media & Technology.
No conflicts of interest to disclose.
Correspondence concerning this article should be addressed to Maureen Mauk 821 University Ave, Vilas Hall, Madison WI 53706. Email: firstname.lastname@example.org
Bak, M. (2020). Playful visions. MIT Press.
Blum-Ross, A., & Livingstone, S. (2018). The trouble with “screen time” rules. In G. Mascheroni, C. Ponte, & A. Jorge (Eds.), Digital parenting: The challenges for families in the digital age (pp. 179–188). International Clearinghouse on Children, Youth and Media.
Clark, L. S. (2013). The parent app: Understanding families in the digital age. Oxford University Press.
Cook, D. T. (2020). The moral project of childhood. New York University Press.
Cowan, R. S. (1983). More work for mother: The ironies of household technology from the open hearth to the microwave. Basic Books.
FOSI (2020). Good digital parenting. https://www.fosi.org/good-digital-parenting
Grothaus, M. (2018, March 28). How I got my dream job of getting paid to watch Netflix. Fast Company. https://www.fastcompany.com/40547557/how-i-got-my-dream-job-of-getting-paid-to-watch-netflix
Hastings, R. (2018, March 5). Introducing PIN protection and other enhancements for informed viewing. Netflix Media Center. https://media.netflix.com/en/companyblog/introducing-pin-protection-and-other-enhancements-for-informed-viewing
Hoover, S. M., Clark, L. S., & Alters, D. M. (2012). Media, home and family. Routledge.
Ito, M. (2012). Engineering play: A cultural history of children’s software. MIT Press.
Lareau, A. (2011). Unequal childhoods: Class, race, and family life (2nd ed.). University of California Press.
Livingstone, S. & Blum-Ross, A. (2020). Parenting for a digital future. Oxford University Press.
Netflix (2020). How does Netflix decide maturity ratings? https://help.netflix.com/en/node/2064
Pugh, A. (2009). Longing and belonging: Parents, children, and consumer culture. University of California Press.
Seaver, N. (2019). Captivating algorithms: Recommender systems as traps. Journal of Material Culture, 24, 421–436. https://doi.org/10.1177/1359183518820366
Sefton-Green, J. (2006). Youth, technology, and media cultures. Review of Research in Education, 30, 279–306.
Seiter, E. (1995). Sold separately: Children and parents in consumer culture. Rutgers University Press.
Steiner, L., & Bronstein, C. (2016). When tiger mothers transgress: Amy Chua, Dara-Lynn Weiss and the cultural imperative of intensive mothering. In H. L. Hundley & S. E. Hayden (Eds.), Mediated moms: Contemporary challenges to the motherhood myth (pp. 247–273).
Tryon, C. (2015). TV got better: Netflix’s original programming strategies and binge viewing. Media Industries, 2, 104–116. https://doi.org/10.3998/mij.15031809.0002.206
Willett, R. J. (2015). The discursive construction of ‘good parenting’ in the case of children’s virtual world games. Media, Culture & Society, 37(7), 1060–1075. https://doi.org/10.1177/0163443715591666