
Co-constructing Digital Futures: Parents and Children becoming Thoughtful, Connected, and Critical Users of Digital Technologies

Published on Jun 29, 2021

Abstract

As researchers and parents, we understand the need to build digital literacy and engagement with the digital world, but we also recognize that this engagement is counterbalanced by giving up privacy and leaving a data trail. By early adolescence, our children are internalizing acceptable internet use, and parents and teachers need to be part of the conversations that shape their understanding of these concepts. This chapter presents key findings from four case studies that examined how parents and children might understand, navigate, and become more reflective about the trends, forces, and tensions around privacy, security, and algorithms in their lives and in the activities in which they engage on screens.

Key Findings

  • Finding an approach point--a time, condition, or opportunity for a teachable moment, often through conversation--is an essential first step in fostering children’s understanding and reflection about the role screens and technologies play in their lives.

  • Providing media mentorship, or a guide that can help youth navigate the digital world, facilitates children’s work translating their experiences with screens and technologies into positive and productive lifelong learning skills.

  • Addressing concerns head-on is a powerful strategy for turning potential fear about privacy and/or security while interacting in digital spaces into vigilance.

  • Using language of empowerment positions the Internet as a tool and conveys to children that, when properly harnessed, it is useful for learning and entertainment and connects them to global discourse communities.

“What’s an algorithm?” This question is not one that most children would ask their parents. But when the parent is a literacy/technology researcher, interesting conversations about computers and their functions seep into the home.

Algorithms are increasingly part of everyday life, and children, as they engage on digital devices, are affected by programs written by companies whose primary goal is to sell content and products. These same companies promote apps that capture the attention of youth of all ages, often under the guise of entertainment, education, or connecting individuals in a community.

As literacy/technology researchers, we understand that children live in and shape a connected world where they have the ability to consume and create literally at their fingertips. We care deeply about preparing them to be lifelong learners with the skills they need to access, analyze, evaluate, create, and participate through digital technologies (Ito et al., 2013).

We are also parents who must navigate the realities of a digital world: every time our children log into an app on a device they are using at school, they leave a data trail. We know they often engage with the affordances of digital technologies at the price of their privacy (Berson & Berson, 2006). At the same time, we know that developing digital literacy includes understanding that algorithms drive users to particular content (Burrell, 2016).

Historically, parents have adopted a range of strategies when thinking about children, screens, and technology. In their study of screens in the homes of children in the UK, Livingstone and Blum-Ross (2020) identified three genres of parenting practices, framed by particular values and beliefs, around the use of new digital media, tools, and technologies in their homes and in their children’s lives. The first of these Livingstone and Blum-Ross call “embrace”: parents welcome new technologies and harness them for specific uses. The second parenting practice, which they call “balance,” is marked by parents “encouraging some digital practices and not others, often ad hoc, weighing opportunities and risks salient in the present or future” (p. 11). The final parenting practice they call “resist”: parents try to stop media, screens, and technologies from becoming essential components of their lives, arguing that these present a problem for their child.

National and international organizations have developed position statements for parents (e.g., American Academy of Pediatrics [2016], Zero to Three [2018]) and teachers (e.g., the National Association for the Education of Young Children and the Fred Rogers Center [2012], the International Literacy Association [2018, 2019], and the National Council of Teachers of English [2019]), as well as curriculum programs (e.g., Common Sense Media’s Digital Citizenship Curriculum [2020]), that present research findings and are designed to teach about staying safe online.

Several of these resources directly address privacy and security, offering guidance around practices and strategies to help children learn about keeping their information private and secure. What is key in each of these is that the adult holds the ultimate decision-making power in determining how, whether, why, and when a child may engage with digital technologies and media. None, however, presents children with information about the tools themselves or allows informed children to make choices that reflect their critical understanding of the issues. Some question whether children can understand, for example, how their worldviews can be limited by geofencing and other algorithmic tools that are driven by for-profit purposes. We decided to test the waters with our own nine children, who ranged from 4 to 12 years of age at the time of data collection. Cognizant of the parenting strategies of embrace, balance, and resist, we viewed each parent/child dyad as a case study and asked the following questions: (1) How can parents and children understand and navigate the trends, forces, and tensions around privacy, security, and algorithms in their lives? and (2) In what ways might children become more reflective about the activities in which they engage on screens? This chapter focuses on our middle school children, who at the time of data collection were approaching the age of 13, an age made critical by the Children’s Online Privacy Protection Act (COPPA, 1998). Table 1 displays the one parent/child dyad from each researcher’s home that was included in this multiple case study.

In this chapter, we describe critical moments from each selected case that helped us to understand how parents might engage middle school-aged children in conversations that begin to develop important understandings about algorithms, privacy, and security in a digital world. These moments came from four dyads that included a researcher/parent and a single child (see Table 1).

Table 1

Parent/Child Dyads

Researcher/Parent | Child   | Age of Child | Context of Critical Moment
Kristen           | Megan   | 12           | Reviewing the terms of use of an app that the child requested to download
Kathleen          | Charlie | 11           | Discussion of privacy in social media apps in response to concerned emails from school
Ian               | Jax     | 9            | Addressing the challenges of interactions with strangers when the child received a message online
Elizabeth         | Addy    | 11           | Reviewing risks and rewards of Internet use in response to child’s request for a smartphone

Our parent/child dyads took a range of approaches to generate conversations and data. These included guided drawing, using graphic organizers, close reading and discussing of terms and services, and utilizing mentor texts around digital media and its use. We each video recorded and transcribed our interactions.

To analyze the data, we leveraged grounded theory and an open-coding approach (Corbin & Strauss, 2014). First, each of us open-coded our own transcripts. Then we shared and exchanged transcripts, meeting regularly to discuss the data. Our discussion centered on identifying the approaches that were effective in eliciting discussion and critical reflections in our children, specifically about understanding and navigating the trends, forces, and tensions around privacy, security, and algorithms in our lives.

The collaborative discussions provided a space for us to build consensus across the four case studies (Yin, 2017). Finally, we each recorded a reflective discussion with our child in the form of a podcast to give our children a voice in the research process. These conversations revealed their perspectives on what they learned as participants and allowed them the opportunity to check our understanding of their experiences. We published versions of these recordings, as well as researcher reflections, publicly on our website (https://screentime.me/digital-futures/) and shared them with our social media networks in order to solicit feedback.

Through a collaborative, inductive approach that drew from our dual roles as parents and literacy researchers, we identified critical moments that highlighted themes that appeared across the data.

Find an Approach Point

Megan (age 12) owned a smartphone but was not yet a social media user. She was vocal about the effect social media had on her friends, and she had no interest in joining the bandwagon. However, she surprised her mother Kristen (Author) after school one day by asking, “Can I get Snapchat?” Based on family rules, it would have been easy for Kristen to restrict, answer “No,” and move on.

However, Megan’s question provided the perfect approach point for Kristen to discuss the roles of privacy, security, and algorithms on social media platforms. An approach point is a time, condition, or opportunity for a teachable moment, often through conversation. They sat together, perusing the terms of use and privacy agreements on the Snapchat website, and as they read, Kristen clarified unfamiliar terms and concepts.

For example, the pair discussed the data that Snapchat collected and how algorithms allowed the company to use that data “to serve ads you might be interested in — when you might be interested in them” (Snap, 2019). By considering Megan’s question rather than responding with restriction and an immediate answer, Kristen was able to engage her daughter in a conversation that helped her understand the role of algorithms in the app her friends were using.
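To make that sentence from the privacy policy concrete, the sketch below shows one minimal, hypothetical way an app could turn collected activity data into an interest profile and rank candidate ads against it. This illustrates the general technique only, not Snap’s actual (proprietary) system; all topics, ad names, and function names are invented for the example.

```python
# Hypothetical sketch of interest-based ad selection. NOT any real
# platform's algorithm: it only illustrates the idea that logged
# activity becomes an interest profile, and ads are ranked against it.

from collections import Counter

def build_interest_profile(activity_log):
    """Count how often each topic appears in a user's logged activity."""
    return Counter(topic for event in activity_log for topic in event["topics"])

def rank_ads(ads, profile):
    """Order candidate ads by how well their topics match the profile."""
    def score(ad):
        return sum(profile.get(topic, 0) for topic in ad["topics"])
    return sorted(ads, key=score, reverse=True)

# Illustrative data: every story viewed or search made adds to the profile.
activity = [
    {"topics": ["baking", "cooking"]},
    {"topics": ["baking"]},
    {"topics": ["music"]},
]
ads = [
    {"name": "Stand mixer ad", "topics": ["baking", "cooking"]},
    {"name": "Ski trip ad", "topics": ["travel"]},
    {"name": "Concert ad", "topics": ["music"]},
]

profile = build_interest_profile(activity)
print([ad["name"] for ad in rank_ads(ads, profile)])
# -> ['Stand mixer ad', 'Concert ad', 'Ski trip ad']
```

Even this toy version makes the trade-off Kristen and Megan discussed visible: the more activity the app logs, the more precisely the ads can be matched to the user.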

Provide Media Mentorship

Identifying the approach point with children is an important first step in teaching them about technologies. During these conversations, parents can provide media mentorship, or a guide that can help youth navigate the digital world while working to translate these experiences into positive and productive lifelong learning skills (Haines et al., 2016).

Kathleen (Author) adopted a “think-aloud” practice with her son Charlie (age 11) in order to provide mentorship. She invited Charlie to help shop for a new hockey stick, an activity oftentimes done in brick-and-mortar stores but shifted here to the internet so that Charlie could examine and think critically about how algorithms function. While looking at reviews on YouTube, Kathleen pointed out the ads appearing in the margin. She thought aloud as she invited Charlie to observe:

I notice these boxes here don’t seem to be related to hockey. They show me things that are a lot like what I’ve been searching for lately--vacations, proper grammar explanations to share with students. I wonder what might happen to these ads if you keep looking for things that interest you?

By engaging Charlie in a routine task--shopping online--Kathleen was able to share her own thought process as she encountered ads while simultaneously prompting Charlie to think about the underlying algorithms. Similar mentorship can be done using books, TV shows, movies, and games as parents and children create, do, and explore together in order to help children better understand the workings of the internet and how algorithms affect what they see.
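Kathleen’s closing question (“what might happen to these ads if you keep looking?”) points at a feedback loop, sketched hypothetically below: each new search decays old interests and boosts the newest ones, so a recency-weighted profile drifts toward whatever the user has been “searching for lately,” and the ads follow. The decay factor, topics, and function names here are illustrative assumptions, not any real platform’s algorithm.

```python
# Hypothetical sketch of the feedback loop Kathleen was pointing at:
# recent searches are weighted more heavily than old ones, so the
# profile (and the ads driven by it) drifts toward current interests.

DECAY = 0.5  # illustrative: older interests fade by half at each step

def update_profile(profile, search_topics, decay=DECAY):
    """Decay old interests, then boost the topics of the newest search."""
    profile = {topic: weight * decay for topic, weight in profile.items()}
    for topic in search_topics:
        profile[topic] = profile.get(topic, 0.0) + 1.0
    return profile

def top_interest(profile):
    """The topic the margin ads would currently chase."""
    return max(profile, key=profile.get)

# Kathleen's recent searches: vacations and grammar explanations.
profile = {"travel": 1.0, "grammar": 1.0}

# Charlie keeps searching for hockey gear...
for search in [["hockey"], ["hockey", "equipment"], ["hockey"]]:
    profile = update_profile(profile, search)

print(top_interest(profile))  # -> 'hockey': the ads would follow suit
```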

Address Concerns Head-on

As conversations between parents and children evolve, it is likely that issues about “safe spaces” will emerge. Ian learned that addressing concerns directly through explanations of algorithms, privacy, and security helped turn potential fear into vigilance.

Jax (age 9) was using Google Hangouts, an instant messaging platform that allowed him to share messages, photos, and videos with his parents. Though the family thought that the account was completely private, accessible only to Jax and his parents, Jax was surprised to see a message from a stranger asking for photos of him. Worried, Jax asked his father Ian (Author), “Daddy, who sent me this message? Is it someone from the games I play?”

Though Ian took steps to protect Jax by blocking the account, he also recognized the moment as an approach point and explained to his son (and his 4-year-old daughter) how this message may have appeared. Their conversation about privacy, security, and algorithms allowed the children to adopt a stance of vigilance. Ian applauded Jax for bringing the breach to his attention; instead of simply protecting his child by blocking an account, he addressed the concern head-on, bringing awareness to his children.

Use Language that Empowers

A parent’s instinct is to protect their children from harm, and it is tempting to use language that presents the Internet in dichotomies: good/bad, safe/unsafe. Elizabeth learned that the language she used to talk about issues of privacy and security with her daughters mattered.

After reading A Smart Girl’s Guide: Digital World (Anton, 2017) together and discussing the issues it raised, Elizabeth asked Addy (age 11) to explain what she learned. Addy said: “Not to use your real name, a photo of you, or pictures of your life. You need to be specific with passwords and accounts so you can stay safe. Sometimes you click on things that are not safe.”

In reviewing the transcript of their conversation, Elizabeth realized that her own language may have influenced Addy’s learning that the Internet is a place that may not be “safe” and that she may not have control over her encounters in unsafe spaces. Through reflection, Elizabeth understood that she needed to use language that empowered her children to be agents in their Internet use, rather than passive participants controlled by technology.

Language of empowerment would position the Internet as a tool that, when properly harnessed, allows users to learn, to be entertained, and to join global discourse communities in online settings. It would suggest that individuals can grow and develop in positive ways as they learn about themselves and the world around them, and it would celebrate individuals’ Internet use and expertise. During the recent US election, for instance, Addy used the Internet as a tool to learn about current political events. As a result, Elizabeth and Addy were able to celebrate what she learned and use her new knowledge as an approach point to talk about history, worldviews, and policy. Instead of positioning the Internet as “bad” or “unsafe,” Elizabeth’s language of empowerment positioned it as a tool for learning.

Make Conversations Ongoing

Our research with our children has taught us that conversations about privacy, security, and the nature of algorithms need to start early and be ongoing. Both Megan and Charlie were able to articulate the insight that most people do not know how algorithms work, and that virtually no one (especially none of their friends) reads terms-of-use and privacy agreements. Even so, they acknowledged that even if people knew more, it probably would not change how they use the Internet because websites and apps are such an embedded part of life. Much of this is a challenge for adults as well, who often do not attend to the responsibilities of web-literate citizenship. The focus of our inquiry, and the lessons learned from this work, point to knowledge construction and reflective practice around digital texts, tools, and spaces as ways to empower children to grow into adults who make informed, critical decisions.

By early adolescence, our children are internalizing acceptable internet use, and parents and teachers need to be part of the conversations that shape their understanding of these concepts. Jax was able to explain his knowledge to his four-year-old sister, suggesting that this work can involve older children in mentoring their younger siblings or schoolmates. This approach ultimately requires that parents and teachers open lines of communication with children as they strive to collaboratively make sense of these new environments. A restrictive approach (Livingstone & Blum-Ross, 2020) might not allow space for such critical and collaborative sensemaking; likewise, parents who take either an embracing or balancing stance might consider the critical role of conversation and child empowerment in decisions for the family.

As literacy researchers, we are parents with, perhaps, more knowledge about how algorithms and privacy work in a digital world, and we sit at an interesting intersection (Garcia et al., 2014). In this writing, we propose a more collaborative approach than what has typically been adopted when thinking about children and technology. Rather than framing the problem as technology doing harm to children, we suggest that we can empower children to advocate for their own rights in an age of screentime (Turner et al., 2017). The four strategies we name above can support this effort.

Authors

W. Ian O’Byrne is an Assistant Professor of Literacy Education at the College of Charleston in South Carolina. His research focuses on the dispositions and literacy practices of individuals as they read, write, and communicate in online and/or hybrid spaces. His work can be found on his website (https://wiobyrne.com/) or in his weekly newsletter (https://digitallyliterate.net/).

Contact: [email protected] or @wiobyrne

Kristen Turner is Professor and Director of Teacher Education in the Caspersen School of Graduate Studies at Drew University in New Jersey. She is the co-author of Connected Reading: Teaching Adolescent Readers in a Digital World and Argument in the Real World: Teaching Students to Read and Write Digital Texts and the editor of Ethics of Digital Literacy: Developing Knowledge and Skills across Grade Levels.

Contact: [email protected] or @teachKHT

Kathleen A. Paciga is an Associate Professor of Education in the Department of Humanities, History, and Social Sciences at Columbia College Chicago. Katie’s work examines the ways in which media, in all of their diverse forms, are integrated into children’s literate lives, as well as how media affect the ways children learn and grown-ups teach. Katie served as the 2020 chair of the Excellence in Early Learning Digital Media Award sponsored by the Association for Library Service to Children.

Contact: [email protected] or @kpaciga

Elizabeth Stevens is an Associate Professor of Teacher Education at Roberts Wesleyan College. Elizabeth’s research interests include literacy teacher education, literacy teacher identity, and literacy and technology. Elizabeth is an Area Chair for Literacy Assessment, Evaluation, and Public Policy and an e-Editor for the Literacy Research Association.

Contact: [email protected] or @eystevens

References

American Academy of Pediatrics Council on Communication and Media. (2016). Media and young minds. Pediatrics, 138(5), 1–6. https://doi.org/10.1542/peds.2016-2591

Anton, C. (2017). A smart girl’s guide: Digital world. American Girl.

Berson, I. R., & Berson, M. J. (2006). Children and their digital dossiers: Lessons in privacy rights in the digital age. International Journal of Social Education, 21(1), 135–47. https://www.learntechlib.org/p/60795/

Bertot, J. C., Jaeger, P. T., & Hansen, D. (2012). The impact of policies on government social media usage: Issues, challenges, and recommendations. Government Information Quarterly, 29(1), 30–40. https://doi.org/10.1016/j.giq.2011.04.004

Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1). https://doi.org/10.1177/2053951715622512

Children’s Online Privacy Protection Act. (1998). https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule

Common Sense Media. (2020). Digital citizenship curriculum. https://www.commonsense.org/education/digital-citizenship/curriculum

Corbin, J., & Strauss, A. (2014). Basics of qualitative research: Techniques and procedures for developing grounded theory. Sage Publications.

Dillon, R. S. (2010). Respect for persons, identity, and information technology. Ethics and Information Technology, 12(1), 17–28. https://doi.org/10.1007/s10676-009-9188-8

Emejulu, A., & McGregor, C. (2019). Towards a radical digital citizenship in digital education. Critical Studies in Education, 60(1), 131–47. https://doi.org/10.1080/17508487.2016.1234494

Garcia, A., Cantrill, C., Filipiak, D., Hunt, B., Lee, C., Mirra, N., O’Donnell-Allen, C., & Peppler, K. (2014). Teaching in the connected classroom. Digital Media and Learning Research Hub. https://dmlhub.net/publications/teaching-connected-learning-classroom/index.html

Haines, C., Campbell, C., & Association for Library Service to Children [ALSC]. (2016). Becoming a media mentor: A guide for working with children and families. ALA Editions.

International Literacy Association. (2018). Improving digital practices for literacy, learning, and justice: More than just tools [Literacy leadership brief]. International Literacy Association.

International Literacy Association. (2019). Digital resources in early childhood literacy development [Position statement and research brief]. International Literacy Association.

Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., Schor, J., Sefton-Green, J., & Watkins, S. C. (2013). Connected learning: Agenda for research and design. Digital Media and Learning Research Hub. https://dmlhub.net/publications/connected-learning-agenda-for-research-and-design/

Livingstone, S., & Blum-Ross, A. (2020). Parenting for a digital future: How hopes and fears about technology shape children’s lives. Oxford University Press.

National Association for the Education of Young Children [NAEYC], & Fred Rogers Center for Early Learning and Children’s Media at Saint Vincent College. (2012). Technology and interactive media as tools in early childhood programs serving children birth through age 8. NAEYC & Fred Rogers Center. http://www.naeyc.org/files/naeyc/file/positions/PS_technology_WEB2.pdf

National Council of Teachers of English [NCTE]. (2019). Definition of literacy in a digital age. NCTE. https://ncte.org/statement/nctes-definition-literacy-digital-age

Snap. (2019). Snapchat privacy policy. Retrieved February 25, 2020, from https://www.snap.com/en-US/privacy/privacy-policy

Turner, K. H., Jolls, T., Hagerman, M. S., O’Byrne, W. I., Hicks, T., Eisenstock, B. & Pytash, K. E. (2017). Developing digital and media literacies in children and adolescents. Pediatrics, 140(Supplement 2), S122–26. https://doi.org/10.1542/peds.2016-1758P

Willson, M. (2018). Raising the ideal child? Algorithms, quantification and prediction. Media, Culture & Society, 41(5), 620–36. https://doi.org/10.1177/0163443718798901

Yin, R. K. (2017). Case study research and applications: Design and methods. Sage Publications.

Zero to Three. (2018). Screen-use tips for parents of children under three. Zero to Three. https://www.zerotothree.org/resources/2531-screen-use-tips-for-parents-of-children-under-three#downloads
