
Algorithmic Literacies: K-12 Realities and Possibilities

Published on Jun 29, 2021

Abstract

Algorithmic literacy is lacking among many K-12 educators, and where it does exist it tends to be confined to curricula already focused on computer science. Educators are often hesitant to become literate in this area for a variety of reasons, including discomfort with complex issues, lack of background knowledge on these topics, or a mistaken belief that the responsibility to educate students lies with someone else. It is critical that educators move beyond these barriers and take advantage of efforts to empower them to learn about, and teach, these topics.

Key Findings

  • Within K-12 schooling, algorithms are primarily explored within computer science courses, creating many missed multidisciplinary and authentic opportunities for developing students’ algorithmic literacies.

  • K-12 teachers may avoid exploring algorithmic systems in their curriculum due to lack of background knowledge and basic digital literacies, discomfort in engaging in the political deliberations inherent to these conversations, or a sense that it is someone else’s job to prepare students for life in an algorithmic culture.

  • If we hope for educators to help develop their students’ algorithmic literacies, we must first attend to the algorithmic literacies of educators themselves. One place to start might be in empowering educators to critically evaluate algorithmic classroom digital tools.

Uneven digital literacies set the stage

When we speak of “algorithmic rights and protections of children,” of course what happens within schools is of particular consequence. Conversations about children and technology have evolved from a hyper-focus on “risks” to a more balanced consideration of “opportunity” enabled by digital technologies (Gasser and Cortesi 2017), with “access” -- to devices, skills, and literacies -- being central to this evolving conversation. K-12 schools continue to thread this needle, establishing policies and practices that allow students to access the opportunities afforded by digital technologies while ideally minimizing the risk of negative consequences. This means that, via educational experiences in the classroom, young people may exercise their right of access to and use of powerful algorithmic-driven technologies, but in so doing may become overly exposed to the sorting and tracking mechanisms enabled by these very same technologies. What’s more, schools and districts may have protectionist policies in place that aim to reduce student data exposure to unknown and problematic algorithmic systems, but it is not uncommon for these policies to be inconsistently enforced or widely misunderstood by school staff. Critical data literacies (Pangrazio and Selwyn 2019) help educators and students grapple with the implications of algorithmic systems and their place within those systems so that they might be able to take more informed action, but developing these critical data literacies is severely hampered by a lack of general background knowledge, confidence, or interest on the part of school staff, including classroom teachers.

As a school-based K-12 technology integrationist, I see a wide range of algorithmic literacies among educators and students. I have come to understand that the reasons for this are numerous, but let us start with a fundamental issue: the varying levels of basic digital skills among educators. To be clear, this is not unique to the K-12 educator workforce: adults in the United States in general do not fare well on measurements of “problem solving in technology-rich environments” as compared to international counterparts (Vanek 2017), meaning that adults in many sectors in the US demonstrate limited digital literacy skills. Thinking with technology is a true barrier for many adults in the workforce, so much so that any device or software becomes a “black box” to the user. It is no wonder, then, that algorithms and the literacies that would help make sense of them feel impenetrable to those who might still be mastering basic digital skills. This raises the question: where in K-12 schooling do students develop algorithmic literacies?

Finding a place for algorithms in the curriculum

Within the K-12 curriculum, mention of algorithms is often housed solely within computer science courses, in part because there is a sense that algorithmic literacy is a highly technical and specialized skill. For example, in the 2016 Massachusetts Digital Literacy and Computer Science (DLCS) curriculum frameworks and standards, algorithmic literacy is firmly situated within the Computational Thinking strand, which itself is repeatedly tied to programming first and foremost. Algorithmic literacy is of course central to the education of a future computer scientist, but given the ubiquity of algorithms and algorithmic systems, is it not also important for the users of these technologies to exercise some degree of algorithmic literacy? What’s more, tying algorithmic literacy primarily to computer science means it does not reach all students, as we know that inequalities along lines of gender and race persist in enrollment patterns for computer science courses (Code.org Advocacy Coalition 2018).

Algorithmic systems are important to study not just to learn their architectures and inner workings so that we can build them ourselves as computer scientists; they are also worth studying because these systems have wide-reaching consequences for society. Within the Massachusetts DLCS standards, there is indeed a Computing and Society strand that asks for curriculum considering “the beneficial and harmful effects of computing innovations” (p. 47). But algorithms are not mentioned here, even though these harmful effects cannot be truly understood divorced from discussion of the “coded inequities” (Benjamin 2019) (re)inscribed by algorithmic systems. Examples of these algorithmically coded inequities abound, and engaging case studies for the classroom that consider any number of these real-life examples can be integrated into virtually any content area in the K-12 curriculum. We miss these opportunities if we see algorithmic systems as only the purview of computer scientists.

For educators who may have limited algorithmic background knowledge, where to start may seem daunting, and it is currently difficult to find ready-made classroom materials that provide examples of what algorithmic literacy across the curriculum can look like. In my work, I have developed just such curriculum, in the hopes of inspiring curiosity about algorithms in both students and teachers. As one example, I have developed and implemented curriculum for 8th and 9th grade research contexts that asks students to grapple with the algorithmic bias evidenced by Google Image Search results for “unprofessional hair” and “three black kids” versus “three white kids” (Noble 2018). We discuss: For what do you turn to Google Image Search, and what process do you take to evaluate these results? How do Google Image Search results impact what you believe to be true or “standard”? What do initiatives like World White Web (http://www.worldwhiteweb.net/) demonstrate that everyday users like us can do to impact the Google Image Search algorithm? What is our responsibility as users of these algorithmic systems versus the responsibility of developers to fine-tune these algorithms?

Time and again I rediscover that young people are interested in learning about biased algorithmic systems, and I have found that my teacher colleagues across content areas are genuinely interested as well. The motivation to empower with knowledge that is baked into algorithmic literacy education aligns well with the motivations that lead educators to the teaching profession in the first place: to make a difference to society and in a child’s life (Menzies, Parameshwaran, Trethewey, Shaw, Baars, and Chiong 2015). Yes, the teaching profession is full of idealists.

But the truth is that engaging in algorithmic literacy is inherently political work, and that can be scary for some educators. To truly examine algorithmic systems, students must consider who gets to define and categorize, why certain entities and not others get to do this influential work, and how power structures get reinscribed via these algorithmic systems. In many cases, the developers of these algorithms are not explicitly setting out to create racist, misogynistic, and bigoted systems; biased results are more a reflection of larger societal prejudices. Algorithmic literacy curriculum demands that students grapple with the intent of inventors versus the impact of inventions, which can be a challenging conversation for some educators to have with students. The desire of some educators to appear apolitical in the classroom is then coupled with the reality that many educators may not be convinced that algorithmic topics fall within their content-area purview, given narrow definitions of what belongs where in a traditional K-12 curriculum. This confluence of ideologies too often leads to missed opportunities. Preparing students to live fully informed lives in an algorithmic culture (Striphas 2015) becomes someone else’s job, and often, in practice, winds up being taken up by no one.

Developing the algorithmic literacies of educators

So how might we demonstrate that algorithmic literacies can (and should) be developed wherever we encounter algorithms? (Which, in fact, is everywhere.) Research from Choi, Cristol, and Gimbert (2018) suggests, “Before promoting advanced levels of digital citizenship, teachers need to successfully achieve online activities in democratic and varied ways” (p. 154). For our purposes here, this means that if we hope that educators engage students in critical examination of algorithmic systems, then we must engage educators in this work first.

One place to start could be in developing educators’ ability to evaluate classroom digital tools. Generally speaking, K-12 teachers have some degree of autonomy in choosing supplementary curriculum materials, which increasingly include algorithmically-driven apps and websites (e.g. skill-building software, digital product creation tools, digital portfolios) in which student work may be connected to personally identifiable information. Teachers may learn about new digital tools at educational conferences and start using these tools when they return to their classrooms, perhaps seeing the workshop presenter as a sufficient vetter of the digital tool. In many districts, there is indeed a process by which digital tools must be approved for classroom use: there is often a person in charge of reading Privacy Policies and Terms of Service documents, who will make the determination whether or not the app in question is in compliance with the district’s policies. There are even organizations such as the Student Data Privacy Alliance (https://privacy.a4l.org/) which can facilitate negotiated contracts with edtech vendors but also more basically can serve as a database of edtech tools that other districts around the country have found to be in compliance with their own district policies. Permission slips are very often sent home to parents/guardians, linking to the digital tool’s Terms of Service, asking for parents/guardians to provide consent for their child to use the given tool. But, as mentioned above, even when decision makers have thoughtful, values-driven policies that guide their decisions about which digital tools are in compliance with district policies (and unfortunately this is not always the case), those values are not always shared or understood by the wider district community. The result is uninformed consent at all points in the decision-making process, and a sense from classroom teachers that they could never engage in this analysis on their own.

What if, instead, through ongoing professional development, we equipped and empowered teachers themselves to make these evaluative decisions with their students’ algorithmic best interests in mind? What if we started this training in teacher preparation programs, so that teachers enter the classroom confident in their ability to ask critical questions when they encounter a supplementary classroom technology? We can equip teachers to ask of edtech products: How does this tool determine what is relevant, correct, or worth knowing? Does this line up with my own educational philosophy? How are my classroom practices being reshaped to suit the algorithmically driven processes of this tool? What data are being collected by this tool, and what is being done with the data? What types of predictions are being made, and do those predictions line up with my pedagogical goals?

To be sure, some districts already equip and empower teachers to ask these questions and make these determinations. And, unfortunately, in at least as many districts, classroom teachers do not have the autonomy to authentically evaluate digital tools introduced by administrator-level decision makers at all. But no matter our starting point, if we have any hope of developing an algorithmically literate generation, one able to exercise and demand their own algorithmic rights, it is clear that we cannot ignore the algorithmic literacies of the educators who teach them today.

References

Benjamin, Ruha. 2019. Race After Technology. Medford, MA: Polity Press.

Choi, Moonsun, Dean Cristol, and Belinda Gimbert. 2018. “Teachers as Digital Citizens: The Influence of Individual Backgrounds, Internet Use and Psychological Characteristics on Teachers’ Levels of Digital Citizenship.” Computers & Education 121, 143-161.

Code.org Advocacy Coalition. 2018. State of Computer Science Education. Retrieved from https://advocacy.code.org/

Gasser, Urs and Sandra Cortesi. 2017. “Children’s Rights and Digital Technologies: Introduction to the Discourse and Some Meta-observations.” Chapter 25 in Handbook of Children’s Rights: Global and Multidisciplinary Perspectives, edited by M. Ruck, M. Peterson-Badali, and M. Freeman. Taylor and Francis.

Gillespie, Tarleton. 2013. “The Relevance of Algorithms.” In Media Technologies: Essays on Communication, Materiality, and Society, edited by T. Gillespie, P. J. Boczkowski, and K. A. Foot. University Press Scholarship. DOI:10.7551/mitpress/9780262525374.003.0009

Massachusetts Department of Elementary and Secondary Education. 2016. Massachusetts Digital Literacy and Computer Science (DLCS) Curriculum Framework. Retrieved from http://www.doe.mass.edu/frameworks/dlcs.docx

Menzies, Loic, Meenakshi Parameshwaran, Anna Trethewey, Bart Shaw, Sam Baars, and Charleen Chiong. 2015. “Why Teach?” LKMco. Retrieved from http://whyteach.lkmco.org/wp-content/uploads/2015/10/Embargoed-until-Friday-23-October-2015-Why-Teach.pdf

Noble, Safiya Umoja. 2018. Algorithms of Oppression. New York: New York University Press.

Pangrazio, Luci and Neil Selwyn. 2019. “‘Personal Data Literacies’: A Critical Literacies Approach to Enhancing Understandings of Personal Digital Data.” New Media & Society 21(2), 419-437.

Shaffer, Kris. 2019. Seminar at Digital Pedagogy Lab. Seminar description and materials retrieved from https://dpl.pushpullfork.com/.

Striphas, Ted. 2015. “Algorithmic Culture.” European Journal of Cultural Studies 18(4–5), 395–412. https://doi.org/10.1177/1367549415577392

Vanek, Jenifer B. 2017. “Using the PIAAC Framework for Problem Solving in Technology-Rich Environments to Guide Instruction: An Introduction for Adult Educators.” Retrieved from https://static1.squarespace.com/static/51bb74b8e4b0139570ddf020/t/589a3d3c1e5b6cd7b42cddcb/1486503229769/PSTRE_Guide_Vanek_2017.pdf
