Equity Implications of Predictive Analytics in K-12 Classrooms



Copyright © 2019, Common Ground Research Networks, All Rights Reserved

Abstract

Educational apps and platforms powered by predictive analytics are now used in many K-12 classrooms, offering insights into student learning and providing recommendations to students and educators. The algorithmic analysis of big data, however, carries implications related to algorithmic bias, a well-researched topic in computer science that has only recently gained attention from educational practitioners. Through algorithmic bias, discrimination, inequity, and prejudice can be perpetuated in the insights and recommendations that predictive analytics provide. This article therefore positions inequality, and the socio-technical assemblage that entangles educators with predictive technologies, within the broader data infrastructure that surrounds them to explore the question, "To what extent are K-12 educators aware of, and how do they understand, the potential implications of algorithmic bias?" Drawing on findings from a recent study, the Apps in Australian Classrooms Project, the theory and evidence presented in this article echo numerous calls for greater discussion and debate about algorithmic bias. Given the growing number of studies linking ethics with predictive analytics across multiple domains, this article aims to empower educators to consider how they negotiate predictive analytics as part of their educational practice.