Education · Mixed Methods

Surveillance Pedagogy: When Learning Analytics Becomes Student Surveillance

Learning analytics tracks student clicks, time-on-task, attention patterns, and emotional states to improve educational outcomes. But the same data infrastructure that enables personalized learning also enables comprehensive surveillance of students who never consented to being monitored.

By Sean K.S. Shin
This blog summarizes research trends based on published paper abstracts. Specific numbers or findings may contain inaccuracies. For scholarly rigor, always consult the original papers cited in each post.

Learning analytics promises to improve education by understanding how students learn. By tracking clickstream data, time-on-task, resource access patterns, quiz performance trajectories, and even biometric indicators (eye tracking, facial expression analysis), analytics systems can identify struggling students early, personalize content delivery, and evaluate pedagogical effectiveness.
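To make the early-intervention idea concrete, here is a minimal sketch of the kind of rule such systems apply to engagement data. All names and thresholds (`WeeklyActivity`, `min_minutes`, `score_drop`) are hypothetical illustrations, not any vendor's actual model:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class WeeklyActivity:
    minutes_on_task: float   # time-on-task logged by the LMS
    quiz_score: float        # normalized 0.0-1.0

def flag_at_risk(history: list[WeeklyActivity],
                 min_minutes: float = 60.0,
                 score_drop: float = 0.2) -> bool:
    """Flag a student whose recent engagement or performance is slipping."""
    if len(history) < 4:
        return False  # too little data to judge a trend
    recent, earlier = history[-2:], history[:-2]
    low_engagement = mean(w.minutes_on_task for w in recent) < min_minutes
    falling_scores = (mean(w.quiz_score for w in earlier)
                      - mean(w.quiz_score for w in recent)) > score_drop
    return low_engagement or falling_scores
```

Note that the rule needs only two fields per week, yet the infrastructure feeding it typically records far more, which is exactly the dual-use tension the next paragraph describes.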

The surveillance dimension of this promise is less frequently discussed. To personalize learning, the system must know everything about the learner. To identify struggling students, the system must monitor every interaction. To evaluate pedagogy, the system must measure outcomes continuously. The data infrastructure that enables personalization is, architecturally, the data infrastructure of surveillance.

AI-Based Behavioral Analytics

Nawaz, Awan, and Ahmed (2025) examine what they call "surveillance pedagogy": the psychological and pedagogical risks of AI-based behavioral analytics in digital classrooms. The rise of AI-based surveillance in education has introduced tools that track student behavior, emotions, and attention in real time. Though marketed as innovations for improving learning outcomes, these systems risk compromising student privacy, increasing anxiety, and narrowing pedagogical practice.

The paper identifies several risks: students who know they are being monitored may alter their behavior to perform for the algorithm rather than to learn; emotional monitoring (facial expression analysis, sentiment detection) invades psychological privacy in ways that attendance tracking does not; and the normalization of surveillance in educational settings may reduce students' expectations of privacy in other domains of life.

Balancing Success and Privacy

Azra and Zeeshan (2025) examine the dual nature of big data analytics in higher education. The proliferation of analytics offers promising avenues for improving student success while simultaneously raising critical concerns about data privacy protection and ethical frameworks.

The paper documents a tension that many institutions face: the same data that enables early intervention for at-risk students also creates detailed behavioral profiles that could be misused for disciplinary surveillance, insurance risk assessment, or employment screening. Without clear policies about data access, retention, and purpose limitation, the data collected to help students could be turned against them.

Student Perceptions

Karimov, Saarela, and Aliyev (2025) explore students' perceptions of engagement data collection and usage in learning analytics. The ethical use of engagement data in online education is a growing concern as institutions increasingly rely on analytics.

The study finds that students' attitudes toward data collection are nuanced: many accept that learning analytics can improve their educational experience, but they want transparency about what data is collected, how it is used, and who has access. Students distinguish between acceptable monitoring (tracking submission patterns to identify students who need help) and unacceptable surveillance (monitoring webcam feeds during exams, analyzing emotional states, tracking physical location).

K-12 Ethical Challenges

Shukla, Pandey, and Kumar (2025) examine ethical challenges in AI use in schools, focusing on data privacy, surveillance, and bias. The rapid integration of AI in school education has transformed teaching methods but has raised significant ethical concerns.

The K-12 context amplifies ethical concerns because the subjects are minors who cannot meaningfully consent to data collection. Schools act as data custodians for children who may not understand what data is being collected, how it is being used, or what long-term implications the data profile may have for their future opportunities.

Claims and Evidence

| Claim | Evidence | Verdict |
| --- | --- | --- |
| Learning analytics improves student outcomes | Azra & Zeeshan (2025): evidence of early intervention benefits | ✅ Supported |
| Students accept learning analytics unconditionally | Karimov et al. (2025): acceptance is conditional on transparency and purpose limitation | ❌ Refuted |
| AI behavioral monitoring is pedagogically beneficial | Nawaz et al. (2025): risks of altered behavior, increased anxiety, and narrowed pedagogy | ⚠️ Uncertain |
| Current privacy frameworks adequately protect students in analytics environments | Shukla et al. (2025): significant gaps in data privacy for K-12 AI use | ❌ Refuted |

Implications

Learning analytics occupies a contested space between educational improvement and student surveillance. The technology itself is neutral: the same data infrastructure can serve either purpose. What determines the outcome is governance: clear policies about data collection, access, retention, and use; meaningful transparency to students about what is being monitored; genuine consent mechanisms (not merely terms-of-service click-through); and institutional cultures that treat student data as a trust, not an asset.
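Governance of this kind can be enforced in code rather than left to policy documents. The sketch below gates every read of behavioral data by role, stated purpose, and data age; the roles, purposes, and one-year retention window are hypothetical examples, not a recommended policy:

```python
from datetime import date, timedelta

# Hypothetical policy table: which purposes each role may invoke.
ALLOWED_PURPOSES = {
    "advisor": {"early_intervention"},
    "instructor": {"early_intervention", "course_improvement"},
    "registrar": set(),  # no access to behavioral data at all
}
RETENTION = timedelta(days=365)  # example retention window

def may_access(role: str, purpose: str,
               collected_on: date, today: date) -> bool:
    """Allow a read only if the purpose is permitted for this role
    and the data is still within its retention window."""
    within_retention = (today - collected_on) <= RETENTION
    return within_retention and purpose in ALLOWED_PURPOSES.get(role, set())
```

The design choice worth noting is purpose limitation as a first-class parameter: an access request must name why the data is wanted, so uses like disciplinary surveillance or employment screening simply have no valid purpose string.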

References (5)

[1] Nawaz, M., Awan, N., & Ahmed, S. (2025). Surveillance Pedagogy: AI-Based Behavioral Analytics in Digital Classrooms. Academy Journal, 4(3), 508.
[2] Azra, H. & Zeeshan, I. (2025). Harnessing Big Data Analytics in Education: Balancing Student Success with Privacy. SSRN Working Paper.
[3] Karimov, A., Saarela, M., & Aliyev, S. (2025). Ethical Considerations and Student Perceptions of Engagement Data. Proc. HICSS 2025, 572.
[4] Shukla, H., Pandey, K., & Kumar, N. (2025). Ethical Challenges in AI Use in Schools: A Study of Data Privacy, Surveillance, and Bias. EPRA International Journal, 20929.
[5] Nawaz, M., Awan, N., Ahmed, S., & Mustafa, A. (2025). Surveillance Pedagogy: The Psychological and Pedagogical Risks of AI-Based Behavioral Analytics in Digital Classrooms. ACADEMIA International Journal for Social Sciences, 4(3), 1995-2010.
