Abstract
Teenagers today are coming of age under unprecedented, pervasive networks of institutional mass surveillance technologies. Surveillance technologies spreading through youth spaces are shaping young people's prospects today and are expected to continue doing so. Moreover, today's teenagers will be tomorrow's decision-makers, tasked with managing and shaping future surveillance systems. Concerns that young people are being normalized to surveillance culture carry serious political implications for just and democratic futures with technology. How do teenagers view these systems? How can educators, researchers, and fellow learners support them in developing their own perspectives and sense of agency with these technologies? This dissertation examines how teenagers learn about surveillance technologies and identifies opportunities for supporting their learning in line with justice-centered goals. Specifically, it examines teenagers' views on institutional privacy and surveillance through an interview study, and their learning about a specific institutional surveillance technology in Drag vs AI workshops through analyses of observational data. Drag vs AI is a drag queen-led workshop created by the Algorithmic Justice League that engages teenagers in using makeup to subvert AI-powered facial recognition technologies (FRTs). We organized and studied four workshop enactments with two community partners, Out Boulder County and the Longmont Public Library. Article I, Expectations vs reality: teenager views of institutional privacy, reveals how teenagers' complex views of institutional surveillance coexist with attitudes of surveillance realism, identifying implications for justice-centered surveillance education. The second article, Taking Play and Tinkering Seriously in AI Education: Cases from Drag vs AI Teen Workshops, examines tinkering and play as modes of interaction with FRTs in Drag vs AI workshops, evidencing an alternative learning methodology termed embodied algorithmic tinkering.
Finally, the third article, Drag Pedagogy for CS Educational Work: Reflections from Drag vs AI Workshops, explores how drag pedagogical practices embodied in the design and facilitation of the workshops supported playful and critical CS learning around FRTs.
In my concluding chapter, I reflect on the relationship between privacy and criticism of surveillance, and use a historical action analysis to examine how the limitations identified in the first study emerged in the studies of Drag vs AI, synthesizing actionable takeaways for practitioners and possible directions for future designs and research.