Imagine your search terms, keystrokes, private chats, and photographs being monitored every time they are sent. Millions of students across the U.S. don’t have to imagine this deep surveillance of their most private communications: it’s a reality that comes with their school districts’ decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students’ school-issued machines and accounts.

“As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority and LGBTQ youth,” the Electronic Frontier Foundation (EFF) says.

The companies making the software claim it’s all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. That is a noble goal, given that suicide is the second leading cause of death among American youth aged 10 to 14, but no comprehensive or independent studies have shown an increase in student safety linked to the use of this software. Quite the contrary: a recent comprehensive RAND research study shows that such AI monitoring software may cause more harm than good.

  • kittenroar@beehaw.org · 18 points · 3 months ago

    I suspect the real reason they are doing this is to condition the students to accept oppressive and unjust surveillance in future workplaces.

    • chicken@lemmy.dbzer0.com · 9 points · 3 months ago

      I doubt the school administrators who would be buying this thing or the people trying to make money off it have really thought that far ahead or care whether or not it does that, but it would definitely be one of its main effects.

    • The Doctor@beehaw.org · 5 points · 2 months ago

      Just as security cameras in schools in the ’90s conditioned a lot of people to accept on-street surveillance.