Privacy advocates raise concerns about Vermont schools monitoring kids online

Published: 10/17/2019 10:03:27 PM
Modified: 10/17/2019 10:03:17 PM

With educators on heightened alert to prevent the next school shooting and under increasing pressure to address cyber-bullying, self-harm and teen suicide, schools are turning to a new tool for help: artificial intelligence.

Students spend much of their lives online. Now, carefully calibrated algorithms can patrol the hallways of the internet to alert school officials when they might need to intervene. At least that’s the pitch from a burgeoning industry.

“It takes a village (and their bots),” reads one company’s tagline.

But privacy advocates say these technologies risk getting students in trouble for benign activity. And some experts wonder whether AI will help or hamper efforts to intervene when necessary.

“Surveillance almost always begets more surveillance. It’s never enough, right?” said Amy Collier, associate provost for digital learning at Middlebury College. “And if we haven’t asked hard questions about data and privacy at the outset, we’re going to just keep doing more and more to detract from people’s freedom and privacy rights.”

VtDigger sent a public records request to all 52 superintendents in Vermont to ask for any contracts signed by their districts for social media monitoring services. The majority said they didn’t have agreements.

Five said they had current or prior contracts with Social Sentinel, a Burlington-based firm that scans public social media posts within a certain geographic area and sends alerts to school officials when keyword-based algorithms detect signs of trouble.

Another eight said that while they didn’t scan social media specifically, they did contract with vendors — including Securly, Bark or Lightspeed Systems — to monitor activity on district devices and school-sponsored email. (For an added fee, Social Sentinel will also scan student emails.)

These technologies send alerts to school officials when algorithms flag browsing habits, chats or emails that indicate a student is in distress or could hurt others.
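In rough outline, keyword-based flagging of this kind can be sketched as follows. Everything here — the categories, the phrase lists, and the `flag_post` function — is invented for illustration and does not reflect any vendor's actual system, which would involve far larger dictionaries and statistical models:

```python
# Toy sketch of keyword-based content flagging (illustrative only).
from dataclasses import dataclass

# Hypothetical phrase lists a district might configure, by alert category.
KEYWORDS = {
    "self_harm": {"hurt myself", "end it all"},
    "violence": {"bring a gun", "shoot up the"},
}

@dataclass
class Alert:
    category: str
    text: str

def flag_post(text: str) -> list[Alert]:
    """Return one Alert per category whose phrases appear in the post."""
    lowered = text.lower()
    return [
        Alert(category, text)
        for category, phrases in KEYWORDS.items()
        if any(phrase in lowered for phrase in phrases)
    ]

print(flag_post("Bought a four-pack of Heady Topper!"))   # no alerts
print(flag_post("i want to end it all")[0].category)      # "self_harm"
```

Even this toy version hints at the trade-off critics describe: simple substring matching has no sense of context, so an innocuous research paper or a joke can trigger the same alert as a genuine cry for help.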

One popular company, GoGuardian, whose “Admin” product is used by the middle schools in the Burlington school district, allows school officials to keep granular tabs on what students search, watch and read when they’re on district devices.

In an online demo for the product, a GoGuardian representative explains how school officials can query the program for detailed profiles on the habits of every student. One tab is labeled “Most flagged students.”

“Experience shows that expanded police presence in schools and online surveillance of students does real harm, undermining student privacy and resulting in rights violations that disproportionately impact students of color,” James Duff Lyall, the executive director of the American Civil Liberties Union of Vermont, said in a statement.

Lyall isn’t the only one to raise concerns. Research published this summer found that Google’s hate-speech detection technology was more likely to be triggered by posts from black Americans.

“The algorithms of these technologies are hard-coded with biases,” Collier said.

Carolyn Stone, a professor at the University of North Florida and chairwoman of the American School Counselor Association’s ethics committee, has written that counselors should advocate against monitoring software, unless alerts are routed directly to parents or guardians.

Putting educators in the middle will lead to an “unneeded and unwarranted liability” for schools, Stone argued. And assigning mental health staff to follow up when a student’s searches or emails turn up red flags could unwittingly put them in a disciplinarian’s role and impair their ability to get students to open up.

“When the school counselor is the first line of communication with them about online activity that might have stemmed from an innocuous research paper, students will look at the trusting relationship with a jaundiced eye,” she wrote in 2018.

In an interview, Social Sentinel founder Gary Margolis bristled at questions about privacy.

“We built a technology that actually helps prevent bad things from happening. By giving information that can give context to what’s going on, in a way that respects privacy, and all I do is get questioned by you and folks in the media about privacy issues,” Margolis said. “It’s mind-bogglingly frustrating. You either want to save a kid’s life or you don’t want to save a kid’s life.” (Margolis later called a reporter back to apologize.)

Social Sentinel officials insist that their product stands apart — while, like its competitors, it sends alerts to school officials based on triggers detected by its algorithms, it doesn’t profile students.

“We’re not surveilling. We’re not monitoring. We’re not following,” Margolis said. “Monitoring is when I’m paying attention to a specific individual. Your communications. You.”

Amelia Vance, the director of education privacy at the Future of Privacy Forum, said the most frequent complaint she hears from school districts about social media monitoring is that it floods officials with useless information to wade through.

“You can’t be privacy protective and good at social media monitoring and identifying, you know, potential threats,” she said.

In Hyde Park, Lamoille Union High administrators contracted with Social Sentinel for a year in 2015. But Brian Schaffer, the school’s principal, said the daily alerts he received consisted mostly of irrelevant posts — including from Quebec tourists bragging about the packs of Heady Topper they’d bought on trips to Vermont.

“It wasn’t as functional as I had hoped it would be,” he said.

Margolis said Social Sentinel’s algorithm is continually improving and already has dramatically reduced the number of false positives sent to school officials.

“Early fire detection systems used to go off all the time. Technology gets better,” he said.

In the Slate Valley Unified School District, school officials signed a three-year contract with Social Sentinel in January. Superintendent Brooke Olsen-Farrell said the district took on the service as part of a larger package of security reforms — totaling nearly $1 million — put in place in the two years since a former student’s shooting plot was foiled at Fair Haven Union High.

“It’s one more tool in our toolbox,” she said.

The service sends her an alert about once a month. None have included any “actionable” information so far, Olsen-Farrell said. The district has also long used Securly to monitor student activity on district devices and accounts. The service not only blocks certain content but also scans student emails and Google Docs for language that suggests self-harm or bullying.

A task force created this spring by Gov. Phil Scott to help prevent school shootings after the Fair Haven incident recommended that the state invest in monitoring software to scan social media posts statewide. Margolis testified before the committee.

Among the task force’s members was Rob Evans, a school security consultant who works for Margolis Healy, a firm co-founded by Gary Margolis. Margolis has stepped away from the firm, but Steven Healy, its current CEO, also sits on the board of directors at Social Sentinel.

Asked about the apparent conflict, Evans referred comment to the task force’s chairman, Deputy Mental Health Commissioner Mourning Fox, and co-chairman, Daniel Barkhuff, an emergency physician at the University of Vermont Medical Center. Barkhuff said Evans played no role in bringing Social Sentinel to the panel’s attention.

“In hindsight, I totally understand how the optics look weird,” he said. “But that honestly is not what happened. It was me who brought it up to the task force.”

And while Evans didn’t recuse himself from conversations about social media monitoring, Barkhuff said it wouldn’t have swung the panel’s recommendations one way or another. The recommendation, he said, “was pretty unanimous.”

It’s unlikely social media monitoring on a statewide scale will get picked up as a strategy by the Scott administration. A spokesperson with his office said the governor was still deciding what policies to bring forward to the Legislature next session, but added that Scott had reservations about the idea.

“The Governor shares some of the concerns that have been raised about social media monitoring software and privacy considerations,” Rebecca Kelley, Scott’s communications director, wrote in an email.

And the Democrat-controlled Legislature is even less likely to push the concept.

Sen. Phil Baruth, D/P-Chittenden, chairman of the Senate Education Committee, said online monitoring technology raises a slew of ethical questions for school districts. Do parents and students know that kids are being so closely watched? And are there equity concerns to consider when poor students must rely on district devices for all of their computing needs?

“If you follow that logic out, it seems more likely to turn up problems with low-income students,” Baruth said.

The lawmaker said he believed these debates should mostly be left to local communities, not legislated from Montpelier. But he added that he worried these services were being sold to schools and politicians as “duty-free workarounds” to sidestep questions of gun control.

“Every time a gun safety bill surfaces, we can say, ‘We took care of that.’ We’ve got a cheaper, easier, more universal solution. And I just don’t think that’s it at all,” Baruth said. “I think what it will produce are false positives and an over-abundance of surveillance of young people to no good end.”
