There are some horrific videos and images floating around the internet, and those whose job it is to seek out and report this content need psychological support to help them cope. But two former Microsoft employees from its Online Safety Team are suing the company over claims that exposure to extreme material caused them to develop post-traumatic stress disorder (PTSD).
Henry Soto and Greg Blauert were asked to screen Microsoft users' communications for child pornography and illegal images, using the team’s “God-like status” to “literally view any customer’s communications at any time.”
The lawsuit, filed in the Superior Court of the State of Washington last month, claims the company acted negligently by refusing to pay for therapy or provide a trained therapist to the employees, who suffered psychological problems after years of having to watch the internet’s “most twisted videos.”
According to the suit, the work involved viewing “horrible brutality, murder, indescribable sexual assaults, videos of humans dying and, in general, videos and photographs designed to entertain the most twisted and sick-minded people in the world.”
While Microsoft provides mental health support and protections to staff within its similar Digital Crimes Unit, which also reports illegal material to the police, it’s alleged that no such help was available to the Online Safety team. The Daily Beast reports that the only support offered came in the form of a “wellness program,” in which an undertrained counselor diagnosed them with “compassion fatigue.” Moreover, employees were only offered half-days off if they broke down from emotional strain while at work.
Soto and Blauert were eventually diagnosed with PTSD, and doctors recommended medical leave. Both applied for workers’ compensation but were allegedly denied coverage.
Microsoft isn’t the only firm to be accused of failing to support employees whose job it is to identify extreme content. Facebook has also been criticized for not providing enough training and help to those in similar jobs, which often pay just above minimum wage. And while more companies are using algorithms to identify images of child abuse that pass through their systems, a human element remains part of the process.