Tech giant Microsoft is being sued by two former employees who claim they were left with post-traumatic stress disorder (PTSD) after viewing disturbing material.
The two men at the centre of the lawsuit, Greg Blauert and Henry Soto, were employed as part of Microsoft’s online safety team – the division responsible for upholding the firm’s legal obligation to pass on any illegal images to the US National Center for Missing and Exploited Children.
When an image is reported, or automated software has detected an issue, a human being is required to view the material and forward it on to the authorities – it’s an undeniably grisly task that Blauert and Soto say has left them with lasting psychological scars.
Microsoft has readily acknowledged the difficulty of the job, noting that employees are limited in how long they may do the work per day and must go to a separate, dedicated office to do it – but Blauert and Soto say little was done to prepare them for the role.
The lawsuit says Soto suffered from “panic attacks, disassociation, depression, visual hallucinations” because of his employment and claims he couldn’t be around young children, even his own son, due to the “horribly violent acts against children” he had seen.
Soto also claims that, when he requested a transfer, he was told he would have to apply for a new job within Microsoft “just like any other employee.” When he was eventually moved to a different section of the safety team, he said he was still being asked questions related to his prior role.
Blauert, who had a mental breakdown in 2013, claims he was told to “smoke”, “go for walk” or “play video games” when he voiced concerns.
Microsoft has been quick to deny the allegations, saying it offers industry-leading support and that employees are required to participate in a wellness programme that includes mandatory one-on-one sessions.
“Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work,” a representative told the BBC.
The firm also uses software that blurs imagery, lowers resolution, renders images in black and white, separates audio from video, and displays all images as thumbnails rather than at full size.