HRWG Newsletter May 2017


Trudy Huskamp Peterson

Date Added:

9 June 2017

Why do people post on Facebook what they do? For instance, a man in Thailand live-streamed himself killing his baby daughter and then committing suicide. A man in Memphis, Tennessee, set his phone to record as he doused himself with kerosene, lit a match and committed suicide. Videos of rapes, "revenge porn" (attempts to use intimate images to shame, humiliate or gain revenge against a person), and ISIS beheadings mingle online with images of family feasts and frolicking kittens. And any of these can be downloaded and saved to an institutional or personal archive.

The Guardian published a series of articles it called "The Facebook Files," based on "more than 100 training manuals, spreadsheets and flowcharts" leaked to the newspaper that show how Facebook is dealing with violent content on its service. The company uses algorithms and is experimenting with artificial intelligence to address the problem of disturbing content, trying to walk the line between censorship and free speech. Facebook also has a growing team of 3,000 people to monitor postings and decide what to delete. As one man described a monitor's job: "You'd go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that's what you see." Several nongovernmental organizations also have teams monitoring content, particularly watching for images of child abuse. All of these organizations have "safeguard programs" to support the mental well-being of the monitors who work in these psychologically stressful jobs. (When archivists must identify and redact documents for human rights lawsuits or process records from truth commissions and criminal courts, the same stresses are apparent and the same care for the health of staff members is essential.)