NP: Who's watching the watchers?
kelber at mindspring.com
Fri Oct 24 21:13:19 CDT 2014
http://www.wired.com/2014/10/content-moderation/
This is an interesting article on a number of levels:
One likes to think that content filters work automatically, encoded into the programs. But the internet requires a lot of human custodianship to function. Before it even gets to the point where we users can tag things as objectionable or offensive, an unsung and underpaid cadre of people (in the US and the much cheaper Philippines) has to view every image - beheadings, torture, molestation - and manually eliminate the offensive ones.
The first question is, of course, about censorship. What else could they be (and are they being) told to delete? But imagine if they weren't doing this. Spending one's day viewing one disturbing image after another takes a tremendous toll on these workers. We owe them a great debt of gratitude. They should be well-paid (they are not), and they should be provided with relief, therapy, and support (they are not).
Laura
-
Pynchon-l / http://www.waste.org/mail/?list=pynchon-l