How moderation grew up on the Internet

(written by Lawrence Krubner; however, indented passages are often quotes). You can contact Lawrence at: lawrence@krubner.com

Interesting:

Mora-Blanco’s team — 10 people in total — was dubbed The SQUAD (Safety, Quality, and User Advocacy Department). They worked in teams of four to six, some doing day shifts and some night, reviewing videos around the clock. Their job? To protect YouTube’s fledgling brand by scrubbing the site of offensive or malicious content that had been flagged by users, or, as Mora-Blanco puts it, “to keep us from becoming a shock site.” The founders wanted YouTube to be something new, something better — “a place for everyone” — and not another eBaum’s World, which had already become a repository for explicit pornography and gratuitous violence.

Mora-Blanco sat next to Misty Ewing-Davis, who, having been on the job a few months, counted as an old hand. On the table before them was a single piece of paper, folded in half to show a bullet-point list of instructions: Remove videos of animal abuse. Remove videos showing blood. Remove visible nudity. Remove pornography. Mora-Blanco recalls her teammates were a “mish-mash” of men and women; gay and straight; slightly tipped toward white, but also Indian, African-American, and Filipino. Most of them were friends, friends of friends, or family. They talked and made jokes, trying to make sense of the rules. “You have to find humor,” she remembers. “Otherwise it’s just painful.”

Videos arrived on their screens in a never-ending queue. After watching a couple of seconds apiece, SQUAD members clicked one of four buttons that appeared in the upper right-hand corner of their screens: “Approve” — let the video stand; “Racy” — mark video as 18-plus; “Reject” — remove video without penalty; “Strike” — remove video with a penalty to the account. Click, click, click. But that day Mora-Blanco came across something that stopped her in her tracks.

“Oh, God,” she said.

Mora-Blanco won’t describe what she saw that morning. For everyone’s sake, she says, she won’t conjure the staggeringly violent images which, she recalls, involved a toddler and a dimly lit hotel room.

Ewing-Davis calmly walked Mora-Blanco through her next steps: hit “Strike,” suspend the user, and forward the person’s account details and the video to the SQUAD team’s supervisor. From there, the information would travel to the CyberTipline, a reporting system launched by the National Center for Missing and Exploited Children (NCMEC) in 1998. Footage of child exploitation was the only black-and-white zone of the job, with protocols outlined and explicitly enforced by law since the late 1990s.

The video disappeared from Mora-Blanco’s screen. The next one appeared.

Ewing-Davis said, “Let’s go for a walk.”

Okay. This is what you’re doing, Mora-Blanco remembers thinking as they paced up and down the street. You’re going to be seeing bad stuff.

Almost a decade later, the video and the child in it still haunt her. “In the back of my head, of all the images, I still see that one,” she said when we spoke recently. “I really didn’t have a job description to review or a full understanding of what I’d be doing. I was a young 25-year-old and just excited to be getting paid more money. I got to bring a computer home!” Mora-Blanco’s voice caught as she paused to collect herself. “I haven’t talked about this in a long time.”

Source