Are you being a trolling slimeball on Periscope?
Get ready to face a flash mob jury of your peers who can shut you down in a matter of seconds.
Periscope, Twitter’s livestreaming app, on Tuesday announced that it was handing over comment moderation to users.
This is a marked break from the status quo, where platforms such as Twitter and Facebook take it upon themselves to be comment cops, determining what content breaks their policies. They have “report” buttons, but the platforms’ moderation processes can take a while.
Besides the time lag, Periscope says, the people best suited to deem whether somebody’s out of line are those in the conversation.
From the announcement:
People in a broadcast are best suited to determine what’s okay and what’s not. Context matters – for example, a comment that might be okay in a comedy broadcast might not be okay somewhere else.
As it is, Periscope is live, unfiltered and open. The positive side of that openness is it enables broadcasters to rapidly get discovered and amass large public followings.
But, as Periscope notes, it can increase the risk of spam and abuse.
Here’s how the “jury of your livestreaming peers” will work:
- During a broadcast, viewers can report comments as spam or abuse. The viewer that reports the comment will no longer see messages from that commenter for the remainder of the broadcast. The system may also identify commonly reported phrases.
- When a comment is reported, a few viewers are randomly selected to vote on whether they think the comment is spam, abuse, or looks ok.
- The result of the vote is shown to voters. If the majority votes that the comment is spam or abuse, the commenter will be notified that their ability to chat in the broadcast has been temporarily disabled. Repeat offenses will result in chat being disabled for that commenter for the remainder of the broadcast.
All of this will happen in a matter of seconds. It’s also opt-out: if you don’t want to participate, you can opt out of voting in the app’s settings, and you can also opt out of having your broadcasts moderated.
The initial time-out period for commenters deemed abusive will be 60 seconds. A second guilty verdict will lead to the commenter being banned from the broadcast entirely, according to Recode.
Senior Periscope engineer Aaron Wasserman told Recode that Periscope is opting for a user-based moderation model because it matches the app’s fast-moving comment stream:
These comments are gone almost as quickly as they appear, and the damages happen that quickly.
Periscope is choosing to give users a second chance, he said, rather than banning them for good, because it has faith in their ability to learn how to behave:
It was really important for us to offer a path to rehabilitation. We’re actually inviting you to stick around and do better next time.
The flash mob voting will work in tandem with other reporting tools that let users report ongoing harassment or abuse, block and remove people from broadcasts, or restrict comments to people they know.
The new voting procedure began to roll out on Tuesday as part of a free app update.