
Offline PA

  • One man comedy gala

  • Joined: Jan 2008

  • Location: www.club-carbon.com
Algorithm 'identifies future trolls from just five posts'

A website commenter who will end up being banned for antisocial behaviour can be spotted with 80% accuracy simply by examining their first five posts, claim researchers.

It is possible to tell comment trolls apart from other users simply from looking at the way they write, researchers have found.
Studying the comments on three sites – CNN, Breitbart and IGN – over an 18-month period, the researchers at Cornell and Stanford universities found that users who went on to be banned wrote differently to other users in the same comment thread, using fewer words indicative of positive emotion.
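
As an illustration of how "fewer words indicative of positive emotion" might be measured, here is a minimal Python sketch that scores a comment against a small positive-emotion word list. The word list and helper name are stand-ins for illustration; the article does not say which lexicon the researchers actually used.

```python
import re

# Tiny stand-in lexicon; real analyses use validated word lists (e.g. LIWC-style categories).
POSITIVE_WORDS = {"good", "great", "love", "thanks", "happy", "nice", "agree", "helpful"}

def positive_emotion_rate(comment: str) -> float:
    """Fraction of a comment's words that appear in the positive-emotion lexicon."""
    words = re.findall(r"[a-z']+", comment.lower())
    if not words:
        return 0.0
    return sum(1 for w in words if w in POSITIVE_WORDS) / len(words)

print(positive_emotion_rate("Great point, thanks for the helpful link!"))   # relatively high
print(positive_emotion_rate("This is the dumbest take I have ever read."))  # 0.0
```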

Future banned users also tended to write comments that were more difficult to read than those of typical users, the researchers found.
“We find that such users tend to concentrate their efforts in a small number of threads, are more likely to post irrelevantly, and are more successful at garnering responses from other users,” the researchers add, in a pre-publication paper titled Antisocial Behavior in Online Discussion Communities.

“Studying the evolution of these users from the moment they join a community up to when they get banned, we find that not only do they write worse than other users over time, but they also become increasingly less tolerated by the community.”
The researchers also discovered that antisocial behaviour was exacerbated when moderation appeared to be overly harsh.

The researchers studied more than 35m posts from almost 2 million users on the three websites under investigation, and found nearly 50,000 individual users who had been banned over the 18-month period. They also examined the number of individual comments that had been deleted or reported to the sites’ moderators, with all the data provided to the researchers by Disqus, the commenting platform used by all three sites.

They focused their investigation on the 50,000 users banned over the period under examination, and attempted to find tell-tale signs in their prior posts that acted as an indicator of their later behaviour.

They discovered that users who would end up being banned from the site often wrote noticeably differently from the main bulk of commenters. “Users can stay on-topic or veer off-topic; prior work has also shown that users tend to adopt linguistic conventions or jargon in a community … and that they also unconsciously mimic the choices of function-word classes they are communicating with.” Sure enough, they found that the “text similarity” of banned users was significantly lower than that of non-banned users.
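
The article does not spell out how “text similarity” was computed; a common way to approximate it is the cosine similarity between a user's comment and the other comments in its thread in TF-IDF space. A minimal sketch, with a made-up thread (scikit-learn assumed):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def similarity_to_thread(user_comment: str, other_comments: list[str]) -> float:
    """Mean cosine similarity between one comment and the rest of its thread (TF-IDF space)."""
    docs = [user_comment] + other_comments
    tfidf = TfidfVectorizer().fit_transform(docs)
    return float(cosine_similarity(tfidf[0], tfidf[1:]).mean())

thread = [
    "The new brake package makes a real difference on track days.",
    "Agreed, pad choice matters more than most people think.",
]
print(similarity_to_thread("Which brake pads are you running on track days?", thread))  # on-topic: higher
print(similarity_to_thread("You are all idiots, wake up sheeple!!!", thread))           # off-topic: near zero
```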

Additionally, the posts of banned users had similar word counts to those of non-banned users but, when tested against a standard readability index, were revealed to be significantly harder to read.
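
The article does not say which readability index was used; one standard choice is the Automated Readability Index, which needs only character, word and sentence counts, with higher scores meaning harder-to-read text. A minimal sketch of that comparison:

```python
import re

def automated_readability_index(text: str) -> float:
    """ARI = 4.71*(chars/words) + 0.5*(words/sentences) - 21.43; higher = harder to read."""
    words = re.findall(r"[A-Za-z0-9']+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not words or not sentences:
        return 0.0
    chars = sum(len(w) for w in words)
    return 4.71 * (chars / len(words)) + 0.5 * (len(words) / len(sentences)) - 21.43

print(automated_readability_index("The car is fast. I like it a lot."))  # easy text: low (even negative) score
print(automated_readability_index(
    "Notwithstanding the aforementioned considerations, the administration's "
    "intransigence precipitated interminable acrimonious recriminations."
))  # hard text: much higher score
```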

On top of the information found in the actual posts, the authors also found that users who would go on to be banned interacted differently with the community at large. “For instance, [future banned users] tended to spend more time in individual threads than [users who weren’t banned],” they write.

With all the information together, they created a prediction model which can guess, from just a user’s first five posts, whether or not that user will go on to be banned, with 80% accuracy. Looking at the first 10 posts raises the accuracy of the model by a further two percentage points, which raises the possibility of automatically highlighting potentially problematic users to moderators so that antisocial behaviour can be dealt with more quickly.
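
The article gives the headline accuracy but not the model itself; one simple way to picture such a setup is a logistic-regression classifier over per-user features computed from each user's first five posts. Everything below (the feature list, the random data) is illustrative only, not the researchers' actual pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# One row per user, from their first five posts. Hypothetical feature columns:
# [mean readability, mean similarity to thread, positive-emotion rate,
#  fraction of posts deleted by moderators, replies received per post]
X = rng.normal(size=(2000, 5))
y = rng.integers(0, 2, size=2000)  # 1 = later banned, 0 = not (random placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Rather than auto-banning, flag only high-probability users for a human moderator to review.
flagged = clf.predict_proba(X_test)[:, 1] > 0.9
print("users flagged for review:", int(flagged.sum()))
```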

But the authors warn that overzealous moderation can have its own downside: “Taking extreme action against small infractions can exacerbate antisocial behaviour (e.g. unfairness can cause users to write worse) … Whereas trading off overall performance for higher precision and having a human moderator approve any bans is one way to avoid incorrectly blocking innocent users, a better response may instead involve giving antisocial users a chance to redeem themselves.”

Source.

Offline fivesix


  • Joined: Jun 2007

  • Location: TMBA / BNE / MEL
This forum has had/has its fair share... :p


