To report or not to report? That is the cyberbullying question

We have all lived through moments when we had to intervene. Perhaps it was in response to a child being bullied on the playground. Or maybe it was after we saw a peer being treated unfairly in the workplace. As the saying goes, “If you see something, say something.”

However, what happens when bullying is happening over the internet and social media? 

Considerable research has been conducted on what motivates whistleblowers to speak up when they see something inappropriate at work, but little research examines how bystanders respond to inappropriate online comments. Context is everything, however, and a playground or office setting is very different from Twitter or Facebook.

Jason Thatcher, the Milton F. Stauffer Professor of Management Information Systems at the Fox School, is interested in what motivates bystanders to report social media harassment to platforms.

“What we know from whistleblowing is it takes communities to help solve problems. Unfortunately, social media platforms do not seem to do a good job of encouraging users to report instances of abuse,” Thatcher says. 

According to Thatcher, online bullying is viewed very differently from in-person bullying. Online users are often dissuaded from reporting instances of cyberbullying because they don’t know what happens after a report is filed. They also do not know whether reporting is confidential online, as it is in other settings such as anonymous reporting hotlines.

This topic is explored in depth in Thatcher’s new journal article, “Standing up or standing by: Understanding bystanders’ proactive reporting responses to social media harassment,” which was recently accepted for publication in Information Systems Research. The article was co-authored by Randy Wong and Christy M.K. Cheung of Hong Kong Baptist University, as well as Bo Sophia Xiao of the University of Hawaii.

Thatcher says that one of the best tactics for combating cyberbullying is encouraging others to intervene. However, Thatcher and his peers found that this is easier said than done.

Their review of the bullying literature reinforced that point: bullying online is perceived very differently from bullying in person.

Using a contextualized intervention framework, their study looks at data gathered from 291 active Facebook users. The goal was to better understand why individuals choose either to report or not report instances of cyberbullying. They found that four factors play a role.

  • Users perceive there is an emergency or someone is in peril due to a harassment incident.
  • Users feel a responsibility to report; they feel it is the right thing to do.
  • Functionality is key; if the reporting feature is intuitive or easy to find, users are more likely to use it.
  • Users are more likely to report cyberbullying if they think it will make a difference; if they don’t believe anything will change, they are less likely to report it. 

The findings, however, also illustrate why reporting does not happen as frequently online. Reporting bullying to social platforms is not always easy, and platforms do not share how they respond to reports in any consistent way.

“If you look at social media platforms, it’s not intuitive or easy to submit a report, and every platform has a different reporting mechanism,” Thatcher says. “This study shows that if users have strong beliefs about their ability to report, then they’re much more likely to do it. If social media platforms really believe reporting is the key to stopping cyberbullying, then they need to build users’ beliefs about their ability to report while at the same time making filing reports easier to do.”

In the study, Thatcher and his colleagues also asked users if they would be more likely to report if it was anonymous and there was no fear of repercussions. The answer was a resounding yes.

Overall, Thatcher concludes that the study shows users want to help report and combat cyberbullying. That said, they could use the platforms’ help to do so.

“As we build out platforms in the future, we need to train users on reporting mechanisms. For instance, what if when you signed up for Facebook or Twitter, you then had to take a quick training on how to report abuse? That would help a lot. Then, what if each social media platform would release a quarterly or annual report that details all of the instances of reporting that happened and outlines what actions were taken,” Thatcher says. “I know that this feels a bit obvious, but the obvious answers are sometimes the right ones.”