How we increased the comments on Blick.ch by over 500 percent

The stories we tell on Blick.ch always spark intense discussions. In 2018, we reshaped our approach to online comments. Here’s how we think about our comment section, and how we built an in-house moderation team.

Janosch Troehler
Blick.log()

--

Online debates are often loud and aggressive, and publishers don’t know how to address the problem. Two years ago, the situation on Blick.ch was the same as in many newsrooms: the comment section was perceived as a cesspit of hate, trolling, and, frankly, a waste of time. Nobody felt responsible for its quality, so bulk deletion was overused.

At the same time, the realization grew that user engagement via the comments is valuable to our journalism. Commenters are among our most loyal users; they spend more time on our platform. Moreover, the comments affect the general perception of a news platform: if the debates are rude or even hateful, the platform’s reputation suffers as well. The editors-in-chief also recognized the importance of the comment section and were ready to invest.

That’s why we started acting. Here’s how we developed a new team, process, and mindset step-by-step.

Step 1 — Hiring a team of moderators

In August 2018, we reshaped the whole moderation setup. Our goal was to build a dedicated team to moderate the comments. There were two options: contracting an external company or building an in-house team. Both options would have cost us roughly the same.

Because we felt it was necessary to stay as close as possible to the comment section and, therefore, to our users, we decided to hire our own team. We aimed to create a diverse group of different ages, cultural backgrounds, and life experiences.

Additionally, we introduced a new timeframe for moderation. Instead of covering roughly six hours a day, we organized moderation into four shifts running from 7 a.m. to 11 p.m., seven days a week.

Step 2 — Developing a new moderation tool

The newly hired team’s task was not only to decide whether a comment gets published but also to scan for potentially useful information. We implemented the idea of comments as a resource for our journalism.

However, the moderation tool was a mess. On the one hand, its performance was staggeringly bad. On the other, it had been developed before we introduced the new process: moderators had to take screenshots of potential sources and additional information. It wasn’t an efficient workflow.

The new moderation tool.

So we went back to the drawing board. Within a month, we came up with a brand-new tool that met the new requirements. Now moderators can easily share input from the comment sections with our journalists. Since then, comments have contributed countless times to our reporting, from small corrections to exclusive stories.

Step 3 — Building a helping algorithm

Also in autumn 2018, we started training an algorithm developed by our data team. We fed it 500'000 existing comments. Every month, we added new data and checked the algorithm’s decisions against the moderators’.
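That monthly check can be sketched as a simple agreement report: compare what the algorithm decided with what a moderator decided for the same comment. The function, decision labels, and toy sample below are illustrative assumptions, not our actual pipeline.

```python
from collections import Counter

def agreement_report(pairs):
    """Compare algorithm decisions against moderator decisions.

    `pairs` is a list of (algorithm_decision, moderator_decision)
    tuples, each decision being "publish" or "reject".
    Returns the overall agreement rate and a confusion counter.
    """
    confusion = Counter(pairs)
    agreed = sum(n for (algo, mod), n in confusion.items() if algo == mod)
    total = sum(confusion.values())
    return agreed / total, confusion

# Toy monthly batch (a real batch would hold thousands of comments).
sample = [
    ("publish", "publish"),
    ("publish", "reject"),   # algorithm too lenient here
    ("reject", "reject"),
    ("reject", "publish"),   # algorithm too strict here
    ("publish", "publish"),
]
rate, confusion = agreement_report(sample)
print(f"agreement: {rate:.0%}")  # agreement: 60%
```

Tracking the two disagreement directions separately matters: letting a toxic comment through and blocking a harmless one are different failure modes with different costs.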

In August 2019, we built the algorithm into our moderation process. Every comment gets a score from 0 to 1 and, depending on the score, a color. Green means the comment can most likely be published. Yellow indicates uncertainty. And red flags very toxic language.

After some iterations, the algorithm now deletes the most toxic comments automatically. This brought two significant advantages: the team no longer has to deal with every single comment, and it helps moderators’ mental health not to have to wade through the most violent posts.
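The triage described above can be sketched as a small routing function. The cut-off values (0.3 and 0.8) are invented for illustration; the article does not publish the real thresholds, which would be tuned against moderator decisions.

```python
def triage(score, auto_delete_threshold=0.8, review_threshold=0.3):
    """Map a toxicity score in [0, 1] to a traffic-light color and action.

    Thresholds are illustrative assumptions, not Blick.ch's real cut-offs.
    """
    if score >= auto_delete_threshold:
        return "red", "auto-delete"       # very toxic: removed automatically
    if score >= review_threshold:
        return "yellow", "manual review"  # uncertain: a moderator decides
    return "green", "publish"             # most likely fine

for score in (0.05, 0.5, 0.95):
    color, action = triage(score)
    print(f"{score:.2f} -> {color}: {action}")
```

Keeping the thresholds as parameters makes it easy to retune them as the monthly comparison against moderator decisions accumulates.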

Results: Shifted perception and rising numbers

These three steps had a meaningful impact. For one, the perception of the comment section has shifted to a more positive level, driven by the value we can extract from users’ comments. We can often get in touch with people who have new information or unique stories. If we didn’t pay attention, we would simply miss this opportunity.

Furthermore, we observe a steady increase in the number of comments we receive every month. Since January 2019, we have had detailed insights into the development of the comment section. We started at a modest 39'000 comments, but the number of incoming comments has risen every month since. In April 2020, we reached a record of over 215'000 comments, an increase of more than 500 percent. The latest jump is possibly linked to the Covid-19 crisis.

The rising numbers can also be traced back to the in-house moderation. The setup allows us to moderate faster and thereby spark livelier discussions. To deal with the growing volume, we had to ramp up personnel: two moderators now cover some shifts simultaneously.

Where we go from here

Although the concept has proven successful, we’re nowhere near the finish line. We still get a lot of toxic comments, and debates are not as constructive and civil as they could be.

Therefore, the community team is working with developers on new technical features that might increase quality or efficiency. Moreover, we want to merge the toxicity score and information on users’ gender with existing analytics data. We hope to gain further insights into how online debates work and to develop countermeasures.

We’re also collaborating with other organizations to fight hate speech. For example, we provided the project ‘Stop Hate Speech’ with 250'000 comments to train their public algorithm. Furthermore, we’ve exchanged experiences with e-commerce companies and other newsrooms.

However, the next big step will be the introduction of so-called post-moderation, where we actively participate in the comment section.

Janosch Troehler is Head of Community at Blick.ch and responsible for user engagement throughout the whole platform.
