No Penalties for Negative Comments on Social Networks of Montenegrin Media

In Montenegro, media outlets bear no legal responsibility for comments posted on their social network accounts.

Author: Teodora Đurnić / Photo: Pixabay

At the moment, media outlets in Montenegro bear no legal responsibility for comments on their social networks, so it is often debated whose job it is to remove such comments.

Since the Law on Media of Montenegro was adopted in 2021, the number of comments containing illegal content in online media has decreased. The law requires an online publication to remove illegal content in readers’ comments within 60 minutes of it being reported. Despite this, the social network accounts of online media still carry a large number of such comments, which the law does not oblige the media to moderate.

A journalist and member of the working group for amending the media laws, Vesna Rajković Nenadić, describes this legal provision as good practice.

“This is a good solution because comments must be the responsibility of the media. The latest research by the Media Institute, ‘Insults and Hatred in the Montenegrin Media’, shows that after the adoption of the law, the number of comments containing controversial content has decreased, but there are still some even in the established media and moderation needs to be further improved. Media outlets must have trained administrators if they want their platforms to be a place for dialogue, not for spreading hatred and controversial speech,” she says.

Article 25 of the Law on Media specifies that the founder of an online publication is obliged to remove comments with illegal content no later than 60 minutes after receiving a report. Illegal content refers in particular to comments that provoke, spread, incite or justify discrimination, hatred or violence against a person or group of persons based on their personal characteristics, political, religious and other beliefs, xenophobia, racial hatred, anti-Semitism or other forms of hatred rooted in intolerance, including intolerance expressed as nationalism, discrimination and hostility against minority peoples and other minority national communities. Fines for failure to remove such comments range from 1,000 to 8,000 euros.

We sent a query to the Police Directorate about the number of reported violations of Article 25 of the Law on Media this year and last year, but we did not receive a response.

Montenegrin news websites opt for pre-moderation

According to Similarweb statistics, the Vijesti website is the most read news website in Montenegro and perhaps the only one that has a person specifically hired to moderate comments. The portal’s moderator, Branko Čupić, says that the Vijesti website was forced to hire a moderator due to the increased number of comments.

“The way it works now is that comments are not released to the website before they are reviewed, in contrast to an earlier period when comments were posted immediately and subsequently removed. Certainly, this method is a positive aspect of moderation, because it prevents comments with inappropriate content from even appearing on the website,” says Čupić.

Čupić explains that the moderator’s work is made easier by a list of inappropriate words built into the system: when the program recognizes one of them in a comment, it removes the comment automatically.

“How necessary, indeed vital, moderation is, is also shown by the fact that on our website the ratio of approved to unapproved comments is on average 2:1. To be more precise, we delete nearly one-third of incoming comments, and sometimes that number is higher,” he said.

The public service broadcaster Radio and Television of Montenegro (RTCG) also pre-moderates comments. However, it does not have a person hired specifically for this task.

“At our media outlet, there is no person hired exclusively for moderating comments on the networks. Such a thing would often not be feasible anyway, even if three people were engaged at the same time, since a large number of comments arrive, sometimes even in an organized manner (bots),” said the head of RTCG’s Multimedia Center (MMC), Rada Brajović.

When it comes to channels through which inappropriate speech is spread, Milica Bogdanović, researcher at the Media Institute of Montenegro, points out that illegal comments are less prevalent in established media (such as the Vijesti, CDM and RTCG websites), although they can still be read there, too.

“If we take into account the total number of comments posted daily by administrators at established media outlets, then the percentage of controversial postings is small. On the other hand, in right-wing media, and here I am primarily referring to IN4s because of its high readership, we get the impression that this media outlet does not filter comments and that its content additionally encourages the spread of insults and hate,” says Bogdanović.

Rajković Nenadić also notes that it is particularly worrying that some media outlets in Montenegro which do not remove illegal comments are not registered with the Agency for Electronic Media, so it is not known who their owners are and who is responsible for the posted content.

Who is responsible for removing comments on social networks?

Since Montenegrin law currently assigns media outlets no responsibility for comments on their social networks, the question of whose job it is to remove such comments remains open.

“Our colleagues in the established media argue with those of us who monitor their work; they believe that what users post on their social networks is not their responsibility and that they lack the capacity to delete illegal content from their accounts. We firmly maintain that, for sensitive topics, they can refrain from sharing articles on social networks that would further encourage inflammatory communication, or they can monitor their accounts more closely, delete comments and foster positive communication on Facebook and other social networks,” says Bogdanović.

Facebook, for its part, has rules that specify what is prohibited for users, such as hate speech and harassment. Although Facebook’s reports show that the company automatically removes a large amount of content that violates its standards before publication, research shows that much such content still remains on the platform, especially in non-English-speaking regions.

Montenegro’s representative at the European Court of Human Rights, Valentina Pavličić, believes that removal of illegal comments is the moral responsibility of journalists.

“Currently, there is no legal responsibility of media outlets for comments on their social networks, but there is a moral obligation of every journalist to monitor content that is publicly posted on a certain internet platform managed by the journalist and to react effectively in cases where certain content constitutes clearly illegal speech, hate speech or incites any form of violence, thereby contributing to greater tolerance and preservation of peace in society, as well as protection of basic human rights and freedoms,” says Pavličić.

The case of Sanchez v. France

Pavličić also mentioned the case of Sanchez v. France, which raised a new issue in the practice of the European Court – the issue of criminal responsibility of the owner of a Facebook profile for comments posted on their profile “wall” by third parties.

Pavličić said that the case concerned a local councilor who was a candidate in parliamentary elections and who did not delete from his Facebook account (which he used in the election campaign) the comments of a person who was inciting hatred and violence against Muslims. She said that because of this, the applicant had been sentenced by the French national courts to a fine of 3,000 euros for inciting hatred or violence against a group or individual based on their origin and belonging to a certain nation, religion or race.

She says that the European Court of Human Rights emphasized that the domestic courts had convicted the applicant for his failure to take immediate measures to delete the illegal comments.

“The applicant knowingly set the wall of his Facebook account to public, allowing his friends to post comments on it. Therefore, he had a duty to supervise the posting of comments and was responsible for their content. In addition, the Criminal Court had emphasized that the applicant must have been aware that his profile was likely to attract comments of a political nature, which by definition were polemical, and therefore should have scrutinized them with increased care. The Court of Appeal, in a similar way, considered that his status as a politician required even greater attention and caution on his part,” said Pavličić, stating that it was also problematic that the controversial comments remained visible for over six weeks.

Negative comments harmful to the individual and community

If a comment is not removed in time, it can also have emotional consequences. As psychologist Adriana Pejaković says, especially problematic are comments with an “aggressor effect”, which can further traumatize victims.

“Under posts where rape victims reported their aggressors, we can often read inappropriate comments at their expense, such as ‘they asked for it’, ‘how were they dressed’, etc. Such occurrences can re-traumatize the victim and spread intolerance and prejudice. Also, political discussions on social networks, for instance, often turn into arguments, the spreading of prejudice and insults on national and religious grounds, and this violence, unfortunately, can spill over into the street and can especially affect young people,” she said.

Pejaković points out that when we read a large number of such comments, we get the impression that most people think that way, even though it is actually a specific sample that is not representative of the entire population.

“In order for our social space, which includes both online and real space, not to be polluted, it is important to cultivate it and remove such comments. These are mostly comments that contain insults, curses, threats, aggression… For a small number of people with serious problems who do not want to communicate adequately, banning problematic communication is the only way. In this way, we prevent the already mentioned consequences for victims, possible spread and spillover of toxic communication outside the online space and broader conflicts,” she says.

Positive practices of Germany and Austria

Pavličić cited the positive practices of Germany and Austria, to date the only European Union countries to have adopted legal provisions against hate speech on social networks. She notes that France had also passed a similar law, but it was declared unconstitutional for conflicting with the right to freedom of expression.

Pavličić explains that Germany was the first country in Europe to introduce, in 2017, a law establishing legal mechanisms to combat hate speech and the resulting crimes on social networks: the NetzDG, the so-called “Facebook Law”.

“This law was amended last year to simplify the procedure for submitting complaints about illegal content, while obliging platform operators, i.e. social network providers, to publish transparency reports every six months,” she pointed out.

However, the free-expression organization ARTICLE 19 describes this German law on its website as particularly problematic. It says the law’s provisions are vague and overly broad, without clear definitions, and that since the NetzDG came into effect there has been excessive blocking and censorship of legitimate speech, including satire and political speech, without any legal remedy.

The Austrian Communication Platforms Act is based on the German law.

“This law applies to all domestic and foreign communication platforms that had more than 100,000 users in the previous year and a turnover of more than 500,000 euros,” she specified.

She also said that both laws aim to make the deletion procedure simpler and more transparent and to transfer responsibility to social network providers, but that a single European law, the Digital Services Act, may soon replace these national laws and take precedence over them.

How can better practices be implemented in Montenegrin media?

When it comes to the legislative framework in Montenegro, Pavličić believes that the recent adoption of the Media Strategy and the mentioned legislative changes represent progress in this area, but it is necessary for the state to systematically approach the issue of increasingly pronounced hate speech and harmonize its legislation and practice with international standards.

“I believe that it is necessary to strengthen the control function of the state and establish effective mechanisms for monitoring cyberspace and social networks, which are suitable for spreading hate speech. In this regard, it is necessary to strengthen mechanisms for identifying and processing hate speech on social networks and clearly define the competences of state bodies and institutions for suppressing this occurrence,” she says.

Brajović confirms that comments on social networks are very often a source of hate speech, something she witnesses every day under media posts, especially on Facebook.

“Definitely, the area related to moderation of comments on social networks should be better regulated, but the experiences of colleagues from foreign media also show that this segment is extremely challenging for moderators, considering the number of comments and the reach of a post and, on the other hand, the insufficient number of people in newsrooms. Before legal solutions/penalties, we need to come up with a model for moderating comments on networks that will work,” said the head of the MMC Center.

Rajković Nenadić says that the working group for amending media laws has not yet dealt with this subject.

“I think we should wait with the codification to see what stand the ECHR and the EU will take on this, especially now, after Russia’s aggression against Ukraine, when we see an intention to regulate some issues differently. However, I think that the more serious media must have internal rules and regulate this issue themselves. I was recently a guest of the most visited news website in Slovenia, and when asked whether they moderate comments on their social network accounts, the editor answered: ‘Well, of course, that’s our responsibility.’ Therefore, good practices should be followed,” she says.

Pavličić proposed campaigns that would raise awareness of the harmfulness of hate speech and the need to eliminate it, but said that it is extremely important that restrictions do not threaten freedom of expression.

“In this regard, I conclude that the adoption of some legislative solutions related to hate speech and discrimination is encouraging, but it is necessary for the state to make additional effort to limit the consequences of the spread of misinformation and harassment through hate speech in the online space, whereby care must be taken regarding protection of freedom of expression and it must be ensured that it is not disproportionately limited, and also that the imposed sanction does not produce a chilling effect on people’s willingness to publicly express their views on a certain issue of public interest,” said the Montenegrin ECHR representative.

This text was created as part of the mentoring programme Solutions and Innovations in Media, which is implemented by Mediacentar Sarajevo and the Association Zašto ne, with the financial support of the embassies of the Kingdom of the Netherlands in the Western Balkans region.

Translation: Kanita Halilovic