By Anastasia Karagianni*

Social media platforms offer everyone the opportunity to connect, to express their opinions freely and to stay informed. In this way, information flows continuously, as it should. Sometimes, though, information can be dangerous. What is commonly called hate speech could be defined, in line with European legislation, as “every form of expression that disseminates, incites, promotes or justifies racism, xenophobia, anti-Semitism and other forms of hatred based on intolerance, including intolerance expressed through aggressive nationalism and ethnocentrism, and through discrimination and hostility towards minorities and immigrants”.

The context that defines an act as hate speech may be based on the character and popularity of the speaker, the emotional state of the audience, the content of the act itself as incitement to hatred, the social setting in which the act takes place and the means used for its dissemination, including the language adopted. A decisive role in confronting hate speech was played by Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law, which provides the basis for cooperation between Member States, most notably in Article 1(1), Article 3 and Article 4.

However, a conflict between human rights sometimes emerges in the context of hate speech. Specifically, under the special regime for the protection of children established in the European Union, Article 3 of the United Nations Convention on the Rights of the Child, concerning the welfare of the child, stands in tension with Article 13, on freedom of expression and information. Thus, freedom of expression and speech seems to conflict with protective measures that confine children’s access to certain activities on the internet. Despite this conflict, children’s protection and freedom of speech converge on the necessity of protecting fundamental human rights, which rest on the fundamental values of human autonomy and dignity.

To address hate speech on the internet, the European Commission agreed with Facebook, Microsoft, Twitter and YouTube in May 2016, and later in 2018 with Instagram, Google, Snapchat and Dailymotion, on the adoption of a Code of Conduct, under which these platforms give users the opportunity to report hate speech incidents, reinforcing social support and coordination with national authorities. The platforms also agreed to review users’ notifications against European and national legislation on hate speech and committed to remove, where necessary, content assessed to be against the law.

Nevertheless, different risks may call for different measures. Sonia Livingstone distinguishes four types of risk to children: commercial risks, aggressive risks (attacks or violence against children), sexual risks (abuse, exploitation and the like), and risks to values, such as hate speech. These risks are further distinguished according to the child’s role: as recipient, as participant, or as actor. Both distinctions highlight the importance of child-friendly policy making.

Such a policy can be found in Articles 6 and 8 of the new General Data Protection Regulation (GDPR), as well as in Recital 58. In more detail, Articles 6 and 8 GDPR introduce parental consent, that is, consent given by the holders of parental responsibility, as a way of legitimising the processing of children’s personal data on the internet. The age of 13 is the threshold that dictates whether the processing of children’s personal data is subject to fewer legal restrictions.

In practice, this divides children into two age groups: children between 13 and 16 years of age, who are able to give their own consent to the processing of their personal data, and children between 0 and 13, whose activity on the internet depends on parental consent. Drawing such a strict line conflicts with the stages of children’s physical and social development. Moreover, parental consent must each time be assessed from a legal standpoint: whether the measure is, in the case at hand, proportionate and whether it can be reconciled with the human rights framework.

Parental consent is, in some cases, at odds with children’s right to participate in decision-making processes concerning them, a right protected under the United Nations Convention on the Rights of the Child and also safeguarded in the European Union and its Member States. The child’s rights to freedom of expression and to private life could be undermined if children’s access to information is restricted and made dependent on their parents. Furthermore, the scope of their right to privacy shrinks, as parents have to intrude into their children’s private sphere in order to take the corresponding decisions, for example when a profile is created on social media. Accordingly, parental consent occasionally infringes the fundamental principles of human rights law established by the Convention.

Even so, the role of parents is undoubtedly important and decisive for the protection of the child. Despite being “children of the digital age”, children do not possess fully developed digital skills. According to a recent EU Kids Online study, even though 43% of children believe they know more about the internet than their parents, they lack digital skills such as blocking unwanted communication, changing privacy settings on social media and critically assessing the information they have access to.

To sum up, social media platforms are among the most important players in the online marketplace, and their business model is based on the processing of users’ personal data. A large and active part of their user base consists of children, who depend on the presence of these big companies in their everyday life and develop a strong consumer relationship with them. The existence of these Codes of Conduct is very important, as it supplements existing legal provisions and offers a higher level of safety. Equally important is the use of social media for children’s personal and social development. Thus, a fair balance must be struck between freedom of expression and the protection of children.

* Anastasia Karagianni is a lawyer specialising in children’s digital rights. She is a member of Homo Digitalis and co-creator of ChildAct, an initiative aimed at protecting children’s digital rights. On the 8th of November she represented Homo Digitalis at the session on “Facebook and other social risks”, held at the European Parliament.