Online harm is currently a major topic of concern, with a growing view that tech companies need to do more to combat harms such as child abuse and incitement to terrorism. In response, Ireland has now published its Online Safety and Media Regulation Bill. The Bill covers online safety and transposes the recast EU Audiovisual Media Services Directive (AVMS).

The Bill aims to pave the way for a new watchdog to regulate online services and to reduce the availability of harmful content.  The Irish government has approved the recruitment of an Online Safety Commissioner, who will form part of a new regulator, a multi-person Media Commission. This new body will be responsible for overseeing updated regulations for broadcasting and video on-demand services and the new regulatory framework for online safety created by the Bill.

The Media Commission will also have roles in relation to the protection of children, research, education, media literacy, and journalistic and creative supports. There will be a “super-complaints” scheme under which nominated bodies, including expert NGOs in areas such as child protection, can bring systemic issues to the attention of the Media Commission.

It will be the role of the Online Safety Commissioner to oversee the regulatory framework for online safety. As part of the framework, the Commissioner will devise binding online safety codes that will set out how regulated online services, including certain social media services, are expected to deal with certain defined categories of harmful online content on their platforms. The codes will cover content moderation, complaints handling, recommendation systems, reporting and advertising. The defined categories of harmful online content include criminal material, serious cyber-bullying material and material promoting self-harm, suicide and eating disorders.

The Online Safety Commissioner will have a range of powers to ensure compliance, including the power to issue notices to end non-compliance, to seek the prosecution of senior management of designated online services providers for failure to comply with such a notice, to seek to block access to certain online services, and to issue content limitation notices to designated online services providers in respect of individual pieces of harmful online content. If a company fails to comply with a relevant online safety code, the Media Commission will have the power, subject to court approval, to sanction the non-compliant online services provider, including through financial sanctions of up to €20m or 10% of turnover.

The Media Commission will take on the current functions of the Broadcasting Authority of Ireland and regulate both television and radio broadcasters. The Commission will also be responsible for the regulation of video on-demand services. The regulations that apply to these services will be set out in Media Codes and Rules and will address issues such as programme standards, advertising, sponsorship, product placement, accessibility and other matters.

Under the AVMS, there will also be a new 30% quota for European Works in the catalogues of video on-demand services. A 50% European Works quota already applies to transmission time for television broadcasters. There will also be greater flexibility regarding advertising placement for television broadcasters, moving from a 20% hourly limit to a 20% limit applied across certain blocks of hours, for example between 6pm and midnight.

The Bill addresses the majority of the recommendations made by the Joint Oireachtas Committee on Tourism, Culture, Arts, Sports and Media in its pre-legislative scrutiny report on the General Scheme of the Bill. The Committee's report included recommendations in relation to better defining harmful online content, reporting requirements for online services providers, a greater role for the Media Commission in education, and the independence and resourcing of the Commission.

A number of further recommendations will be added to the Bill at Committee stage. These include the recommendation to provide for an individual complaints mechanism for harmful online content. These recommendations raise a number of complex practical and legal issues, including in relation to scalability, due process and timeliness. To ensure that these matters are fully considered, an expert group will be established to examine them, along with best practices adopted by other regulators. The group will report back with its recommendations within 90 days.