I focused in my submission on especially serious online harms, including harassment and aggravated racial abuse, for a few reasons. First, they are the most relevant to the committee's work. Second, these are areas where we have a national competence as opposed to wider issues of, for example, platform regulation, which are largely a European competence and will be dealt with by the European Commission in the near future, as Mr. Lupton pointed out. Third, these are the areas where we can get a quick win, as it were, with clear things that we can do to address the most serious types of harm that individuals are facing.
Unfortunately, there is an element of déjà vu for me. The first time I attended an Oireachtas committee was in March 2013. The then Joint Committee on Transport and Communications was discussing issues of online safety. At that hearing, I stated that we needed more resources for the Garda and what is now the DPC, and that we had to make greater use of existing laws. Unfortunately, we are six years on and I will, to a large extent, be repeating some of those comments because the key legal position in Ireland has not changed significantly since then in terms of the statutes and enforcement structures available to deal with the most serious types of online harm.
I will first make a number of specific points about how we enforce the existing legal framework before speaking about how we might reform that framework. In practice, how do gardaí deal with complaints that come to them? If an individual tells gardaí that he or she has been the victim of the distribution of intimate images - so-called revenge pornography - or serious online harassment, how is that handled? In 2013, the Oireachtas hearings followed the suicide of a teenager from Donegal, Erin Gallagher, which was attributed to online bullying. She had been told by the gardaí whom she contacted that it was a civil matter and there was nothing they could do. Fast forward to today and committee members will be aware of the case of the Ryan and Mathis family, who have been subjected to horrific racial abuse online following advertising the family took part in for Lidl. Upon contacting gardaí, the family were told exactly the same thing, namely, that it was a civil matter and there was nothing the Garda could do. Since the case received publicity, however, a Garda investigation has been opened.
Even at the outset, though, an investigation could have been commenced into whether the conduct constituted the existing offence of harassment under the Non-Fatal Offences against the Person Act 1997. As Mr. Lupton pointed out, that is not a perfect offence and many elements of it could be reformed. However, gardaí at a local level appeared to be unwilling to commence criminal investigations in a timely manner. That is not necessarily an issue with individual gardaí but a systemic issue of training and resources. On occasion, individual gardaí have told the media that they have been unable to investigate these types of complaint because they do not have devices in their stations that enable them to view social media. In some cases, they have had to resort to doing so on their own personal devices.
The lack of resources is reflected in the resourcing provided to what is now the Garda National Cyber Crime Bureau. When I first testified on this issue in 2013, it was then the Garda Computer Crime Investigation Unit. Since then it has been rebranded, but the resources given to it have remained essentially unchanged. For the past decade, the number of gardaí within the unit has fluctuated from the low 20s to the high 20s but essentially has remained largely static at a time when the workload has grown exponentially.
The issue is that ordinary crimes have become cybercrimes, in the sense that if there is evidence on a laptop or phone which needs to be forensically examined, what was the Computer Crime Investigation Unit, now the cybercrime bureau, must be involved in that examination. The problem is that, with some recent and very limited exceptions, the resources given to the bureau have not kept pace. This has had the effect of crowding out investigations into online harassment, given the need to carry out forensic examinations for other types of crime also. When the head of the bureau, Detective Superintendent Pat Ryan, testified before the committee recently, he indicated that there was a need for 120 staff members within the Garda to deal with these issues, which represents roughly a fourfold increase on the current staffing level of approximately 30. To a very large extent, the argument that we need to reform the existing law must be weighed against the fact that we have failed to adequately resource enforcement of the existing law.
Related to this is the question of how gardaí access information on these crimes. There are two distinct issues. One is the question of international co-operation, that is, the use of mutual legal assistance treaties and so on to obtain information from firms which have their headquarters overseas. However, the issue on which I will focus is domestic access to information, that is, when gardaí have information on a particular IP address, for example, how they go about obtaining information on it from service providers such as Eir or Virgin Media. As members of the committee will be aware from recent hearings on the Communications (Retention of Data) Bill, this is done under the Communications (Retention of Data) Act 2011. Since the committee held its hearings on the heads of the Bill that would replace that legislation, the High Court has given its judgment in a case brought by Graham Dwyer to the effect that the core provisions of the legislation, that is, the provisions which require telecoms providers to store certain information and the provisions that allow gardaí to access that information, are contrary to European law. This decision has been stayed pending an appeal to the Supreme Court, which will be heard this December, but it is quite clear that the legislation is now a zombie. It is heading towards the Supreme Court, where it will inevitably be dispatched, be it by the Supreme Court itself or, ultimately, on a reference to the European Court of Justice. In the meantime, the decision has the effect of undermining both existing convictions obtained using the legislation and ongoing investigations. We must fault the Department of Justice and Equality for failing to respond in a timely manner to the judgments which found this type of legislation to be unconstitutional and contrary to the Charter of Fundamental Rights of the European Union, and for failing to provide adequate alternative enforcement mechanisms for gardaí.
If gardaí wish to access subscriber information or information on Internet use, the only way they can do so that will stand up to challenge later in the course of a prosecution is by obtaining either a search warrant or a production order from the District Court. There is no reason there cannot be a streamlined, expedited procedure which would meet the requirements of European Union law, for example by having an independent authority approve such requests. Without this, we are hampering investigations in two ways: we are slowing them down, and we are creating the risk that prosecutions will ultimately be unsuccessful, with convictions overturned or not secured, because the appropriate tools were not provided.
They are the points I wanted to make about these practical issues and how we enforce the laws we already have in place. I will also make two brief points about how we introduce new laws. A model that has been adopted, particularly in some of the Private Members' Bills in this area, is the Law Reform Commission's proposal in the form of the draft Bill attached to its report on harmful communications. As Mr. Lupton said, there is much in the report that could have been implemented a long time ago, and I agree with the majority of it. I do, however, have some concerns about the model it has recommended for the digital safety commissioner, which has essentially since been adopted, subject to some modifications, by the Department of Communications, Climate Action and Environment in the context of its proposed implementation of the audiovisual media services directive. The approach taken by the Law Reform Commission was to propose that a digital safety commissioner be established and that social media providers - online service providers generally - be required to adopt codes of practice prohibiting harmful content. A takedown mechanism would then be available, whereby individuals could apply to a provider to have material taken down and, if the provider failed to do so, would have a right of appeal to the digital safety commissioner. The difficulty with this is that the Law Reform Commission in its report did not define what was meant by "harmful content". It set out a few examples of harmful content, for example, material that would constitute the offence of harassment or material that would constitute a so-called revenge pornography offence, but it left the category open-ended.
Effectively, it would require providers to engage in a very subjective assessment of what constituted harmful content and would then leave the digital safety commissioner in the same position when faced with an appeal in determining what constituted harmful content. It is clear, as a matter of European law, under the European Convention on Human Rights, that this cannot be done. The convention requires that restrictions on rights such as freedom of expression be prescribed by law. The case law in this area makes it clear that this requires that there be a degree of predictability, that we should be able to say clearly what type of speech is and is not covered. The Law Reform Commission's proposals, as adopted, would fail to meet that criterion.
The second point, which in some ways is more fundamental because it is harder to fix, is that the Law Reform Commission's proposals do not address the question of procedural fairness. If an individual's posts are to be taken down from social media, for example, will he or she be given an opportunity to comment before the posts are taken down? Will he or she be given a right of appeal if they are taken down? Will there ultimately be recourse to a court in the event that they are taken down? Again, the standards of the European Convention on Human Rights are quite clear in that regard. If state bodies, as distinct from private actors, are to make these decisions, there must be - except in exceptional cases in which there is some special justification - notice, an ability to make representations, some redress mechanism and, ultimately, judicial oversight. The Law Reform Commission's proposals do not provide for this. There is no question of any individual affected by a take-down notice having the ability to appeal to a court. This can be contrasted with, for example, the Data Protection Act and the role of the Data Protection Commission, whereby individuals who are told that they must take down content because it is contrary to the data protection rights of another have an appeal mechanism open to them in the form of an appeal to the Circuit Court. From a procedural fairness aspect, therefore, we must consider whether the approach taken by the Law Reform Commission would stand up to challenge. In my view, it would not.
They are, in broad terms, the submissions I wanted to make on this issue. I am open to answering questions the committee might have.