
Joint Committee on Justice and Equality debate -
Wednesday, 23 Oct 2019

Online Harassment and Harmful Communications: Discussion (Resumed)

The purpose of today's meeting is to conclude our series of engagements on the issue of online harassment and harmful communications.

We are joined from the Irish Council for Civil Liberties, ICCL, by its executive director, Mr. Liam Herrick, and its information rights programme manager, Ms Elizabeth Farries; from SpunOut.ie by its chief executive, Mr. Ian Power, and Mr. Jack Eustace, governance and policy officer; from the National Anti-Bullying Research and Resource Centre by the UNESCO Chair on Bullying and Cyberbullying, Professor James O'Higgins Norman, who is joined by a senior research fellow, Dr. Mairéad Foody, and a postdoctoral researcher, Dr. Tijana Milosevic; and, last but by no means least, from Dublin Rape Crisis Centre by its chief executive, Ms Noeline Blackwell, and its policy officer, Ms Shirley Scott. I thank the Dublin Rape Crisis Centre for taking up the late opportunity to come before the committee. The witnesses are all very welcome. I will call on them to make their opening statements in the order in which I have introduced them. If I get the name of the person leading off on an opening statement wrong, I ask the witnesses to correct me and put me on the right path.

I must ask not only all the witnesses but everybody here to put their mobile phones on silent.

I must draw the witnesses' attention to the usual position on privilege. I ask them to note that they are protected by absolute privilege in respect of the evidence they are to give to the committee. However, if they are directed by the committee to cease giving evidence on a particular matter and they continue to do so, they are entitled thereafter only to a qualified privilege in respect of their evidence. They are directed that only evidence connected with the subject matter of these proceedings is to be given and they are asked to respect the parliamentary practice to the effect that, where possible, they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable.

Members of the committee should be aware that, under the salient rulings of the Chair, they should not comment on, criticise or make charges against a person outside the Houses or an official either by name or in such a way as to make him or her identifiable.

For the information of our guests, we have, unfortunately, received a small number of apologies this morning, which impacts the attendance. I very much regret that.

I thank all of them for their respective written submissions, which I expect will be the basis or bones of their opening addresses.

I call Mr. Herrick to kick off.

Mr. Liam Herrick

On behalf of the ICCL, we thank the committee for inviting us to appear before it. As the committee will be aware, the broader questions associated with online expression and its regulation, and the challenges involved in this very complex area, have been a key priority for the council for many years. We have made a number of previous submissions to this committee and other committees on matters related to this question.

The regulation of online speech, including communications through social media, engages complex human rights issues. In approaching these questions, the ICCL seeks in the first instance to understand the context and the effect of harmful online communications and behaviours before then moving to the challenging question of how to develop appropriate laws and policies in response, in particular laws and policies that are precise, effective and proportionate in dealing with the problems that arise. In the short time available to us, we aim to highlight the nature and extent of one specific form of online harassment, namely image-based sexual abuse. In doing so, we will refer to a specific case that illustrates the types of issues we feel the Oireachtas and public bodies should urgently address in this area.

I will now hand over to my colleague, Ms Farries, who will speak to the submission the committee has received.

Ms Elizabeth Farries

I thank the committee. We present these submissions in memory of Dara Quigley and in consultation with her mother, Aileen Malone, who is present in the Gallery.

Dara Quigley was an Irish activist and journalist. In 2017, members of An Garda Síochána forcibly detained Ms Quigley under Ireland's Mental Health Act for walking naked in a Dublin street. State CCTV cameras captured Ms Quigley's images. Media reports tell us that a garda was suspected of recording the images. The images were shared on a WhatsApp message group and, subsequently, posted on Facebook. The images were reported to have been shared 125,000 times before being taken down. Several days later, Dara Quigley committed suicide. Two and a half years later, no organisation or individual has been held responsible for their role in the abuse of Ms Quigley. Dara's mother has supported our discussing this case today to lend a personal perspective regarding, as she says, "how devastating online sexual harassment and public humiliation can be".

Today, the committee is considering responses to what it terms "revenge porn". ICCL objects to "revenge porn" as a term. It is not pornography; it is sexual abuse. Revenge is only one of myriad motivations, the principal aim of which is to violate a person's dignity and autonomy.

"Image-based sexual abuse" is a term coined by UK academics to better describe the non-consensual nature of this problem either by creating or distributing private sexual images. Image-based sexual abuse is a gendered problem with established harms. Our research shows that while anyone can be a victim, the majority are women and the majority of perpetrators are men. This is not to say that intersectional factors also influence the likelihood of experiencing online harassment. Generally, the LGBT+ community is at increased risk. One's race, religion, ethnicity, mental health and ability are all risk factors. The harms have been well described by others, numerous academics and in our submissions. I ask the members to please let me know if they want me to elaborate during questions.

The image-based sexual abuse of Ms Quigley illustrates weaknesses in the Irish regime for responding to such abuse. We have proposed several solutions in our submissions and will focus on three of them here.

A first step in tackling this problem is updated and robust laws. This has been outlined well by the Law Reform Commission, LRC. ICCL specifically calls for the criminalisation of image-based sexual abuse. This might be done through an amendment to the Non-Fatal Offences Against the Person Act 1997 to outlaw the creation and-or sharing of private sexual images. ICCL also supports the creation of a civil wrong, together with access to legal aid, to ensure restitution and compensation. However, as ICCL has previously stated in its submissions on harmful content moderation, and in our review of the recent Harmful Communications and Digital Safety Bill 2017, "harmful content" must be clearly defined. As has been said well here to the committee, a lack of specificity will not meet legality standards and could lead to infringement of other rights, including expression.

Second, strong laws also require strong enforcement practices, and we cannot have enforcement without a cultural understanding of this problem at Garda level. The committee has heard stories here of people not receiving adequate assistance from gardaí on online harassment issues. Dara's treatment was far more egregious. In our submissions we, therefore, present a number of suggestions for the Garda, in line with other ICCL publications, to ground Garda practices in the principles of human rights. Specifically, we recommend the integration of training and assessment regarding the gendered and otherwise intersectional nature of online harassment, including image-based sexual abuse. It is not enough to focus on cyberbullying, a term that erases the context of power and marginalisation in which injustice against people such as Dara has occurred.

Third, the ICCL is opposed, in principle, to the Department of Justice and Equality's community-based CCTV programme. We recommend that the State does not further contribute to online harassment by building up the web of tech surveillance to which we are all, increasingly, subject. Installing CCTV to maintain public safety is based on flawed logic. Blanket surveillance does not help us; it harms us, as Dara's experience has certainly demonstrated, and it harms some of us more than others. Significant research suggests that CCTV does not deter crime effectively enough to justify its impact on our privacy. Policing tech can also be very discriminatory. In Dara Quigley's case, CCTV became the mechanism for gendered online harassment and her experience of image-based sexual abuse. We need to rethink this wrong-headed approach of expanding surveillance simply because the technical means exist, together with the over-policing of segments of the population in Ireland.

In conclusion, Dara Quigley's experience is an appalling one and it is ongoing for her family. We are conscious that investigations have been initiated into Dara's death and the egregious treatment of her by perpetrators who have, as yet, not been formally identified.

However, two and a half years later, these investigations have not provided any justice to Dara or her family. The aim of our submission today is to highlight the tragic consequences of violations of rights in this area and to propose solutions in order to prevent such cases occurring in future. At the same time, the Irish Council for Civil Liberties will continue to support Dara's family in the struggle for justice.

I thank Mr. Herrick and Ms Farries. Members of the committee have been at pains to point out that they do not accept the term "revenge porn", as it is a total misnomer. Members have been quite strong in that opinion.

Mr. Ian Power

I thank the committee for the opportunity to speak before it today. SpunOut.ie is Ireland's youth information website, run by young people for young people. We provide free, reliable, non-judgmental information services to more than 150,000 young people in Ireland each month. At the core of what we do is a mission to reach our peers where they are, which is online. Consequently, we have developed a real and practical understanding of the realities of youth engagement on the Internet. It is important in any debate that touches on these issues to consider both the challenges of better digital citizenship and the great capacity for good that exists in an open and accessible Internet. Too often we find ourselves talking about "online safety", by which we mainly mean restricting the capacity of bad actors to do harm, but that kind of framing does not always recognise the ways in which online services can also enhance the safety and well-being of young people, or the rights of children enshrined in the United Nations Convention on the Rights of the Child to association, education, participation, expression and information, whether that comes through online mental health services, access to a supportive community for the otherwise isolated, or the kind of information on sexuality, health, employment or education that so many thousands of young people access every day online.

Our focus needs to be on preserving the possibility and promise of an open Internet while being clear about those areas where there can be no substitute for smart and effective regulation. SpunOut has a long history of engagement with the service providers that dominate young people's experiences online. We have worked for a number of years with Twitter’s global trust and safety council and as part of Facebook’s suicide and self-injury advisory committee. Where we can work with and influence the big platforms in the technology space we have done so and we have seen results. Our partnership with Facebook and our colleagues here from the DCU anti-bullying centre, for example, has helped train and resource young people and teachers in the area of better digital citizenship.

We can tell the committee, from experience, that possibilities exist for platforms and civil society groups like SpunOut, working together, to voluntarily improve the online space. That experience has also taught us the limits. It has taught us that, for all the platforms can do to try to improve their impact on the world, there are areas in which there is simply no substitute for intervention and regulation by the State. We come up against the limits of not intervening particularly with a platform like Snapchat. It is incredibly popular with young people, offering a service based on the promise that all messages sent are private and temporary. However, Snapchat is domiciled outside of the European Union and routinely makes no effort to engage with non-governmental organisations on issues of privacy, transparency and the safety of young people on the platform. Like any private company, its main goal is to maximise shareholder value and, like any non-profit, an organisation such as ours can only involve itself as far as its limited budget will stretch.

Government has an inescapable role in this as the only democratically mandated force within our society with the authority, ability and resources to intervene and tilt the balance from multinational global corporations back towards the individuals and communities that use our shared online space. We need to start reshaping how we view the relationship between us as service users and the companies that shape our online experiences. The nature of these companies' business models is simply not properly understood within our current laws. There is an assumption, for example, that when I set up a profile with Snapchat or another online platform, it is fundamentally and legally a relationship of equals, and that the pages of terms and conditions I probably sign without reading, much less understanding, somehow mean I am fully informed of every way in which my data might then be used. This is a legal illusion we need to break. The reality is that when we deal with an entity of that size, influence and power, it is not a relationship between an ordinary customer and a regular provider of services; it is a relationship in which we are at a profound information disadvantage.

The answer to how we manage those relationships better begins with accepting that some services, such as those of a doctor or solicitor, involve a provider whose level of knowledge so outstrips the client's that the client relies on the provider not just to deliver a service but to do so in the client's own best interest. Such services must have a legal responsibility to the client above and beyond their obligation to turn a profit.

The answer is accountability. We do not necessarily need a Hippocratic oath for Snapchat or TikTok but we need the Government to recognise and enforce the accountability these services owe us for the disproportionate influence they have over our lives.

What does that level of accountability look like in practice? As a country, we have surely learned that regulations can only be as effective as the regulator. That is why we need a powerful and independent digital safety commissioner as we have outlined in our submission to this committee. That commissioner needs to be empowered by legislative change, specifically new laws that clearly define online harassment and hate speech and that make it an offence to share intimate images without consent, regardless of the platform used. Our current laws in this regard are largely from a predigital age and are long overdue an update.

So too is our conception of privacy. The idea that we have an exclusive right to our personal data is under relentless challenge by companies for whom mass data profiling is the model for doing business. Whereas the general data protection regulation has gone some way towards validating our individual rights in this area, it is still simply untrue to assume a multinational organisation with a legal department and an individual consumer are somehow operating on a level playing field. Our right to privacy, like so much else, is only meaningful when we can be confident it is underwritten by a State system that is willing to wade in and defend it actively, rather than just telling us our rights and expecting us to enforce them ourselves. It is no use guaranteeing rights in theory alone. The digital safety commissioner will need a framework that lets it receive and respond to reports as quickly and effectively as possible.

If we are waiting for young people to call into the Garda to report that they are being bullied or harassed, we may be waiting a very long time. A single, clearly advertised central point of contact by text or online is far more likely to encourage young people who have been wronged online to come forward and report what has happened. They will also need to know that their complaints will not just gather dust. The Garda must be funded to deal with these reports in a timely way. We have had very positive experiences with cases brought to the child exploitation unit recently and I hope that ethos can be applied just as well to young people aged 18 and over too.

The question of funding really gets to the heart of this. From what I can see, there is widespread support in the Oireachtas for the Government taking a bigger role in this area. Without meaningful resourcing and hard money behind the project, no new regulator, legal change or Garda unit will be able to make much headway. That is why we are calling for a minimum budget of €10 million from day one of the online safety commissioner taking office, if necessary funded by a revenue-based industry levy on the many highly profitable online multinationals that have chosen to make this country their base of operations. Accountability needs to be a watchword for all of us in this, including the Government, non-governmental organisations and service providers. Until we have a firm commitment to the resources we will need in creating a safer Internet, we will not be holding ourselves accountable to the millions of people, young and old, who deserve a better, more secure, more responsible online space.

Professor James O'Higgins Norman

I thank the committee for the invitation to come here today. I am joined by my colleagues, Dr. Mairéad Foody and Dr. Tijana Milosevic, researchers with international experience working at the anti-bullying centre, ABC, in DCU. The National Anti-Bullying Research and Resource Centre is a university-designated research centre that brings together researchers not just from across DCU but from across colleges in Ireland and abroad. The centre receives public and private funding, which allows us to undertake research into, and develop resources on, issues relating to bullying and online safety. There are currently 20 researchers based at the centre, and researchers at the centre were the first in Ireland to undertake research on school bullying, cyberbullying, workplace bullying, homophobic bullying, sibling bullying and "sexting" as they relate to the mental health and well-being of young people. The centre currently delivers a whole-school, community anti-bullying and online safety programme to post-primary schools in Ireland, and it is available for free to those schools. The programme, called FUSE, focuses on bridging the gap that often exists between parents, teachers and students in tackling bullying and promoting online safety.

Members of the committee have already received our full submission, so I will highlight in summary three areas we hope will be helpful to the committee in determining what laws and regulations are appropriate for dealing with online harassment and related matters. Regarding legislation, we support the move to increase oversight of social media and related platforms; the level of this oversight is ultimately a decision for the Government to determine. We recommend that consideration be given to the experiences of other countries, where hastily introduced regulations have ultimately been found not to work as originally intended. Legislation should be informed by sound research, including the voices of young people.

We welcome the recent announcement by the National Advisory Council for Online Safety, NACOS, that it is commissioning national research that will provide insights to those seeking to develop legislation and guidelines on online harassment.

We encourage the committee to take into account the range of platforms on which cyberbullying may occur and not just a handful of large social media companies. Our research shows that children and young people are just as likely to be bullied or have negative experiences on gaming platforms, for example, as on content-sharing apps, friending apps, and artificial intelligence platforms and chatbots. The established social media companies tend to have more developed resources to address various online harms, resources that may not always be available to the various start-ups which spring up quickly and accumulate significant numbers of young users. We are by no means suggesting that the big companies should evade scrutiny, merely that a range of platforms should be taken into consideration when designing legislation, while being mindful of the need to protect innovation and freedom of expression. Incentivising safety by design might be one approach to addressing this issue.

Regarding support, cyberbullying is not merely an online safety issue but also a behavioural problem. Therefore, removing content alone may not resolve the conflict, which can continue on other platforms or offline. Any legislation that only or predominantly focuses on content removal might miss the opportunity to address the problem at a level beyond merely addressing its symptoms. Legislation should also focus on providing funding for support to those who have been victimised online. Often the removal of content is only the first step in a process of overcoming trauma related to online harassment, and further counselling and support may be required. This could include funding for psychological counselling services available to children involved in cyberbullying, both as bullies and as victims. It could also entail the provision of funding for helpline services which offer counselling and educational support to prevent future incidents.

Regarding prevention, cyberbullying by children and teenagers often relates to a lack of understanding and appreciation of the effects their actions and words might have on other children or teenagers whom they target. We have found in research that the provision of education related to online safety, tolerance, kindness, diversity and inclusion can go a long way to reducing the level of bullying and cyberbullying. Securing funding for educational measures aimed at prevention could also be considered. Specifically, there is a need to create a national standardised cyberbullying or online safety prevention and intervention curriculum, which could include online safety instruction and which would be deployed not just in schools but also in sports clubs, youth clubs, online training for young people who use the Internet, advertisements, marketing and engagement with parents and so on throughout the country. This could also constitute a way forward in promoting online safety among young people.

Finally, I call the Dublin Rape Crisis Centre. Is Ms Blackwell, as chief executive, leading?

Ms Noeline Blackwell

Yes. We are very grateful to the Chairman and the committee for the invitation to speak on the topic of online harassment, harmful communications and related offences. I am here with my colleague, Ms Shirley Scott. We have already made a submission to the committee answering the various questions it raised.

Our interest in this topic arises from the range of services we offer. We take more than 270 calls each week on the national 24-hour helpline we run. We provide face-to-face therapy to nearly 600 people and training to approximately 2,000 people per year, including those working on the front line with victims and survivors of sexual violence and those working with children and young people. Our personnel accompany victims and survivors to the sexual assault treatment unit at the Rotunda Hospital, to Garda stations and to court.

In our work we see the often lifelong consequences of the trauma and harm caused by sexual violence of all kinds, which is often compounded for victims and survivors by technology and harmful communications used to harass and humiliate. In our BodyRight programme, developed to raise awareness of sexual violence among young people and to assist in its prevention, we have recently had to introduce additional modules to address what teachers and guidance counsellors identified as the serious issues of sexting and sextortion, and we are developing a module on pornography. As everybody has said, while the sharing of images might appear harmless to some people, particularly young people, sexting and other forms of image sharing can have serious social and legal consequences, and the importance of young people knowing about those consequences cannot be overstated.

We hear all the time from callers and clients of how a single image uploaded to the Internet without consent can often have devastating consequences for the person whose image is uploaded.

We also prefer the committee's term, "non-consensual sharing of sexual images". This type of abuse, the sharing of images online without consent, is often carried out by partners and ex-partners and is one of the deepest betrayals of trust, using technology and the Internet to cause harm. We need legislation to better protect everyone, together with a greater understanding of the harmful and insidious nature of such abuse.

Such forms of non-consensual sharing of intimate images are recognised as specific offences in other jurisdictions. We have expanded on this in our written submission. In brief, many jurisdictions have laws in place to protect people from online sexual violence. The offences introduced in Australia, New Zealand, England and Wales targeting online harassment and harmful communications focus on both the behaviour and the impact of the behaviour, which is an approach we recommend.

We also recommend that harmful content be defined as any content that seriously interferes with the peace and privacy of another person or which causes him or her alarm, distress or harm. Freedom of expression, while an important right, is not an absolute one and must be balanced against the impact of the harmful online communication and against privacy rights. Our criminal legislation on the issue is at present entirely inadequate to address issues of harassment, stalking, voyeurism or other harmful online behaviour and does not protect the rights of those who are victims of such behaviour. Defining harmful content in this way would provide a more balanced approach that would not impose undue restrictions on the right to freedom of expression, as it would only require the removal of content where that content is injurious to the victim. We have no legislation creating any statutory responsibility for Internet service providers to monitor effectively the content being posted on their websites.

In our written submission to the committee we have recommended a flexible approach whereby the legislation would be supported by the capacity of a digital safety commissioner to regulate content within the main mandate of the legislation. The commissioner would have mandatory powers to implement the legislation and the capacity to build codes of practice and to issue notices or other takedown sanctions if the need arose. Based on the data and research of that office, rules could be adjusted to ensure that the legislation remains effective and current. We also believe there would be a substantial increase in safety, a reduction in crime and much more responsibility online if every account had a verified author. This is not the current model for online companies anywhere, but it would be a substantial safeguard against criminal behaviour and activity, such as that which happens on online dating sites; against defamation, which happens with trolls and fake accounts across social media; and against irresponsible posting. Companies would be responsible for verifying accounts. Posters and authors would be responsible for their defamation or their criminal activity and harmful communications. This should be universal but could at least be Ireland- and Europe-wide.

I thank Ms Blackwell and all our other contributors. A number of members are indicating, and we have a number of colleagues with us who are not normally here. I have Deputy Catherine Connolly noted. I ask members to indicate. I must take them in the order in which they indicate. Do Deputies Paul Murphy and Gino Kenny want to be added to the list?

Certainly. Deputy O'Callaghan was first.

I thank all the witnesses for coming before the committee - they are all welcome - and for the submissions they have given. I also acknowledge the presence of Dara Quigley's mother and thank her for coming in. I offer my sympathies on the death of Dara.

The function of this committee is such that we will produce a report. Since we are lawmakers, the primary function of the report will be to explore how we think the law should be changed. I say this just so the witnesses are aware of the process. We are looking at a very broad sweep of issues.

I will start by looking at some issues that we all recognise as wrong and in respect of which the law needs to be changed. I would like the witnesses to answer by explaining how they think the law should be changed in these areas. I will mention as an example the case of a woman who, during a relationship, consents to the taking of intimate photographs; the relationship ends and her ex-partner subsequently publishes the photographs on the Internet. How do the witnesses believe the civil law should be changed in the first place? The woman's right to privacy has been breached and her confidence has been breached. Is Ms Farries suggesting we should have a new statutory provision that recognises the statutory right to privacy of such a woman and provides her with a remedy in statute?

Ms Elizabeth Farries

I thank the Deputy for his question, which I need to answer in several stages.

Ms Elizabeth Farries

I want to make it clear that the term "non-consensual sharing of sexual images", which I believe the Deputy is identifying here with respect to someone whose partner has distributed images without her consent, is inadequate. The use of the phrase "image-based sexual abuse" is better because it immediately and accurately conveys the significant harms entailed. The term "non-consensual sharing of sexual images" does not capture the lack of consent involved when the images are created in the first instance. It deals with the sharing aspect only. Many cases, including that of Dara Quigley, have been identified in which images were created without a person's consent. We need to be very clear about our terminology to ensure we draft effective solutions. Drafting effective solutions at the legislative level must involve setting out a specific civil wrong to address image-based sexual abuse and providing for accompanying sanctions. Legal aid options must also be made available to facilitate those who seek redress.

I would like to speak about privacy more generally. This committee has considered legislating for privacy as well. I want to make it clear that there is an established constitutional right to privacy and that privacy is a human right. That is codified. If it would be helpful to Ireland to put the right to privacy into State law to make it more clear to the Government how serious it is, I would not be opposed to it in principle. Consideration would need to be given to how it would be drafted.

Does Ms Farries think there would be a benefit in a statutory provision recognising the right to privacy of an identified woman?

Ms Elizabeth Farries

Yes, I think there would be a benefit to a civil wrong addressing image-based sexual abuse.

I would like to ask about the liability. Who is responsible for this in civil law? The ex-partner who publishes such images will have a civil liability. To what extent does Ms Farries believe the social media site on which such images are published should have a liability, and what should that liability be?

Ms Elizabeth Farries

Many members have called for social media companies to be held accountable. However, this is very difficult. How can we hold them accountable when the law is silent on this matter? The first step should be to put effective regulation in place. We are not just looking at the inaction of social media companies here; we are also looking at the inaction of the Garda and at the restrictions on judges. If no laws adequately guide the Garda and our Judiciary, how can we point the finger at social media companies? How can we say they need to take down certain content when it is not even illegal? We need to make it illegal before we start looking at what social media companies should and should not be doing and at how they should be held accountable.

I suppose it is unlawful in that it would be a breach of the woman's privacy and of her confidence for these images to be published without her consent. If representatives of the social media companies were sitting here, they would say that under the e-commerce directive, they take down any such images brought to their attention. Many people are dissatisfied with that because when they contact social media companies, it takes a period of time for this to happen. They find it difficult to invoke a statutory provision. Do any of the witnesses think there would be a benefit to putting a more strict statutory provision on social media companies and holding them liable for any loss or emotional damage sustained by a person as a result of the publication of images of this nature?

Mr. Ian Power

Fundamentally, we believe the offence must be firmly placed on the Statute Book in order for people, particularly young people, to have an ability to seek redress. I wholeheartedly take the point made by Ms Farries that young people need access to supports such as legal aid to engage in the redress process.

The Deputy asked about the liability of companies. We live in the real world.

From our perspective, there is a pragmatic response in these circumstances. At present, under the e-commerce directive, the responsibility of platforms begins when they are notified of a piece of content. We would broaden that to provide that they also have a responsibility to be proactive. To pick up on the point made by Professor O'Higgins Norman, we are not greatly concerned about some of the larger platforms; we are concerned about platforms such as Snapchat and TikTok that have not engaged and continue to refuse to engage. Such services are overwhelmingly used by young people. There needs to be an onus on such platforms to take proactive measures, such as hashing images that have been shared, to prevent their resharing. When we think about closed and encrypted spaces, we appreciate that encryption is set to proliferate further across all platforms. In those spaces, this approach can only work where platforms become responsible once they are notified. We have seen a number of cases in which a significant takedown response was not initiated by the platforms even though they had been notified. Such cases have been escalated to us. I know we are talking about the harmful communications Bill, but it cannot be divorced from the proposal to establish an office of digital safety commissioner. There must be a place within the State where citizens can go to seek the takedown of a piece of content that has not been responded to by a platform.

I agree with Mr. Power that there should be a statutory body, such as an online safety commissioner, to which a complainant can go without having to go through the hassle of going to court and getting lawyers to make his or her case. Does Mr. Power think that would be of benefit to complainants in cases like that I have mentioned as an example?

Mr. Ian Power

Absolutely. Immediacy is of great importance to victims in cases like that described by the Deputy. My organisation believes that young people should have immediate legal recourse and should be able to access legal counsel to enable the enforcement of the takedown of such content. In less egregious cases, when the citizen involved has exhausted the avenues within the individual platform, he or she should have recourse to a regulator like a digital safety commissioner to enforce the removal of the content in question.

I understand that Ms Blackwell would like to come in on the back of what Mr. Power has said.

Ms Noeline Blackwell

I want to say before we move on that we absolutely go along with what has been said. If legislation is introduced without an office of digital safety commissioner having been established, it will not work. In order to involve the social media companies or the online companies, we believe there is a real need for a tiered approach at the civil level. The job of the digital safety commissioner should be to raise awareness and, as part of a two-tiered approach, to have content taken down effectively, with sanctions where this does not happen. That is where the companies would come in at the civil level. The companies rely on the e-commerce directive the whole time. If we look at how that directive is being developed by the European Court of Justice, we will see there is increasing recognition of the need for effective takedown procedures.

There was a recent judgment.

Ms Noeline Blackwell

Maybe they cannot clothe themselves in the directive as easily as they could in the past.

I would like to look at the civil liability in the case of the complainant I mentioned in my example. Everyone agrees that what happened to this woman should be a criminal act.

Ms Noeline Blackwell

Yes.

It should be criminalised. The problem with our current law is that the only method by which such behaviour could be regarded as criminal is as a form of harassment. However, it may not be persistent, which is required under the legislation. There is no doubt that we need criminal legislation for issues such as this. Is that correct?

Ms Noeline Blackwell

Yes.

I would like to ask Professor O'Higgins Norman a few questions. He mentioned cyberbullying. Does he think cyberbullying should be a crime?

Professor James O'Higgins Norman

I think we should avoid trying to criminalise children who behave in that way. Our submission is based on research with young people, including children and teenagers. I certainly believe that in the case of adults, a certain level of cyberbullying should be criminalised. In the case of children, we should look for other remedies because of their reduced responsibility.

If we make cyberbullying a crime, people might argue that bullying should be a crime.

Professor James O'Higgins Norman

The problem is the definition of "bullying".

Professor James O'Higgins Norman

The Deputy will be aware of this because he has explored the matter before. The problem is how bullying would be defined in legislation. A certain level of bullying and a certain repeated type of behaviour could be considered for legislation. When it comes to children, we would again look for other remedies.

Does Professor O'Higgins Norman think a statutory body such as an online safety commissioner is the method by which these issues could be resolved?

Professor James O'Higgins Norman

There are three issues. First, there is the duty of care that the social media platform companies have towards their users and others, which the Deputy has mentioned. Second, if people are not finding a quick enough remedy with a complaint, they should be able to go to a statutory body or statutory officer who can act on their behalf very quickly.

The third is legislation, if people want to pursue that. There is a danger, if we go straight to defining something as a crime, that the victim has to retraumatise himself or herself to get a remedy. Our thinking, especially when it comes to children, is that there should be processes in place whereby matters can be dealt with and addressed swiftly. In addition, in the context of children, those who engage in cyberbullying behaviour need work done with them as well. There are reasons for that behaviour and there needs to be some kind of intervention there as well.

As with all social problems, laws are not the only solution. Much more is needed in terms of educating and informing young people.

Ms Blackwell raised a point that was mentioned by an academic who was here a couple of weeks ago, whose name, unfortunately, I have forgotten, about a requirement that every account should have a verified author. Does she think that would cut down much of the abuse that emanates from social media?

It was Professor Joe Carthy.

I thank the Chairman.

Ms Noeline Blackwell

I do. A lot of the harassment that we hear about comes from dating sites where, ultimately, nobody can be found as the person who is the-----

Ms Noeline Blackwell

Yes, who holds the account. I refer to the level of harm that can arise from harassment online or, indeed, more serious or more physical harm as well. If, in every case, the company had to know who the author was, people might still be able to publish anonymously, but the content could be traced back to a verified author. This would mean that if somebody published defamatory or criminal material, he or she could be found, and that would raise responsibility for everybody who wants to say something. It would not stop people having freedom of expression. It is people who have the right to freedom of expression, not anonymous, possibly falsely created, personas.

It is like allowing anyone to drive on the road in a car that is not registered, with no reference to who is driving.

Ms Noeline Blackwell

Yes. It solves the problem by allowing people to say what they want to say, and allowing all of us to say what we want to say, so long as we are held responsible for it at the end of the day.

The committee is considering what is harmful content. Harmful content has a broad sweep of meaning and ranges from the example that I gave at the outset to something like hurling abuse at a politician. Do the witnesses think that people should be allowed to abuse other people?

Ms Elizabeth Farries

Very concerning propositions are being raised before the committee right now in the name of rights, and we are concerned because they are, frankly, quite alarming. The term "cyberbullying" is as concerning and as challenging to legislate for as the term "online harm". Vaguely defined terms such as these may include illegal content, but they might also capture user behaviours that are deemed harmful yet are not illegal and not sufficient grounds for takedown. This, therefore, would have the effect of censoring speech and certain associated rights.

The UK Online Harms White Paper has introduced a duty of care, but that has been robustly criticised by qualified rights advocates. A duty of care imposed on social media platforms, combined with heavy fines, creates incentives for platform companies to block online content, even if its illegality is doubtful and even if it serves to further conversations about online abuse.

We have seen content moderation by social media companies. Forgive me, they are actively engaged; they are not unengaged in this scenario. We have seen them get it wrong repeatedly. Women of colour complaining online about their experiences of racist harassment have been censored. That should not be censored. We need to make sure that we do not incentivise the removal of speech to protect certain interest groups.

I refer to this dangerous proposition about user verification. We do not support such proposals. The ICCL is part of an international network of civil liberties organisations. My colleagues in the Egyptian Initiative for Personal Rights have talked about an activist who was imprisoned, egregiously, for simply calling on people to gather and speak about their experiences of harm. In 2013, Alaa Abdel Fattah was violently arrested for talking about issues on Facebook and Twitter, and he was only released last March. The idea that people would be unable to speak out anonymously about their experiences of harm in places where autocratic regimes erode civil rights is a problem. That is something we absolutely need to protect, bearing in mind not just gender considerations but intersectional considerations.

Mr. Liam Herrick

It might seem, superficially, that one could propose a safe area such as the EU where problems like that do not arise, but that is not the case either. There are plenty of member states where activists, trade unionists and journalists are targeted and, indeed, have even been assassinated. There is a superficial appeal to this idea of verification, but it is deeply problematic in practice. The Deputy compared expression online to driving a car on a road. The analogy is problematic because online expression can equally be compared with expression more generally. We do not require licences for people to be politically active, express themselves, protest or write letters to newspapers and so on. This goes to the heart of the question about harmful content and, again, to the analogy the Deputy helpfully introduced about expression that might cause distress to a politician. We have a carefully crafted balance of freedom of expression within the Council of Europe area under the European Convention on Human Rights and, indeed, under the Constitution. Once one starts getting into the concept of what might cause distress to others as the measure of harm, that is not amenable to a proportionate legal definition. That is the problem we are likely to encounter if we go down this road. This is not at all to understate the challenge of finding the right ways to deal with things that undoubtedly cause injury to people. Of course, that is the challenge the committee has taken on, but we need to be careful not to introduce vague concepts that will compound the problem.

Dr. Mairéad Foody

I agree with the previous statements in respect of young people. We are definitely always on the side of the victim and concerned that supports exist for people who have been victimised. For an LGBT young person in Turkey, for example, who has no physical supports in his or her community, the online environment is the only place where he or she can feel comfortable. We have to be careful, when we think about regulation, that we do not prohibit free speech, but also that we do not take away supports for young people that exist today. Those supports never existed in the past, so it is positive that they exist today.

Mr. Ian Power

From our perspective, we agree with what has been said by previous speakers regarding concerns about civil liberties. On the point made by Ms Blackwell, the difficulty of finding the perpetrators of these types of crimes or proposed offences is not to be understated. Ultimately, the way to address that is properly resourced law enforcement that is able to conduct these investigations and find the people who are responsible. That is made more difficult by encryption and other such measures, which are, in our mind, proportionate; the proportion of harm relative to the normal day-to-day activities of citizens is small. Should everything be de-encrypted just to allow law enforcement to investigate those kinds of harms, or should law enforcement be enhanced, trained and provided with the resources to investigate these crimes in an effective and efficient manner? At the moment the Garda is not resourced to do that work.

Ms Noeline Blackwell

Regarding Deputy O'Callaghan's question as to whether it is like abuse of a politician, many of the cases of online abuse we hear about are between intimate partners. They are therefore different from politicians receiving abuse. Those of us working in the sexual violence sector, and our colleagues working in the domestic violence sector, would say that it is about the impact of those images. The abusive behaviour - the stalking and the capacity to know the points at which one can damage somebody - means that the impact on the person must be taken into account, be that person a young person or an intimate or ex-intimate partner.

The debate reflects the difficulty we face. We came up with the topic of harmful content, which is a very subjective term. There are some examples of behaviour, such as those we discussed earlier, that we all agree are clearly reprehensible and should be criminalised, but once we start going further down the line, it gets far more difficult and ambiguous. I do believe people have an entitlement to be abusive, and not just in the political sphere. People can be abusive, and we must be very careful about criminalising behaviour that, unlike the examples we spoke about, is not clearly reprehensible. I thank the witnesses. We must come up with solutions at the end of this process.

I thank all the witnesses for broadening that out a bit. It has been very useful and informative.

I thank all of the witnesses for their contributions. As Deputy O'Callaghan said, we aim to come to some sense of where we can make recommendations regarding the reform of laws around all of this. It also goes into another area, which has been evident from the conversation: laws may not always be the appropriate way to deal with the problem and there may be other things that can be done. I would like to tease that out. One of the things I am interested in is the platforms and their liability in this area. While there may be difficulty in finding the person because of anonymity, even when he or she is found, all one can do at the moment is ask the platforms to take down the content. The platforms can take down the offensive images or comments but often they do not. There can be a fight to make sure that happens, and the individual feels he or she is very much David against Goliath. Should Goliath be more responsible for the situation? Should there be some way of ensuring that the platform is liable, and should there be a sanction against it if it does not act within an appropriate timeframe? We know the Internet is instant, which is what makes it great to use. How do the witnesses think ensuring platforms take down offensive content within an appropriate timeframe could be legislated for, or do they think there is a means of legislating for it?

Dr. Tijana Milosevic

I researched the effectiveness of what social media companies are doing when it comes to cyberbullying. That was the focus of my research so it involved the tools these companies already have. I am not a legal scholar so I do not feel competent to speak on the issue of liability. I understand both sides in terms of freedom of expression and innovation on the one hand and the need to protect on the other. It is extremely important for victims to have a streamlined way of communicating what has happened and an effective way to take down content that has harmed them in some way. It can be difficult for companies to do that for various reasons. Very often, even if the poster is anonymous, the problem is not just with him or her. Content spreads easily and is very easily shared so the problem becomes a cultural problem.

I will give a perfect example of this. I have studied the cases of children in different countries who have died by suicide after their intimate images were shared online without their consent, for example, by predators. What really got to them and in a way brought about the tragic outcome was the bullying from their peers that took place afterwards, so it was not about the poster and taking down that one piece of content, or even punishing the poster. It was about the wider culture of bullying. It is important to take the content down and point out that this form of abuse is not okay, particularly when someone is a young person, but it is not enough. Of course, companies should be responsible, but regulating their liability is a very difficult question.

Regarding the European Commission's proposal for a digital services Act and the e-commerce directive, it is a very difficult question that legal scholars must address. This committee should bear in mind that taking down this content is really not enough. It is very important to know that even if we incentivise the companies by imposing fines, which is one aspect of what the Australian eSafety Commissioner is doing, it is not enough of a solution.

It is also very important that we continually evaluate what the companies are already doing. If they say they have an effective reporting mechanism, there needs to be a way to determine, through independent research, how effective it is from the perspective of victims, particularly if the victims are young people.

Mr. Ian Power

Regarding the point that the reach of this is broader and that we should consider preventative measures, the key thing for us is to reframe the issue from online safety, which is all about protecting oneself and where the onus is on the victim or potential victim to protect himself or herself, to general digital citizenship, which is about ensuring that all citizens behave in a way that is within the law and is not harmful to other online users. That is why, from our perspective, we disagree with the title of online safety commissioner. Any school curriculum that is developed should not just focus on how to protect one's password. Current advice talks about not sharing one's password as one would not share a toothbrush. Those analogies are very helpful and useful for working with children, but ultimately it is about not being a bad actor oneself, and reframing the issue in that way.

We feel there should be monetary sanctions on the platforms after they have been notified and the matter has been escalated to the regulator. For example, in Australia and New Zealand, there are fines of up to $300,000 for every instance of non-compliance with a takedown request from the regulator. The established platforms will comply with that and build systems to ensure those requirements are met. Such sanctions would be very helpful in tackling some of those entities that are perhaps not domiciled here or within the EU, as a way of engaging them with the regulatory structures we might establish here.

Ms Elizabeth Farries

Should Goliath be held responsible? The answer is "Yes", which is why we proposed solutions for two Goliaths here - An Garda Síochána and the need for it to human rights-proof its practices, and the State and its desire to expand CCTV unnecessarily, thereby capturing all of us in an increasing web of technological surveillance. We have framed this by reminding the committee of the experience of Dara Quigley. It is not just about the social media companies, although that is certainly a significant question. As Dr. McIntyre has told this committee, that is very much an EU project. Meanwhile, here in Ireland, we have much more in the way of what he described as easy wins, such as discrete pieces of legislation, for example, on image-based sexual abuse, and looking at the practices of An Garda Síochána to ensure they are human rights-proofed and that An Garda Síochána has the resources and education to do that effectively.

Ms Farries mentioned the case of Dara Quigley, which goes to the heart of everything we are concerned with here. Those of us who followed that case saw how an agency of the State behaved in that manner and failed repeatedly. An Garda Síochána has appeared before this committee many times and we are always assured that everything is done properly and followed through, yet this case and others arise. It is a very appropriate warning that things are not right and that we need to acknowledge that.

I was going to follow up on my question about the Goliaths. One of the platforms' arguments is that once content goes up and spreads, they lose a certain amount of control. It is my view, and I wonder if it is shared, that if greater sanctions were in place, the companies could build in measures to ensure they had more control.

I certainly see that reflected in online content, which is often very intimidatory and bullying. When two groups in society identify online as opposites, they start arguing online and some of the dialogue becomes terribly intimidating and aggressive. This attracts an audience. I sometimes think that this is the social media companies' game. It is what they want. That also needs to be examined. Bullying is a very closely related issue. As others have said, this goes beyond what happens online. It concerns society at large. Mention has been made of a cultural problem. People behave much worse online than they would if they were in a room with each other. When they get used to behaving a certain way online, they start behaving the same way on the street or in a room with others. That is where it can lead.

It is almost a closed or fascistic mindset. It is about domination and refusing to have any civil discourse on anything. It is a case of them being right and all others being wrong. People form identity groups, which they can do very easily online. It is a way of enhancing their status. People come to think that nobody is as good as their group. Their view of themselves is so wonderful that anyone who says anything else is to be wiped out. This culture has developed considerably online and I now see it developing in society. Everyone can recognise this as very harmful. Should there be an obligation on the social media companies, which provide the space for this to happen, to warn against it? There should be legislation obliging them to give space on their sites to warning messages, just like we have legislation requiring labels on cigarettes to warn that they are dangerous. There should be a warning on social media-----

Would the Deputy like to direct his question to a witness?

I do not have a witness to direct it to. This is not about sanction or legislation against certain online behaviour, but about making social media companies act responsibly by warning of how their product can be used in a harmful way.

This process does not require witnesses to respond to every question. If a witness agrees with something somebody else has already said, we can leave it at that. If witnesses have something new to add, they are most welcome to do so. We will hear from Ms Blackwell, Professor O'Higgins Norman and Mr. Eustace.

Ms Noeline Blackwell

We have noted with regard to dating apps, which make a lot of money by sharing information very quickly, that little care is given to pointing out the dangers of blind dating online. It is not in the interests of social media companies to put up the kind of warnings that might have been issued by dating agencies in the good old days. There is something to that. The codes of conduct established in other countries like Australia have led to better practices. Governments find it easier to create obligations for companies. That requires a digital safety commissioner, although we do not have to use that title if SpunOut.ie does not like it and has a better word. Those codes of conduct are needed because sharing is the essence of what online companies do. They will have to be told they cannot do it before they will stop.

Professor James O'Higgins Norman

The Deputy has painted a picture that represents the situation online very well. It goes to the heart of human nature and the way human beings behave when they are not educated to appreciate kindness, inclusivity, diversity and equality. One analogy is the time in history when schools were first established. Bullying happened in classrooms and teachers did not see it as a problem. They saw it as boys being boys. They thought it was just how kids behaved, and as long as it did not get too far out of hand, teachers did not see a need to step in. Educators later recognised the damage this caused to individuals, and we realised that education has a very important role in helping human beings to overcome the inclination to dominate or be aggressive towards others.

Perhaps something similar needs to happen in the new parallel society online and some kind of duty of care should emerge. If I own a coffee shop and two people are having an argument in it, am I responsible for that argument? Probably not, but I have a duty of care if the argument gets out of hand and I might need to step in and ask them to take their argument somewhere else. I would be obliged to uphold a standard of behaviour in the coffee shop. Something should be done to make that duty of care very clear to social media companies and other platforms. It must be clear that certain standards of behaviour are not acceptable in this space. I do not know how that can be realised technologically. In the past, people joined clubs that had rules. When people joined the club they signed up to behave in a certain way, knowing that if they did not do so they would be shown the door. This is not a new problem but we need to deal with it in a new place, that is, online.

Mr. Jack Eustace

I wish to make two points, first, about our online responsibilities as individuals and, second, concerning the responsibilities of social media companies. In regard to individual behaviour, it is very important not to focus on warning people about what may happen if they engage with social media. That implies that it is individuals' responsibility not to be abused. We think it is better to focus on creating firm codes of conduct that apply to everyone who is active online, on a social media platform or elsewhere. It should be our responsibility not to abuse others on the platform. How that works must be very clearly laid out.

I wish to refer to the points made about social media companies' behaviour. Reference has been made to the point at which a piece of content is shared so much that taking it down is effectively beyond the power of these companies. It is important to outline how modern social media companies operate. Much has been said about the challenges of getting a piece of content taken down. While the companies may not have control over whether an individual uploads something, a piece of content, abusive or otherwise, does not get hundreds of thousands of views and shares merely because individual users choose to amplify it. A modern social media platform generally has algorithms that work behind the scenes to boost content. The technical details can be murky but, as we will all be aware, content is generally boosted because it is controversial. It is content people want to click on because it might outrage them or elicit a reaction. Everyone who is on Twitter will be aware that users no longer have to actively amplify a piece of content to get it on to other people's timelines. Content is shared because it is popular. It is a chicken and egg phenomenon. However, that does not happen in a vacuum. It certainly does not happen because we all decide to boost a certain piece of content. It happens because of processes social media companies put in place to boost content. It is absolutely the case that the bigger a piece of content gets, the more the companies are responsible. We should never assume that content simply gets out of a platform's control. The reason something gets out of control is the processes those platforms have actively put in place and propagated.

I appreciate the witnesses' contributions. The term "parallel society" is an accurate description of this. The difficulty is that this parallel society is a real society. It is real for the people who are in it, and it has a real impact.

A failing since the Internet gained widespread popularity is that society and governments have considered that the problem is happening at some remove and should be left alone. That is not the case. It is in people's lives and we are far behind where we ought to be on this issue. We need to move fast if we are to make a difference. I appreciate the witnesses' comments.

The witnesses are very welcome. I thank them for their presentations, which I have read. My mind is going in several directions regarding the questions I should ask. It is a very helpful process for us because there is no simple solution. I acknowledge the points made by the witnesses, especially those from the ICCL, regarding the balancing of rights and the importance of the Garda, the State and the Oireachtas in addition to the platforms. I am aware that only some of the bigger platforms have appeared before the committee. The witnesses have pointed out that there are other platforms about which we should be equally worried. The bigger companies reassured the committee of their interest in this area, as they do, in order to make profit, as they are entitled to do. However, they referred to the e-commerce directive but did not freely acknowledge it is not up to scratch or fit for purpose. We know it is not and that there is a process under way to update it.

The committee asked the companies about their self-regulation. They were quite confident that their self-regulation was robust. What jumps out at me are the comments of the representatives of the ICCL and the NABRRC regarding more accountability. The NABRRC submission states: "Commissioning research into the evaluation of effectiveness of company self-regulatory measures should precede and inform any further steps about regulation." Similarly, the ICCL is asking for companies to be more open in supplying information. It seems to me that we are looking at a problem that we are all anecdotally aware is very serious, but we do not have the information to inform us on the proper steps to take. There is a danger that we will go down the wrong route. I will revert to Ms Blackwell. My heart agrees with verification but I do not think it is the solution because it does not strike a balance with open and free expression. Is Professor O'Higgins Norman aware whether that research has been done? What is the position in that regard?

Professor James O'Higgins Norman

I will defer to my colleague, Dr. Milosevic.

Dr. Tijana Milosevic

I thank the Deputy for her questions. What we meant is that companies, as the Deputy heard in their-----

I have a limited amount of time to ask questions. What has been done by the NABRRC or the ICCL regarding looking at-----

Dr. Tijana Milosevic

I will be very specific. I have looked into how much children are using the tools being provided by the companies and whether they are aware of them. The companies have told the committee about their safety and help centres which provide educational advice to children in cases of bullying, for example. The question is whether those tools are useful if children, parents and teachers are not aware of or using them. I asked children who were using them whether they found them helpful in bullying situations. If the companies are acknowledging that we need regulation but are pointing to these tools as examples of the good things they are doing, and the committee wishes to hold them accountable in that regard, then we need to be able to see how effective these things are.

It needs to be assessed.

Dr. Tijana Milosevic

While I was a post-doctoral researcher in Norway, I found that a limited number of children - approximately 26% of my very small sample - were aware of safety centres. A survey with a nationally representative sample found that more children were aware of the companies' reporting and blocking tools, but fewer were aware of the safety and help centres. One should assess how effective those tools are for children. They are not the only tools being used by the companies, which are starting to rely more and more on artificial intelligence. They are open about that in their transparency reports.

We received that information from the companies which presented to us.

Dr. Tijana Milosevic

If, for instance, a transparency report states that a particular percentage of bullying cases has been proactively captured by artificial intelligence tools, we would like to see the data on which that is based and how the conclusion is reached. What happens to the cases that are not caught? Rather than companies keeping that information close, these data should be available to researchers other than in-house researchers, provided that the ethical handling of data can be ensured.

Those data are not currently available.

Dr. Tijana Milosevic

They are not. The companies publish their research, but there should be greater transparency on how they reach their conclusions and the data should be opened up to other researchers.

That seems eminently reasonable, but how would it happen? What is the next step in getting the companies to share the data?

Dr. Tijana Milosevic

Opening up the data to a broader set of researchers-----

Should that be done on a voluntary basis?

Dr. Tijana Milosevic

I am not a legal scholar, so I do not know how it would be implemented in law.

The bigger companies stated that they are at pains to help, provide the information and prove they are open and accountable. Has the NABRRC gone down that avenue with them?

Professor James O'Higgins Norman

Thus far, we have been briefed on research produced by the companies' in-house researchers, but we do not have access to the data.

Has the NABRRC asked the companies for the data in a spirit of openness and accountability?

Professor James O'Higgins Norman

We have not explicitly asked for the data because we are aware there are all kinds of data protection issues, such as compliance with the GDPR. It would not be a matter of simply asking for it; a procedure would have to be put in place. If there were some kind of digital safety commissioner who could put moral or statutory pressure on companies to provide the data in a safe and ethical way that complied with the GDPR and so on, that would provide a bridge for the making of that request. However, there is currently a vacuum in that regard.

There is just the goodwill of the companies, in theory.

Dr. Tijana Milosevic

I would like to know whether they can give us examples of bullying cases they caught proactively or tell us how the algorithms catch these cases. I would like the companies to provide that kind of evidence. It would be helpful for us to understand how they invest their effort in handling this issue.

Ms Elizabeth Farries

I thank the Deputy for raising this issue on this occasion as well as when the committee most recently met with representatives of social media companies. It is a very important issue that requires more time and attention than it has received. It is a good start to focus on social media companies as it goes to the heart of the matter. The UN special rapporteur on freedom of expression, David Kaye, has called for radical transparency from social media companies. He stated that there needs to be a radical disclosure of all information in a manner that is meaningful. Our perspective is that it is not a radical request; it should be absolutely normative that social media companies be transparent in that way.

The Deputy asked what the ICCL has done on this issue. We issued two sets of submissions, both of which are available online: one regarding content moderation generally and the other our submissions to this committee specifically. I very much appreciated the Deputy's questioning of Karen White of Twitter regarding what that transparency means. The Deputy asked whether Twitter is disclosing what is going on in the background and being forthright in its quarterly publications. The ICCL agrees with Amnesty International that those publications do not go far enough. They present data, but those data are not meaningful. Amnesty International provides very specific suggestions on how to make the data meaningful. For example, the company may state that it took down content in a particular number of egregious situations and claim that is amazing. We would like to know the number of reports of abuse it receives per year which fail to receive a response from the company. We would like the reports to be disaggregated by category of abuse reported. Social media companies often highlight that they took down particular posts or pages. We would like to know the average time for such harmful material to be taken down. That question has been asked of the social media companies by the committee. We need to see that information transparently provided in their published reports, but that is not being done.

SpunOut.ie works closely with various platforms. Is it part of the Twitter trust and safety council?

Mr. Ian Power

Yes.

Was it party to the letter regarding the concern of trust and safety council members at the recent lack of openness and accountability with the council? Is SpunOut.ie involved in that letter?

Mr. Ian Power

I received a personal notification to my messenger account at 10 p.m. on Saturday night.

That was held by the filter so I did not see it in time to respond. We were not formally involved with that letter.

I see reference to Twitter's trust and safety council members in that letter but I do not see names. I am going to stay with the issues that are raised in the letter. Would Mr. Power agree with those issues and that the interaction with Twitter, certainly since December, has not been of the same quality as it was previously?

Mr. Ian Power

Twitter was clear up to the point when the letter was sent that it was changing the process. From our perspective, an argument could be made that we should not engage with those companies at all but we do because whatever limited impact we can have is still a better impact than none at all.

Can I stop Mr. Power there for a moment? I sit on another committee which is all about procedures, accountability and processes that are in place. The trust and safety council is, to a certain extent, a process and a good one I presume. Is that the case?

Mr. Ian Power

It is.

The key words are "trust" and "safety". I do not want to personalise this. That is a process and it is in place.

Mr. Ian Power

Sure.

The trust and safety council was mentioned when Twitter was before the committee. Of course, it is only as good as its operation on the ground. We have a letter from members stating that, in the past nine months, Twitter's engagement really has not been as good. In fact, the members of the council seem to be begging for a phone call with the chief executive officer. Perhaps "begging" is the wrong word but they are appealing for a phone call. Mr. Power is part of that but he only became aware of this letter last week.

Mr. Ian Power

No, we have been aware of the letter since it was sent but our invitation to be part of the letter happened late at the weekend.

That is fine. I will leave the details for a moment and ask if the trust and safety council is working effectively.

Mr. Ian Power

From our perspective, it is. The engagement from that individual to the leaders of the council could have been handled better. If someone is unhappy with the communications coming from the organisation, he or she should engage with it one-on-one, as opposed to the way it was handled and leaked to Wired magazine immediately. We would engage on a more constructive basis with the platforms and take an approach where we would escalate any issues that we have to them directly. Our approach is that we would be an honest broker. The substance of our submission today is to the effect that we are not in the business of mollifying the platforms; we are there to tell them what they are doing wrong and-----

That is okay. I am not taking issue with SpunOut.ie in any way. I have read the submission and I am just trying to get behind the processes and how effective they are.

Mr. Ian Power

I will come back to that question in the context of the overall effectiveness of self-regulation. These are initiatives that the bigger companies have set up to engage with non-governmental organisations in this space.

Those initiatives are very often used by such companies to say, "We are engaging and we are doing our best".

Mr. Ian Power

Yes.

That is my little difficulty.

Mr. Ian Power

We all recognise that we need State regulation. Self-regulation is not enough. These initiatives that the companies have established are, from our perspective, hugely welcome. They give us the opportunity to highlight the edge cases in which their overall systems do not work. There is very little point in referring to a certain number of take-downs when there is a percentage of real people in situations where content that has affected them has not been removed.

Would there be minutes of meetings of the trust and safety council? I am focusing on this matter because it stood out when the representatives from Twitter spoke about it. Are there regular meetings? Are minutes taken and published? Do issues emerge that we could all look at?

Mr. Ian Power

That is a good question.

The way it works is that we have calls about specific issues and a summary of that call will be circulated. Those are not exact minutes but summaries of those calls are issued.

SpunOut.ie does great work and I only hear praise on the ground; I am just exploring issues.

Mr. Ian Power

Sure.

Legislation is only one part of this; there are other mechanisms. It seems to me that, if the companies were more forthcoming with the information, we might try a different route. Do companies pay SpunOut.ie?

Mr. Ian Power

They do not pay us for our participation in any of the activities.

Do companies make contributions to SpunOut.ie?

Mr. Ian Power

They do not. There are two initiatives in which we are involved. In partnership with the anti-bullying centre in DCU-----

That is a giant process.

Mr. Ian Power

It is. We have a programme to establish a digital citizenship library of resources for young people.

I have read that. That project is funded specifically.

Mr. Ian Power

Exactly.

Do the companies pay, or make contributions, to SpunOut.ie or other, similar organisations?

Mr. Ian Power

They do not for our participation in-----

Do they pay or make contributions for anything? I thought Mr. Power said in his opening statement that SpunOut.ie receives support from Facebook.

Mr. Ian Power

I said that in reference to the partnership with DCU.

Support is received from Facebook through specific projects.

Mr. Ian Power

Exactly. It is for specific projects. We never receive a gratuity or donation for no purpose.

Is it the same for the National Anti-Bullying Research and Resource Centre?

Professor James O'Higgins Norman

We have a specific project that has received funding from Facebook. We have other projects which have received funding specifically from Vodafone, Dublin City Council, the European Commission and so on.

I heard all of that. The funding from Facebook was a new departure.

Professor James O'Higgins Norman

It was a new departure for us. I am not sure if it was a new departure for Facebook.

It was a new departure for the centre. Prior to that, the centre had not received funding from Facebook or any other organisation.

Professor James O'Higgins Norman

We had received funding from Vodafone. We have received some private donations in the past. I would have to go and check.

Is the centre involved in policy, at any level, with any of these platforms?

Professor James O'Higgins Norman

No.

Is SpunOut.ie involved in policy with any of these platforms?

Mr. Ian Power

In what sense does the Deputy mean that?

At the highest level possible in the formation of policy around the removal of harmful content and so on. Is SpunOut.ie involved in that?

Mr. Ian Power

We inform the decision making of the platforms. We highlight it to those companies when we see things that are not working. We tell those companies if their current policy is not working and needs amending.

SpunOut.ie would not be involved in the formulation of the policy?

Mr. Ian Power

That would be entirely for the companies themselves.

Would it be open to the companies to bring in such advice at that level if they were seriously interested in doing so?

Mr. Ian Power

It would.

I thank the representatives from Dublin Rape Crisis Centre for attending. Submissions from that centre always make for painful reading. We hope that, at some stage, the level of gender violence will decrease and be dealt with in a different manner. However, this has not happened to date. I thank the representatives from the centre for their hard work. Will they talk to me about legal aid, the problems they are encountering and the obstacles to people accessing legal aid?

Ms Noeline Blackwell

Would the Deputy like me to speak in the context of harmful communications in general?

I ask Ms Blackwell to speak on this topic.

Ms Noeline Blackwell

My colleague, Ms Scott, rooted around in our own statistics yesterday and might tell the Deputy what we came up with. This is by way of actual evidence of what we are hearing on the helpline.

Ms Shirley Scott

People disclose sexual violence that has happened to them and may also speak about additional harm where technology is used. Our helpline is starting to capture the use of technology as a growing trend. We started looking at it back in 2016 and, at that stage, less than 1% of the callers were speaking about online harassment and communications in addition to the sexual violence they had suffered. That had increased to 1.3% in 2017 and, in 2018, the figure was still less than 2% but it is rising.

Ms Noeline Blackwell

The focus of the centre has principally been on the fact that it is so hard to actually find an accountability mechanism, let alone legal aid. The fact that there is not a sufficient criminal framework means that even reporting complaints to the Garda is difficult. That is true for the once-off sharing by an intimate partner, or former intimate partner, of an image of a person. That image may have been taken consensually but shared non-consensually, or it may have been taken non-consensually. There are also cases where the harassment is so-called upskirting or the covert taking of images. Those issues cannot even be reported to the Garda, let alone brought to a stage where somebody is in a position to bring a civil suit on privacy grounds.

The lack of law, even more than legal aid, is in some ways the most significant problem we see, if that makes sense.

It does. There is a complete absence of remedy.

Ms Noeline Blackwell

Exactly. There is an absence of a remedy or a way to hold someone to account.

Even if there were a remedy, there would be access issues because of legal aid.

Ms Noeline Blackwell

Legal aid would come into that broader question of the victim getting proper legal advice and support along the way in reporting what is, in effect, sexual abuse online.

That is a theme which has come up repeatedly with regard to the use of language. We need to move away from using the term "revenge porn" and towards "abuse", which Ms Blackwell has mentioned. It is abuse to transmit the sexual image. One thing is language and another is where we go on legislation. Third is the larger issue of accountability and openness and how we do that. We need that information. We need information to analyse the problem and its extent. Currently, it appears to be gender based once again, which has to be said.

Ms Noeline Blackwell

In that regard, there is fairly common agreement that the Non-Fatal Offences against the Person Act 1997 is insufficient to deal with what everybody understands is harassment, stalking and abuse online. That is a piece which does not need research.

Is Ms Blackwell suggesting the Act can be amended immediately rather than waiting for a larger legislative enactment?

Ms Noeline Blackwell

Certainly, that is an immediate need.

Mr. Liam Herrick

I agree strongly with what Ms Blackwell has just said. It is a centrepoint of our submission that there is common agreement on the need for a discrete and specific amendment to that legislation to deal with this issue. It is also agreed that legal aid is a central issue, which we identify in our submission. As the committee will be aware, there are wider challenges within the legal aid system and there are other areas, but this is clearly one which is about practical implementation. I echo the comments of Dr. T.J. McIntyre, who attended the committee last week and said there was a great deal which could be done to implement the existing law effectively. That should include providing An Garda Síochána with financial resources, human resources and technical expertise. There is a great deal that can be done to enhance current implementation.

I thank Mr. Herrick. That is very helpful. When An Garda Síochána was before us it was clear that we did not have enough data on the breakdown of resources. A unit that was supposed to be set up was, we were told, only going to be set up next year.

With regard to Dara Quigley, the failure to hold anyone to account in that matter goes to the heart of what we are talking about. Things happen but something must be done about them. However, nothing was done about that. I am not going into the personal details but it would be remiss of me not to mention the case. I am grateful it was highlighted.

Before I move to the next speaker, Deputy Gino Kenny, I remind members that this is the fourth and last of our hearings. As we have done on each of the earlier occasions, we will invite people for a group photograph afterwards. If members are going off to other work, they might make their way back to complete the report.

It has been a very interesting session and a great many of the questions and observations have been responded to very well. This is an area which has been of huge interest to Deputies, Senators and society as a whole. I will be brief, as I do not want to labour any point. We are here because of what was done to Dara Quigley. I did not know her but a terrible wrong was done to that young woman. We look at social media as a wondrous thing but it has an underbelly of depravity, which sometimes brings out the worst in people.

Justice has not been served with regard to those involved in Dara Quigley's untimely death, but I hope it will be.

These platforms make a great deal of money from young people and there is a form of social baiting that takes place on them. They give a platform to people who would not normally act out in some ways; they are given ways to act out online. There is a great responsibility on technology companies to act responsibly, not only in respect of themselves but in relation to the people who use their platforms. Everybody has a smartphone. In some ways, we love them because they are so useful and permit human beings across the world to communicate. However, they have also become dangerous weapons that can destroy someone in the blink of an eye. There have been other incidents in which people have shared images, videos and all sorts of material which were deeply personal and have wronged not only the victim but the victim's family also. It is extremely difficult to police it. The genie is out of the bottle. How do we monitor the wild web? It comes down sometimes to our own behaviour but largely to the companies that provide this technology. They say it is freedom of expression and freedom of speech, but that does not extend to a freedom to hate or a freedom of depravity which hurts other people.

The question is how to police these platforms. The law needs to change. It must be updated from where it was in the mid-1990s. The Internet did not exist as it does now 20 to 25 years ago. If individuals did elsewhere what they sometimes do online, they would generally end up in court. In Dara Quigley's case, for example, if someone put such images up in the form of posters all over the city centre and was identified, he or she would end up in a court of law. There is no doubt about that. However, the fig leaf provided by online platforms has given some people the green light to do things they should not. It is just an observation. The law must be updated. The Non-Fatal Offences against the Person Act dates from the 1990s and contains no reference to cyberbullying or digital or online harassment. I understand the law will be updated by the end of the current parliamentary term. I hope that happens to protect everyone who uses the Internet. The Internet is here to stay but while there are positives to it, there is also an underbelly that has extremely detrimental effects on our physical and mental health.

The Deputy is not asking a specific question.

It is an observation.

The Deputy is very welcome and I thank him for attending today. I call Senator Ruane and will leave Senator Ó Donnghaile with the last contribution.

I am not a member of the committee and apologise for not being here for the opening statements. I became aware of the topic quite late and, as such, it was a bit of a last-minute dash. I thank the committee for allowing me to contribute.

I am motivated to come here by two particular cases, namely, that of Dara Quigley and that of Jacqueline Griffin.

I have specific questions which are directed at all the witnesses. One of the questions has in some way been answered in terms of the Non-Fatal Offences against the Person Act. What do the witnesses think of Deputy Howlin's proposed legislation? Alternatively, do they believe we should look to the Non-Fatal Offences against the Person Act? I also refer to the amendment suggested by Dara's mam, Aileen Malone, on not including a requirement concerning intent to cause distress. There is a huge onus on the individual or family to prove the intent of somebody else to cause distress. In terms of the question concerning the Non-Fatal Offences against the Person Act or Deputy Howlin's legislation, how do the witnesses feel about moving away from revenge porn and creating another form of offence that does not specify that requirement on an individual? I am not sure how many members are aware of Jacqueline Griffin's case, which concerned the sharing of images of her decapitated remains following her death on the M50. She was a family friend of mine and her brother is one of my closest pals. Because I know the distress and harm that caused to the Griffin family, I asked for their consent before I raised Jacqui's case. I watched the absolute horror that family faced. I sat with him while he begged me to tell him the truth on whether I had seen that image, where it was and if it was still online. It was absolutely horrific. How do we begin to move, in the legislation and in the development of policy, to also include and have a conversation about giving consent when somebody is no longer alive to give it? Paul asked me to contribute on his behalf this morning. He sent me a few lines to read, which I will paraphrase. He says it better than I can.

Jacqui wasn't alive to consent to the sharing of her decapitated remains all over social media. Just because her accident happened in a public place should not mean that she does not have a right to privacy and dignity. The damage that this caused our family has destroyed our faith in humanity. It goes beyond the initial trauma of just seeing the image, but has completely destroyed our emotional well-being. Everyday life is impossible. Everything from travel, work and even our own ability to have an online presence.

How can we begin to legislate for that? We are very aware of who shared that image and I wonder what the criminal sanction should be on somebody who records somebody's death and something so horrific. The online platforms acted as quickly as they could when we tried to intervene at the time. How can we create legislation to deal with the recording by an individual of someone's death, and is that an area that has been examined in the context of causing harm? Obviously the person in question is no longer there to take the case. Can the case be taken on behalf of the family and due to the impact on the family? What is the legal position in terms of that family being able to pursue something? I wonder if that could be part of the conversation.

I will give witnesses an opportunity to respond because that is very specific, very particular and very upsetting. I invite Ms Farries to speak first.

Ms Elizabeth Farries

I thank Senator Ruane for raising this additional example of abusive treatment online. I also thank Paul for the statement, which captures very well the types of harms that are experienced when these images are shared without consent online. It illustrates perfectly how incredibly damaging and challenging it can be. I would like to respond to the two specific questions asked by Senator Ruane about criminalisation and the types of legislation that we might look at. The Irish Council for Civil Liberties supports the findings of the Law Reform Commission that there should be an amendment to the Non-Fatal Offences against the Person Act in order to criminalise those sorts of behaviour. We specifically talked about image-based sexual abuse. We support the criminalisation of that through an amendment of that specific Act. We also support the creation of a civil wrong. That is the response to the first question.

I will now respond to the second question on the intent to cause distress. Senator Ruane referenced Dara's mother's statements about that to the media. There is Scottish legislation dealing with image-based sexual abuse that focuses on the circulation of these images with the intent to cause distress.

That does not go far enough because, as we have said in the submissions, intention to cause distress is not the only motivation. There are a variety of intentions. To be clear, criminal law requires intent. There needs to be an action and an intention. We just want to be careful that we do not inadvertently narrow that intention so that other reasons are lost. Perhaps someone circulates an image because he or she thinks it will be lucrative and he or she wants to make a lot of money. Perhaps it is being done for salacious reasons. Perhaps the principal motivation is to violate someone's autonomy and dignity. Distress alone does not necessarily capture that, so, as always, we need to be very careful about how we draft this.

Mr. Ian Power

I thank Senator Ruane for raising this case. I also thank Paul for allowing us to hear Jacqueline's story today. From our perspective, as Ms Farries has described, any legal framework has to consider intent but it should also include impact. One of the things we have seen with the devastating consequences of harms caused to people through the sharing of content online is that the impact is often minimised and often not well understood. While we might have heard of Jacqueline's case at the time, it is really important to hear Paul's story today articulating that it does not just end there, that the impact is much further-reaching and there is a longevity to it that people do not realise. From our perspective, in terms of somebody who shared those images, and in the case of Dara Quigley as well, while obviously we are not in those people's frame of mind, we have seen that the public are just not aware of the consequences of their actions. Whether there is intent there or not, people need to understand that publishing images as distressing as the ones mentioned denies a person the right to privacy and dignity.

One of the first cases we had to comment on in that regard concerned a music festival and a young girl whose images were posted online. We would have thought we had come some way since that point towards understanding the consequences of sharing content that we do not have the right to share. It is not within our gift to decide whether content should be shared. There is obviously context to the issue as well. We must think about the platforms as well, in that when the Arab Spring was happening, or even when the violence was perpetrated against people fighting for independence in Catalonia, there was a context to some of this content. We should be able to see some of the harms that are caused to people but, again, it should be proportionate and there should be mature consideration about where that applies and where it does not apply. For us, ultimately, it has to focus on the impact on the individual. Again, the reason we need a digital safety commissioner is to be able to escalate hard cases like this where time is of the essence, to be able to restrict the further sharing of such content. To be fair, the platforms do respond in certain circumstances such as those outlined. I do not think cases have to be of such gravity as those of Jacqueline and Dara to require a quick response. Every case where there is content about a person or that causes a person distress should be responded to just as quickly as in those circumstances. We need to create the framework to allow that to happen.

Dr. Mairéad Foody

The example is so horrific that it hits one in the face: even though it is a legal matter, it is also a civil and human matter. Most of the work we do is with young people, but this kind of sharing should simply not happen between adults either. We have rights, but there are responsibilities on us to act appropriately.

Young people are a little better at this because they have lived more in the online world and are fluid between both worlds. For some reason, the adult generation seems to see the offline and online worlds as separate. We do not seem to see that we have to be responsible online in the same way.

With regard to the impact of bullying, the definition of bullying that we use in our work and research is that there has to be repetition, intent to cause harm, a negative impact and a power hierarchy. That holds for cyberbullying when it is peer-to-peer, but some of those things will be different. Deputy O'Callaghan mentioned that it might not be persistent. A person might take one photo, so it might be hard to penalise that person or say that he or she is perpetrating bullying because he or she has only done it once or shared it once. However, the impact on the victim is repetitive and those images have been shared all over the place. It is persistent in that sense and it is harder to see if there was intent to cause harm. Once those images are available, there will be an impact no matter what, because one does not know where they are and cannot imagine where they will go next. The idea relating to intent needs to be changed. We need to think more about consent being sought and make sure that we have appropriate language, understanding that we should be asking for consent from the beginning before we take any image and share it. Rather than focusing on sexual or horrific content, it should be in our day-to-day language and we should flip the way society talks about it from being a separate, online issue. We should be asking for consent offline before taking a photo or sharing it. Young people are a bit better at doing that than adults.

Professor James O'Higgins Norman

There is a legal question and Dr. Foody has talked about intent and impact. There is also the matter of decency. We should have a core value system that means a person refrains from doing something not just because it is not allowed, but because he or she knows it should not be done. If we lose that in society, then an educational response is needed across society. We keep talking about schools but it also applies outside schools, in the community, in clubs and wherever people gather. We need to re-educate society about what it means to be a decent individual.

Could I add something specific? I understand all the educational impacts. Is there a constitutional provision about violating someone's right to privacy? Is there a legal remedy against an individual who breaches it?

Professor James O'Higgins Norman

I will have to defer to the legal experts on that.

We can bring Ms Blackwell in. If there is an additional point to be added, I will invite Ms Farries to do so.

Ms Noeline Blackwell

The sharing, or the stealing as one might say, of Dara Quigley's film, and the taking and sharing of the image of Jacqueline Griffin's body, should both be recognised as communications. They may not be in our law at present. The law should be much clearer that they are communications and also that they are communications that result in harmful content being shared. Many other jurisdictions, such as Australia, have come up with ways of framing that. New Zealand talks about serious emotional distress, as experienced by the Griffin family. As Mr. Power said, there does not have to be a death to cause serious emotional distress but it can be a factor. The fact that we do not have sufficiently clear legislation to cover this as a communication and then to recognise that it is a harmful sharing or communication needs to be addressed quickly. Our figures show that it is becoming an increasing problem. We need to make sure that those things are clear because, to answer the Senator's question, the current law is not strong enough to deal with it, and it needs to be.

Ms Shirley Scott

To add to what Ms Blackwell said, if we make online harassment and harmful communications more visible in the legislation, we might see victims coming forward to seek the support that they need for the devastating impact of what we hear on our helpline and in our counselling rooms every day.

Does Ms Farries have anything further to add in response to the Senator's question?

Ms Elizabeth Farries

I think the Senator asked if we have a codified right to privacy in Ireland.

The answer is definitively "Yes". It is in Article 40.3 of our Constitution, Article 12 of the Universal Declaration of Human Rights, Article 17 of the International Covenant on Civil and Political Rights, Article 8 of the European Convention on Human Rights, and Article 7 of the Charter of Fundamental Rights of the European Union. We have a wholesale right to privacy. It is well established and well codified. What we lack are specific laws about specific crimes, which we can address through the Law Reform Commission and our submissions. There needs to be a culture within the Garda to be able to tackle that and to have human rights-based principles and sufficient resources to take it on.

I apologise to our visitors for missing the presentations. I have listened to what has been said and I am very conscious of Senator Ruane's contribution. The contributions indicate the multifaceted nature of what we are dealing with. I am not a legal expert, nor am I an expert on social media, as much as I would like to think I am. It is absolutely apparent that young people are using platforms that are not even familiar to most of us. We had a presentation on this theme at the British-Irish Parliamentary Assembly yesterday, where I asked the question that I am going to ask the witnesses. Please do not interpret it as me being glib since I mean it sincerely. Since it represents the bulk of what we are dealing with and what we understand to be the main form of online harassment and harmful communication, I would like to get some understanding from the witnesses about how we go about legislating for toxic, harmful or warped masculinity. For me, there seems to be a common thread throughout most, though not all, harmful, abusive communications online. The need for education and a cultural change was discussed at the meeting I was at yesterday and that is absolutely the case.

We have referred to the main social media providers such as Twitter and Facebook and things being posted on their services. We could implement laws and define the legal instruments required to deal with some of this. However, if we do not legislate for the root cause, then even with the greatest will in the world, if something is posted and dealt with in a matter of minutes, taken down, and sin é, it has still found its way in that time onto WhatsApp and such, where it is shared on perhaps 47 different group chats with an average of 20 people each. There is an issue of culpability, and I would like a clearer understanding of who is responsible. Is it the producer of the image, the sharer or the holder? People have God knows how many WhatsApp messages a day and sometimes they do not even know what they have been sent.

I would appreciate the witnesses' views on those themes. That is not to let the providers off the hook. Legal and legislative instruments need to be agreed and provided for, but we need to address it fundamentally as opposed to just saying that we need to change the culture or to have a societal conversation. We have seen that problem of toxic, harmful and warped understandings of masculinity on public platforms and how harmful some of that can be. It happened in my own home city, with the notorious trial that took place last year.

Professor James O'Higgins Norman

The Senator is asking how we legislate to deal with toxic masculinity. I do not have an answer to that but I certainly know that education works. When we teach young people about diversity and inclusion, and give them other opportunities to think about what it means to be a man or a woman, research shows that it makes a difference.

My PhD was in the area of homophobic bullying and we tested educational initiatives to reduce homophobic bullying among males. When they were implemented, they worked. From an educational point of view, there are many examples of how to proceed on this. Legally, how to enforce that or criminalise people for being a bad type of guy, and so on, is a question for someone else.

Are we seeing that at a younger generational level? We have not yet seen what the professor is talking about in terms of the education that is required, although it is happening to a degree. Do we need to wait a few years before we start to see that at a broad, informed, societal level?

Professor James O'Higgins Norman

Policy and laws frame a space in which we exist. For example, the introduction of the Equal Status Acts and related legislation sends a message out to society that we should not discriminate against people on certain grounds, but it also has a ripple effect in terms of how people think about their behaviour. That kind of legislation, set alongside education, can have a very positive effect. However, there is still not enough going on in our education system. I do not know if the generation coming out of school today will be any better around these topics.

There is currently a revision of the RSE programme in schools being undertaken by the NCCA, and I hope that revised programme will seek to tackle these issues of toxic masculinity and restricted forms of femininity and so on, in a more direct way than we have in the past. That requires resourcing and education and training for teachers. Research tells us teachers are slow to deal with some of these topics because they are not comfortable talking to kids about the issues that might come up in the classroom. Education certainly helps.

Mr. Ian Power

The first thing to say about toxic masculinity and the current generation is that if one looks at the reaction to the Belfast rape trial on Twitter, the lack of empathy from quite a significant proportion of young, male Twitter users was astounding. From our perspective, it underscored the amount of work we have to do to effect that cultural change, and education is the silver bullet in that regard.

Does Mr. Power think so?

Mr. Ian Power

I do. Ultimately, it is from infancy upwards, which is a long game. However, in terms of what we can do in the short and medium term, for perpetrators, who are predominantly male, at present there is no accountability, so there is no deterrent against engaging in either offensive behaviour or harmful behaviour. The purpose of all of us being here today is that we have seen that the lack of specific offences on the Statute Book has prevented gardaí in local Garda stations around the country from being able to take reports and complaints from citizens about, say, image-based sexual abuse, harassment, cyberbullying or otherwise, and bring those cases through to resolution after deciding whether it is a civil or criminal matter.

From our perspective, we need to see the gardaí being able to very clearly see what is an offence on the Statute Book and being able to take that and enforce it, so there is a deterrent that stops people from engaging in this type of behaviour. At present, there simply is not. We see so many cases where images are shared in private messaging groups and, ultimately, there is no sanction for that. Therefore, we cannot be surprised that it continues to happen.

We do not solicit these cases and we are not a one-to-one service, but somehow they still come to us because there is nowhere else for people to go. We have had a number of cases where younger guys, perhaps in their late teens, have shared images with somebody else, where the other person had a false identity and was then extorting them with the threat of sharing the images publicly. That has a huge impact on those young people.

We have been able to escalate them to the child exploitation unit within the Garda. However, at local level the resources and capacity to respond are just not available. We need the legal framework to be able to do it. We need legal aid for young people to gain access to remedies quickly. Ultimately, we need the regulation to regulate any other harmful content.

To address the Senator's opening point, even if something is posted and taken down quickly, the event still occurred and it should still be prosecuted, particularly where adults are concerned. It is not enough to have the content removed. It still needs to be pursued. It is not the case that once the content is removed, that is the end of the matter. It should be pursued relentlessly so that a clear deterrent is established and we can start to clamp down on this behaviour. I hope education will do us right in time, but it is a long game. We need to have the legal framework now in order to be able to deal with things in the short and medium term.

Ms Noeline Blackwell

As I agree with what Mr. Power has just said, I will not repeat it. I recognise that it is not just a general air of toxic masculinity or toxic anything else, but very often it is a malicious intention to do harm through digital communications which currently cannot really be prosecuted. The current harassment laws are insufficient, as are current stalking laws. We need a better definition of communication and to have it recognised that, where harmful behaviour happens, it is harmful by reference to impact. It will take into account particularly those who are targeted, not just the general fall-out of the harmful behaviour. Sometimes, people are targeted and that should be a particular element of the prosecution in the case, which is very important. It is the carrot and stick approach. The carrot is education, with which we all agree. As for the stick, to go back to Deputy O'Callaghan's analogy about driving a car, we stopped drink-driving partly by awareness raising but also because people's driving licences were being taken off them. We need personal prosecution, as well as awareness raising.

Ms Elizabeth Farries

I thank the Senator for raising the issue of toxic masculinity. It is part of the problem that needs to be addressed and it is very important. Dara Quigley was certainly exposed to it. It is not absent from law enforcement institutions. Long ago academics used to think that what they called information and communications technology spaces would create safe zones for people who were experiencing things like toxic masculinity to group, talk and collaborate. What we are seeing instead is the opposite in that the inequalities we experience offline are amplified online. Toxic masculinity is certainly an example.

As for solutions, in our submissions to the committee, in addition to law and our rejection of the roll-out of State CCTV surveillance systems, we advocate changing these cultural issues at Garda level. We advocate education, training, assessment and integrated modules across units to ensure the harms attached to the intersectional nature of online harassment, including toxic masculinity, are addressed. We have to be very careful that we do not erase the context of power and marginalisation in which the inequalities occur by simply saying something like "bullying" or "cyberbullying". West Coast LEAF is a rights group in Canada, and I am a lawyer who used to practise in Canada. It likes the term cybermisogyny because it better captures the harm associated with things like toxic masculinity.

On the issue of education, is there a role, beyond what is being done, for social media providers to do it credibly, or does it just become corporate speak? For example, we might bring to bear a legislative requirement for advertisements, or perhaps for users to have to go through a firewall to access some of these things. Is there a role for providers to talk to people, regardless of age - it can be designed and framed in a certain way - about the issues of consent and people's own personal responsibility? I have worked previously in education and think there is a role for education providers to play, but it needs to be even broader.

We need to empower and enable parents and guardians to speak to their children about these matters in an informed and appropriate way. We must also empower youth clubs and youth service providers. That may be happening already, but I am not aware of it if so. How crucial is it that we reach a position where we can compel some of the providers to raise and home in on some of these matters?

To whom is the Senator addressing the question?

I am asking whoever wants to take it. It is general.

We will not ask everyone.

Ms Noeline Blackwell

This is where the digital safety commission - it cannot be called that because SpunOut does not like the word-----

I know what Ms Blackwell means.

Ms Noeline Blackwell

The commission will be able to formulate codes of conduct, which are crucial, but we need a framework and an objective set of criteria. To be clear, some social media companies, including dating sites and the like, are doing absolutely nothing in this area. They are big social media platforms, but they are doing no work in this area, at least to my knowledge.

Dr. Tijana Milosevic

I return to the point I made about the parts of their platforms that some social media companies provide, including help and safety centres and bullying prevention resources. They provide that education voluntarily. In our submission we propose that this should not be left to companies' corporate social responsibility or self-regulatory practices; rather, there should be a requirement for them to help fund education, together with the Government. We speak about young people, but I cannot speak for other parts of society. In addition to funding education measures, there should be psychological assistance for victims when bullying happens, for example. We would have to leave how that would be executed to the committee, but broadly it should not be done on a voluntary basis only.

I thought I was ready to wrap up this discussion, but Deputy Paul Murphy is back.

I will be very brief.

I thank the Deputy.

I have one question about Ms Dara Quigley. I pay tribute to her family for continuing to fight her corner because of the injustices she suffered, as well as to the ICCL for the campaign it will launch seeking justice for her. Is it because of the law that no one is being held responsible for the abuse she suffered before her death, abuse that was responsible for her death? I am convinced by the various presentations made, by which I have been informed and educated, that legal changes are needed. Is it the case that there is no law under which somebody, such as the garda who shared the video, could currently be held responsible? Is it simply a legislative problem, or are there other problems which mean that people are not being held accountable?

One of the delegates will have to take that question.

Ms Elizabeth Farries

I thank the Deputy for his question. The laws need to be updated, but it is not simply about the law. There are issues at Garda level that we have outlined and identified. There must be cultural changes at that level and we have provided suggestions on how to do it. There was a third suggestion that I have repeated over and over again today, but it has completely escaped me. I feel it has not been part of the conversation, which is why it has left my mind.

What about the expanding web of surveillance technology in the State to which we are all exposed? It is increasing every day, and it is not coming only from private companies; it comes from the State. We must fundamentally rethink it for the sake of privacy, which, as I have outlined, is heavily codified, and in light of data protection principles at EU level. The use of closed-circuit television systems involves a level of invasion of our privacy that is wholly unnecessary. The egregious treatment of Dara Quigley really illustrates the point. The risks to our privacy, to our willingness to associate with people important to us, to travel to areas of a city important to us and to speak out are too great. A host of rights are implicated, and the data that would suggest CCTV systems warrant that invasion of our fundamental rights are simply not available.

It is not worth it and the horrible experience of Ms Dara Quigley illustrates that. This is a call to the committee to keep that under consideration when it examines the problem of online harassment. There are individual perpetrators and social media companies but we cannot forget the role of the State.

There is nothing to add to that, as we agree with the point. If Deputy Murphy is finished, it remains for me to say it has been a very informative session. On behalf of the Oireachtas Joint Committee on Justice and Equality, I thank Mr. Herrick, who had to leave early and had indicated that to me beforehand, and Ms Farries from the Irish Council for Civil Liberties. Well done to Mr. Ian Power and Mr. Jack Eustace from SpunOut.ie on their work, and I thank them for their contribution. Professor O'Higgins Norman, Dr. Foody and Dr. Milosevic have been very helpful in today's process and I thank them very much. They are based in DCU and we wish them continued success in the work of the National Anti-Bullying Research and Resource Centre. Ms Blackwell and Ms Scott had little opportunity to prepare but seized the moment when the gap appeared. Well done, as always.

I certainly will not conclude the meeting without noting that we have, for all the sad but right reasons, reflected very particularly on the tragic passing of Ms Dara Quigley. As has been mentioned, we are honoured to have her mother, Ms Aileen Malone, with us in the Public Gallery. I know it cannot have been easy for her to attend this morning, but we thank her and those who accompanied her to our final session on online harassment and harmful communications. In her absence, I thank Senator Lynn Ruane for introducing the brief but very pertinent statement of the brother of the late Ms Jacqueline Griffin. These and many other cases have been addressed or highlighted over the past four weeks of hearings, and they are the driver behind our choosing to grapple with this issue. Each of the sessions, today's included, demonstrates that this is a very complex area; it is not a simple matter and will not be simply remedied.

As colleagues have said, we now have the responsibility to prepare a report and deliberate on the recommendations we will put forward. I hope that, if we are still a sitting Parliament, we will have the opportunity to publish that report and its recommendations in late November or December. We do not know whether we will still be here, but we hope we will; we do not want to lose the value and potential of this exercise. I invite all members present, including Deputies Paul Murphy and Gino Kenny, who are not members of the committee, to join us for a group photograph with our guests outside in order to complete our snapshots of these sessions.

The joint committee adjourned at 11.50 a.m. until 9 a.m. on Wednesday, 6 November 2019.