Joint Committee on Tourism, Culture, Arts, Sport and Media debate -
Wednesday, 16 Oct 2024

State Response to Online Disinformation and Media and Digital Literacy: Discussion (Resumed)

We have received apologies from Senator Annie Hoey. I thank our guests for coming in and being here with us. We really look forward to what they have to say on this very topical issue of disinformation and misinformation.

I have a little housekeeping to do before we proceed, if our guests can bear with me and my colleagues while we get through that. I have some committee business to dispose of before we hear opening statements. Can I take it that the minutes of our meetings of 18 and 25 September 2024 are formally agreed and that no matters are arising?

The meeting today has been convened with representatives from Hope and Courage Collective, the Institute for Strategic Dialogue, Media Literacy Ireland and DCU's Institute for Future Media, Democracy and Society to discuss the State's response to online disinformation and media and digital literacy, including social media and fake news. This meeting continues the committee's engagement on related matters. Regrettably, we will not have representatives from the NUJ with us. They are unable to attend but have been very good and regular attenders and have made a written submission for the committee's hearing today. We thank them for that.

I warmly welcome the following witnesses to committee room 1. From Hope and Courage Collective I welcome Ms Edel McGinley, executive director, and Ms Niamh McDonald, director of advocacy and community engagement. From the Institute for Strategic Dialogue I welcome senior analysts Mr. Ciarán O'Connor and Ms Aoife Gallagher. I welcome Ms Martina Chapman, national co-ordinator of Media Literacy Ireland. Finally, from DCU's Institute for Future Media, Democracy and Society I welcome Dr. Shane Murphy, who is a postdoctoral researcher.

The format of today's meeting is such that I will invite our witnesses to deliver their opening statements, which are limited to three minutes each. This will be followed by questions from my colleagues. As the witnesses are probably aware, the committee will publish the opening statements on its website. Are we all agreed on that?

Before proceeding to the opening statements, I will explain some limitations as regards parliamentary privilege and the practice of the Houses as regards references witnesses may make to other persons in their evidence. The evidence of witnesses physically present or who give evidence from within the parliamentary precincts is protected, pursuant to both the Constitution and statute, by absolute privilege in respect of the presentations they make to the committee. This means they have an absolute defence against any defamation action for anything they say at the meeting. They are, however, expected not to abuse this privilege, and it is my duty as Chair to ensure that privilege is not abused.

Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the Houses or an official either by name or in such a way as to make him or her identifiable.

I propose we now move to the opening statements. Maybe we could begin with Ms Edel McGinley on behalf of Hope and Courage Collective.

Ms Edel McGinley

I thank the committee for the invitation. Hope and Courage Collective was set up in 2021 and is an independent national organisation supporting communities and civil society across Ireland to be resilient in countering the far right and the spread of hate and disinformation. We have developed a cycle-of-hate framework to help understand how different aspects of political leadership, digital platforms, mainstream media and disinformation create the conditions for haters to shift power in our communities and democracy.

Over the past two years, we have tracked huge growth and spread of hate online and offline, the mainstreaming of far-right ideas and language, and the hardening of progressive policies. This is led by a small core of ideologically committed far-right actors who constantly stoke hate, fear and violence. They benefit by making us afraid of one another and they distract by scapegoating and turning us against one another. The escalation of violent rhetoric, threats and attacks leaves people and communities targeted by hate and racism, feeling increasingly unsafe and afraid to speak out.

Calls to racist violence were recorded across all social media platforms in the lead-up to riots in Dublin, in Belfast and across England. In the past two years, Hope and Courage Collective has provided rapid response supports in 124 incidents. Our community engagement programme provides high-intensity outreach and community support to 58 communities who pride themselves on being welcoming and know what their communities need to thrive. This includes providing support to respond to the two Dublin riots and violence in Belfast, six incidents of election interference, eight of LGBTI hate, library disruption and intimidation, and 41 responses to anti-migrant protests.

This has been driven by the business model of social media platforms, which prioritises emotionally manipulative content through algorithms that drive sharing and engagement for profit. This distorts what people believe to be true, manipulates emotions and heightens tensions. In reality, a handful of big tech corporations and billionaires, who serve their own interests, put people, communities and our democracies at risk. Despite consistent reporting by Hope and Courage Collective of illegal and harmful content to platforms, there remains a huge gap between reporting and removal. Since 2023, there have been huge cuts to trust and safety teams. As a result, harmful, hateful and defamatory content, incitement to violence and disinformation posts remain online, with little action taken by platforms for their removal.

In tandem, there is a failure of the State to protect people seeking asylum without accommodation and those in accommodation centres, who are constantly harassed and targeted by far-right actors. There is also a failure to protect public sector workers, who are often at the sharp end of hate groups whose messages proliferate online; this is acutely felt in libraries and schools.

We welcome the introduction of the Digital Services Act and the powers given to Coimisiún na Meán. The soon-to-be adopted online safety code is extremely important. It is new and untested, however, and it is hard to say how effective it will be in holding platforms to account. This approach is also reliant on trusted flaggers, the public, civil society and others to report harmful and illegal content. An emerging issue is the treatment of trusted flaggers, with no funding made available. There is also a lack of clarity on definitions of illegal and harmful content. We would welcome greater clarity on this in order that social media platforms do not evade accountability.

There is work to be done to increase public literacy on what people can do to keep themselves and their data safe online. It is, however, the responsibility of platforms to be transparent and accountable for the harms they cause. An Garda Síochána's response to online and offline hate is also uneven across the country, although there are some interesting new cases in train.

In the same way there is no single cause that leads to the cycle of hate, there is also no single solution. Instead, it is important to think about what is needed at a societal level to protect the freedoms of us all to thrive irrespective of where we are from, whom we love, our gender or our status in society.

Ms McGinley, I will pause you there because we have gone way over the time.

Ms Edel McGinley

Sorry.

We will take the rest of your statement as read because we all have copies. Is that okay?

Ms Edel McGinley

Okay.

Thank you. I now call Mr. Ciarán O'Connor on behalf of the Institute for Strategic Dialogue.

Mr. Ciarán O'Connor

I thank committee members for the opportunity to contribute to this discussion. My colleague Aoife Gallagher and I represent the Institute for Strategic Dialogue, an independent non-profit founded in 2006 to counter online hate, extremism and disinformation globally. Ms Gallagher and I lead the ISD's research in Ireland and we were invited to contribute to the development of the national counter-disinformation strategy. Our work has documented the spread of falsehoods during the pandemic; homophobic smear campaigns targeting the LGBTQ community, including some politicians; intimidation and harassment campaigns targeting librarians and teachers; and hateful narratives targeting asylum seekers with false claims that often result in violence.

This activity has only increased in recent years, as documented in a series of reports we released in November 2023, days before the riots in Dublin. Social media platforms, driven by business models that prioritise engagement over safety, routinely promote sensationalist, extreme or false content, rapidly spreading it to vast audiences. This makes ongoing monitoring essential to tracking the evolving dynamics of disinformation. Ireland's research sector in this field is still nascent, so such work remains enormously resource-intensive. As conspiracy networks boomed during the pandemic, we saw a growing intersection of misinformation and disinformation, hate and extremist communities online. Permissive digital environments have resulted in blurred lines between a broad ecosystem of online movements and the emergence of an increasingly hybridised threat landscape.

The ISD's research has evidenced three major challenges as regards social media: platforms are inconsistent in enforcing their own rules; platforms' own products and algorithmic systems have been found to recommend disinformation and hateful content; and platforms are now actively shuttering data access for researchers, making assessment harder. To resolve the information crises experienced in recent years, these three problems must be addressed. Transparency, including the type of data access for independent researchers now mandated by the DSA, is essential for improved platform accountability. This will enable the assessment of platform moderation efforts and, indeed, the assessment of potential overmoderation of legitimate speech. While illegal content should be removed, when it comes to falsehoods, we must balance the protection of freedom of expression with protections from harm.

In 2024, our work in Ireland has tracked misinformation and disinformation targeting our democracy, including threats and violence against local election candidates and a campaign spreading false claims alleging voter fraud was rampant in Ireland, a first for this country.

Our message to conclude is that with emerging technologies like generative AI, digital threats to democracy are growing. A robust whole-of-society response is essential, one that encompasses proper regulatory oversight and enforcement, meaningful platform transparency and data access, and investment in digital literacy education to empower citizens to critically evaluate the evolving information environment. I thank the committee and we look forward to answering any questions members may have.

I thank Mr. O'Connor. I call Ms Martina Chapman on behalf of Media Literacy Ireland.

Ms Martina Chapman

I thank the committee for the opportunity to contribute to this meeting today. Media literacy is an umbrella term for a constantly evolving set of skills and knowledge that are required by people to function effectively, safely and ethically in a media-dominated world. Media literacy needs vary considerably from person to person and will change based on personal circumstances, life stages, social norms, changes in technology, and even changes in legislation. As such, it is a lifelong learning process that cannot be delivered by any single organisation or sector. At the heart of media literacy is the ability to understand and critically evaluate media messages and assess information to facilitate informed decision-making. Media literacy is increasingly being recognised in policy agendas and in legislation as a tool for helping to counter disinformation.

Media Literacy Ireland, MLI, is an independent, informal alliance of organisations and individuals working together on a mainly voluntary basis to promote media literacy. Facilitated and funded by Coimisiún na Meán, it is an unincorporated body with more than 350 members drawn from many sectors, including media, communications, academia, online platforms, libraries and civil society. Since 2018, MLI has established itself as the go-to body for media literacy in Ireland and has created a successful communications and project delivery infrastructure involving many stakeholders across the island. Our work has been acknowledged in numerous national frameworks and strategies, and our multi-stakeholder approach has been recognised as best practice by a number of international bodies.

Of particular note is the Be Media Smart campaign, which encourages people to stop, think and check that the information they are getting is accurate and reliable. The campaign is supported free of charge by a wide range of MLI members, and it directs people to bemediasmart.ie for help and support. The last iteration of the campaign reached in excess of 3.1 million radio listeners weekly while the TV campaign reached at least 5.3 million people. More than 1 million were reached via news publications. A TikTok-TheJournal.ie initiative achieved more than 6.4 million video views. Independent research shows that unprompted awareness of the message peaked at 23%. For context, 13% to 17% is considered very good for similar campaigns. The initiative has been noted as best practice and the concept has been adopted in six other European countries.

Our work with EDMO Ireland has resulted in training more than 100 community leaders to deliver Be Media Smart training in their communities, and we are actively working with the library network to explore how we can help them achieve the objectives in the national public library strategy. We are also represented on the national counter-disinformation strategy working group.

Building resilience to disinformation requires reliable, well-resourced, cross-sector infrastructure that can facilitate the delivery of messaging and practical support in a tailored way to a diverse set of people over an extended period of time. MLI has been successfully bringing together diverse stakeholders to deliver tangible results in this area, and we look forward to continuing to do so, as well as expanding our reach, influence and impact, with the support of Coimisiún na Meán and other key stakeholders.

I will stop Ms Chapman there as she is over the time; members can take the remainder of the statement as read. Is this okay with colleagues? Yes. I now invite Dr. Shane Murphy on behalf of DCU.

Dr. Shane Murphy

I thank the Cathaoirleach, Deputies, and Senators for the opportunity to contribute to this discussion on the State’s response to online disinformation. As the committee has heard from many witnesses over the years, disinformation is a complex phenomenon. It can be difficult to understand and there are no easy solutions. Moreover, it is always worth remembering that, in a democratic society, disinformation is not illegal. Nevertheless, it clearly can be harmful and the State has a duty to minimise harm. In addition, non-State actors contribute significantly to this effort.

The Institute for Future Media, Democracy and Society, FuJo, at Dublin City University co-ordinates a number of projects to understand the dynamics of disinformation and how to counter it. In particular, we co-ordinate the Ireland hub of the European Digital Media Observatory, EDMO, an EU-wide network of researchers, fact checkers, media literacy practitioners and data analysts. It must be noted, however, that this project is only part-funded, which limits its capacity. Nevertheless, the funding helps DCU monitor the EU code of practice on disinformation, research the effectiveness of countermeasures, and develop practical media literacy outputs. It also helps our partner TheJournal.ie develop its fact check team and expand its efforts to deliver reliable information to the public. Similarly, it helps our partner NewsWhip develop its media monitoring platform, which allows journalists, fact checkers and others to understand the flow and influence of information online.

There is great demand for the research, insights, and tools generated by projects like this. This is evident all across society as librarians, teachers, community organisations, businesses and journalists respond to concerns about disinformation and try to build societal awareness about bias and influence. Given the scope and depth of civil society actions, we suggest there is no shortage of expertise and good intentions in Ireland. What people often lack, however, is a solid evidence base to inform their work. It is essential that responses to disinformation are grounded in evidence, but this is challenging because the online environment is always evolving. The rapid mainstreaming of generative AI is evidence of this, but so too is the change in content policies at companies like X.

To support both the State and the civil society response to disinformation, we believe two kinds of evidence are needed. First, we need objective evidence of online media trends and practices. There is currently no regular and reliable source of information about media trends. That means there is only a partial understanding of what kinds of media platforms and figures are influential among different groups in Ireland. To give one example, the extreme misogynist Andrew Tate was a well-established figure in the media diets of Irish young people long before he came to the awareness of most parents and teachers via mainstream media. It should also be obvious that efforts to counter disinformation and build media literacy are greatly limited if we do not know what media people are consuming. Second, we need research on the effectiveness of disinformation countermeasures. Research on disinformation countermeasures is in its infancy. Consequently, public officials, journalists, and others often do not have good insights on how best to communicate accurate information to different groups. It is a mistake to assume there is a one-size-fits-all solution to communicating good information. Relatedly, there is a need for research on how to integrate media literacy and so-called pre-bunking techniques into communication practices.

We note that there are important new legislative structures in the EU and Ireland to address disinformation and related harms, most notably the Digital Services Act, which will be mainly implemented by Coimisiún na Meán. These structures, we are told, create a new paradigm for digital platforms. Time will tell if that is true. It is already telling that civil society has responded so vigorously to the onslaught of harms emanating from online platforms. Civic-minded people and organisations across the country are putting out digital fires. It is the job of politicians and regulators to ensure that burden is also shouldered by the platforms that have played an outsized role in creating these problems.

Very good. I thank Dr. Murphy. I will now call my colleagues, who may have questions, thoughts or observations. We will begin with Deputy Ring as I know he is under pressure with other meetings.

I welcome all of the witnesses. I thank them for coming here today. We are discussing a very serious issue. It is probably the most serious issue facing society in the next 30 years. One of the issues that concerns me, which was also raised by one of the speakers, is that misinformation is not actually breaking the law. That needs to be dealt with quickly. The second thing that needs to be dealt with is the funding to organisations trying to stop this misinformation. The Internet is an uncontrolled medium. At least the national media we have, such as RTÉ and the local and national newspapers, are a controlled media. I do not mean controlled by the State; they are controlled by libel law. Misinformation put up on Facebook and elsewhere is a real challenge to democracy at the moment, and it will remain so until we start prosecuting the people who put it up. If something is not done, something very serious will happen. It has already happened, as some of the previous speakers have said, with riots in Dublin and riots in Belfast. We have unrest all over the world and we see the way the Internet can be used to bring people together to spread misinformation and to organise illegal gatherings. We need to do something about it.

The one thing that worries me, and I would like the witnesses to comment on it, is whether governments are doing enough. The European Union is great for regulating. It regulates and regulates, but it is funny how it has not made any effort to regulate this industry at all. This will have to be dealt with, not just at national or EU level but also at world level. Everybody is going to have to be singing from the same hymn sheet on this.

While social media and all that is very good for getting information, people are getting misinformation. It is coming to the stage - I heard about a survey on this on the radio yesterday - where even children as young as six or eight are now beginning to doubt the information that is on the Internet. They are worried about the information. They do not know what is truth and what are lies. We have to do something about that. Those are all my comments for now. I will come back in again.

Does anyone wish to respond to the Deputy's observations or concerns?

Ms Edel McGinley

Are governments and politicians doing enough? No. It is incumbent on this committee to have the platforms before it in public session to answer some of the questions the Deputy put forward, which are very serious. We do not have the full answers. We do not have transparency from platforms. The majority of them are headquartered in Dublin. We have, and can have, an outsized impact, as can the Deputy as a political representative, on how we force these companies to be more transparent and open about how they do their business. We would welcome the committee inviting them in. We have previously seen these platforms come in, but in closed session. That is not a public forum, which is not good for democracy either. That is one of our key recommendations to the committee.

Ms Niamh McDonald

To get a little more into the weeds of things, a number of things have happened over recent years that have increased hate and disinformation online. One of the big things is the cuts to trust and safety teams within platforms. They are the moderators of content. They are humans who are moderating content. We are now seeing a shift towards computerised content takedown. A range of platforms have sacked between 25% and 85% of their trust and safety teams and have directed all these resources towards automated decision-making tools. We see that as a big issue because we have to understand the localised context in which everybody works. We had local and European elections. The ISD spoke about the rate of disinformation around those, which we also monitored. There was also the lead-up to the riots in Coolock and Dublin city centre. If we do not have that human context moderating the platforms, we do not have trust in, or an understanding of, how content is being taken down. TikTok announced last week that it will cut 25% of the employees moderating in its trust and safety teams in the UK. Twitter has cut its teams by 43%, Google has cut them by one third and Meta, in 2022, removed more than 200 content moderators. When platforms say content is being moderated within their rules, we are not sure how that is happening. There is no transparency.

To reinforce what Ms McGinley said, in January and March of this year, the platforms X and Meta got private sessions here. This was post the riots, post the violence and post the impact there has been on communities. I lead the community engagement programme in the Hope and Courage Collective. I work with fabulous communities throughout the country who are trying to make their communities resilient and communities of welcome, but they have this onslaught of consistent disinformation from international companies. That is generating fear. The vast majority of people throughout the country are welcoming, engaging and democratic. They want to have the best for their communities, but there is no oversight or accountability regarding social media platforms. That is what we are up against.

I agree with Ms McDonald on that. I was not on that committee at the time. I know-----

For clarification, Meta was here in public session. Twitter was not; it was in private session.

To be fair to the Cathaoirleach and the committee, these guys think they are above the law. They actually are above the law. They think that because of the wealth and jobs they are creating, they do not have to have any responsibility to community, democracy or the country. To be fair to Brazil, while it might not be the most democratic country in the world, it put a bit of manners on these companies. It closed down their system. We are not powerful enough to do that here. I will say one thing: my colleagues here will suffer this in the general election, and we will have a very difficult time with misinformation. It is very unfair on candidates that, if something is put up online, it takes two or three weeks to get that misinformation taken down. They cannot recover because of that. There is not enough access to these organisations when somebody does something wrong. Have we seen anybody being brought to the courts for that? Maybe somebody can answer that. I have never seen anybody being brought to the courts for online misinformation or for slandering somebody online. That is something we need to deal with, and deal with quickly.

I agree with the witnesses that the Government is not doing enough. It is because there is a vested interest, which is the jobs and the tax returns from these companies. However, what is more important - money or democracy? We have had democracy in this country for more than 100 years, but we will not have it if this continues. The kind of stuff that is going up online would not do anybody any good. I do not know how people can watch that morning, noon and night, which they do. There has to be something wrong with anyone who does that. You could not watch that kind of rubbish day in and day out.

The best thing I ever did following the most recent general election was to close down my social media altogether, and do you know something? I was the happiest man in Ireland. My head was saved and I was happy. I did not look at rubbish every morning, upsetting myself, my family and everybody else. I managed without it. I was elected over 30 years without social media. It is a nice tool to have. It is nice to be able to pick up information about travelling and this and that. It is positive when it is positive. However, what I see happening in America is awful, for example, one of the men - I cannot remember his name - working for Mr. Trump's campaign. The power that man has in the world is more than that of any president. That is the way he feels. Those are my comments.

I welcome all the witnesses to what is an extremely important and topical matter that we have consistently dealt with over the course of this Oireachtas term. The witnesses and Deputy Ring made the point about whether governments are doing enough. The fact is that during this term we have at least seen a Government try to tackle this issue. That is very important. Legislation has been introduced and there has been engagement, which Ms McGinley mentioned. The Government, including the Taoiseach and the Minister for Education, has engaged with these platforms. That is a very important step. I wanted to say that at the outset.

In the broader sense of what we are discussing, I sometimes smile because disinformation is not new. It is just that the platform is different. Any reasonable analysis of the tabloid battles in the UK in the eighties will show it was no different then. It is just a different platform. It is just a different product in terms of how it is being conveyed. Even within that, there is nuance. This is why it is important, and Ms Chapman raised this previously, to understand media literacy. There is obviously a difference in slant. That is very pronounced in a country like the UK where, depending on whether somebody buys The Daily Telegraph, The Guardian, The Mirror or The Sun, there is a different slant on what is still, fundamentally, basic news. There are lies beyond that. We also see that in the USA election in respect of Fox News, CNN and the various news channels.

We are very lucky in Ireland. I point to the recent Digital News Report. Dr. Murphy did an analysis of people's interest in news as well. I will start with him. The report noted that there was a significant difference between those aged 18 to 24 and those aged over 65 when it comes to interest in news: it registered at 30% for those aged 18 to 24 and 73% for those aged 65 and over. That goes back to Ms Chapman's point on media literacy. That is the issue. What is Dr. Murphy's professional analysis of those stats? What is his sense, from walking around campus, of the genuine interest of young people in accessing news and where they access their news?

Dr. Shane Murphy

I did not work on the Digital News Report but I am familiar with it.

You are credited with analysis of it anyway. I was trying to give you a compliment.

(Interruptions).

Dr. Shane Murphy

On where young people get their news, one thing that comes up somewhere else in that report is that 18- to 24-year-olds, and it might even go up to 18- to 34-year-olds, are, compared with other groups, far more likely to get their news incidentally. They are not going to specific news apps or specific news sources. They are not deliberately seeking those out.

They are getting their news or, as far as they are concerned, they are informed about current events and things that are going on as it comes up on their TikTok feed, their Instagram For You page, their Facebook page as they are scrolling through a wall. They are not necessarily considering where it is coming from or who is producing it but this is the primary way many of them are consuming most of their news.

Okay. There is the analysis he quotes, but from Dr. Murphy's own engagement with undergraduates on campus and so forth, is that reflected in what he sees? This is going to sound very outdated, but when I was doing my journalism degree, you could walk into campus and buy The Irish Times for a discounted price of 50p. There was a genuine need to go and buy a newspaper to be updated. Are undergraduates today worried about the type of news they are getting, and are they questioning it more? There is the famous tagline of the Irish Independent: "Before you make up your mind, open it". Does Dr. Murphy get the sense from young students that they are actually questioning where their news is coming from?

Dr. Shane Murphy

No, I do not.

Okay. On that point, in terms of the work with media literacy, if the view of a professional on campus is that he feels young people are not really too worried, the challenges we face are clear. Ms Chapman referred to a cradle-to-the-grave approach in her closing remarks. If we do not engender a sense of that necessity within the primary education system, by the time you get to campus, you are-----

Ms Martina Chapman

This is a really good point to pick up on one of Dr. Murphy's earlier points. Unless we have hard evidence on, say, the category of 16- to 18-year-olds, and are monitoring that on a regular basis, it is very difficult to say with any certainty what level of media literacy or awareness exists or how people interrogate information. If we had that body of evidence, it would be much easier to sit here and say we think a particular group of people are more vulnerable to misinformation or disinformation. We would be able to look at that and then create the bespoke interventions we think would be most likely to work for that group.

To pick up on Deputy Ring's comment earlier on, I have not seen the research he mentioned. However, from a media literacy point of view, I am a bit more reassured to hear that young people and children up to the age of six or eight are actually questioning what they see online. I would not necessarily see that as bad news.

That is true enough.

Ms Martina Chapman

It is actually a positive. The whole issue of disinformation is extremely complex. Any solution to it will be very complex as well and will require different organisations and different sources of support at different stages. Absolutely, it is about the cradle-to-the-grave approach. Again, this goes back to the research. In order to effectively develop a cradle-to-grave approach, we need to understand the difference between all of those different life stages. It is not just ages; it is also socioeconomic factors. There is a huge number of factors.

Going back to the point about whether the Government is doing enough, technology is only one part of this. The Deputy is absolutely right. It is a different platform. It is a different way of disinformation being spread. Yes, it is potentially much faster and with a much greater reach, but there are specific factors that will make specific people vulnerable to different types of disinformation, and that is a completely different conversation we probably need to be having, involving different parts of Government as well.

Absolutely, we need the cradle-to-the-grave approach. Yes, we need media literacy in schools. It is already in schools, but not as a bespoke subject. It is kind of cross-curricular. There are advantages and disadvantages to that. One of the advantages is that it can be brought into different subjects and be talked about. Where the vulnerability with that approach lies is with the teachers. It is about making sure the teachers are skilled and informed enough to be able to facilitate those conversations in classrooms. Some countries do have bespoke media literacy classes, but there is an issue around that as well. The technology changes so quickly. The other factors, such as social norms and even legislation, change so quickly.
It takes so long to get a curriculum agreed and signed off, to get teachers upskilled and it into the classroom, that by the time it is in the classroom, it needs to be changed again. The cross-curricular piece is really important. With young people and people at school and in formal education, there is a great mechanism there to deliver media literacy learning but it is people who do not have regular access to formal learning opportunities who are probably more at risk and need more support.

Ms Chapman is so right in what she says about the skill set a teacher would need to deliver that. I remember being in secondary school and our teacher bringing in the Sunday newspapers to the English class and asking us to analyse the advertisements that were in them. This was the 1990s. We were asked to understand the subliminal messaging. That was not on the curriculum but that was her trying to get us to understand the nuances of messaging. That was outside the curriculum. It was her trying to drive and probe our minds.

On the issue, or correlation, maybe, of the decline in print media, is that significant in terms of the rise of disinformation?

Ms Martina Chapman

This picks up on the point Deputy Ring made earlier, in that there is a difference between traditional regulated media and other media. It is not exactly a level playing field. It is quite difficult for somebody who is receiving information, potentially online. We are asking a lot of individuals to take a step back and question if they know who produced a particular piece of content, what their intentions were, who owns that media, what that media is trying to do and if it is trying to inform or entertain or is trying to persuade them to do something or think a particular way. However, we are then asking individuals to ask if something is regulated and if they understand what that regulation means, the guidelines and the rules that are around, for example, print or broadcast media and how that differs from content they might engage with online.

Ms Chapman poses an interesting question. One wonders whether the people back in the eighties or the nineties picked up a particular product and questioned whether it was a Rupert Murdoch-owned entity either. Maybe we are being unfair today. Mr. O'Connor made a point in his remarks regarding the balance of the freedom of expression with protection from harm. Here is the ball game. For example, at 4 o'clock this afternoon, the Criminal Justice (Incitement to Violence or Hatred and Hate Offences) Bill, which has caused a lot of debate, will be before the Seanad. It is finding that balance. Mr. O'Connor made that statement but I ask him to go beyond that. Where does he think that balance is? That is where the ball game is at.

Mr. Ciarán O'Connor

I thank the Senator for his question. The struggle with tackling or combating disinformation, as it is viewed under the DSA, for example, is that it is legal but harmful. Platform guidelines and platform moderation techniques are based around trying to find that balance between what is legitimate speech and what is potentially harmful speech. Disinformation goes hand in hand with hate speech as well. Falsehoods, particularly those targeting marginalised groups, are often used to create polarising narratives. We have seen multiple instances where falsehoods about asylum seekers or migrants spread online and often result in vigilante-style violence. Here we see smears, rumours and speculation leading to offline violence. Our interest is in platform accountability and the downstream effect of objectionable, potentially hateful, speech. I think that is it for the moment.

Okay. Ms McGinley and Ms McDonald made the point regarding the platforms. However, the platforms were before the committee both in public and in private session and one thing I have a sense of from them is that they will make a change in their business model and algorithms if they feel there will be a public mood that is negative towards them which impacts their commercial clients who subsequently do not want to be associated with a certain platform. You can see that with certain platforms, large corporate entities will migrate because the platform has a negative connotation for them and their brand. Notwithstanding that governments, states and the European Commission have a huge part to play, the companies themselves will know this will have a negative connotation for them. Would Mr. O'Connor agree with that synopsis?

Mr. Ciarán O'Connor

I would. The question of evidencing this comes back to transparency and data access and, for us, that is one of the major challenges. I know other organisations here today echo that also. It is like trying to solve a jigsaw with only a handful of pieces. The research we conduct is based on the limited data to which we have access. Platforms are opaque rather than transparent. I think of a tool called CrowdTangle, which Facebook offered for a number of years and which allowed researchers to view content being shared by public groups and pages and to track the spread of content across platforms. This was an enormously valuable tool for researchers conducting analysis, but it was shuttered during the summer. We have seen backward slides by platforms in providing tools for accountability and transparency. That is one of the key challenges in holding platforms to account to force some change on their part.

That is a fair point. I thank Mr. O'Connor.

Ms McGinley and Ms McDonald have set out a specific number of asks for the committee. I note there was a request to ask the Taoiseach to put pressure on social media platforms. The Taoiseach, the Department of Education and the Minister are actively engaging on that. I would support that request. They specifically mentioned the recommender systems, which is something I have personally called for action on, both here and in the Seanad. The majority of people are not aware of them. Going back to media literacy, if you are aware of a recommender system and happy with it, that is okay, as it can tailor and provide the content you are interested in, but if you are not aware, it can have negative consequences. On committee support for a public inquiry into the harms, that is something this committee - while it is not a public inquiry - has done with civil society and young students as part of its analysis of the legislation. There was extensive engagement in terms of examining the harms. Many of the areas we looked at were quite scary. With regard to the area the collective delved into, and touching on Mr. O'Connor's point on understanding the cycle of hate and how that is having an impact, I would like to hear more.

Ms Niamh McDonald

I can speak to that. We came to this from working with communities on the ground and understanding that the online sphere has a real-world impact. Members have a copy of the cycle of hate diagram to hand. I will begin with the disinformation and hate speech part of the cycle. If we go back to November 2022, when a lot of the anti-migrant mobilisations in our communities began, that led to a lot of real-world violence. A lot of disinformation and hate speech was generated and amplified online. The biggest point that has not yet been mentioned is that this is all being amplified by the recommender systems. A lot of this content will land on people's newsfeeds - on children's newsfeeds and those of young adults and older people - without them even seeking or looking for it. That is a big point that is missing. We see the amplification of the disinformation and hate speech. That leads to isolation and exclusion in our communities. A lot of disinformation and hate speech others minoritised communities and creates a sense of isolation, exclusion and fear. We then come to polarised communities that feel they have to be fighting against each other for many different reasons, whether resources or fear. We then have the motivated haters, and we see copycat tactics. For instance, we have documented 58 private "says no" Facebook pages with anti-migrant, far-right, Nazi content. Many of these have places where people can post anonymously, which means there is no record of accountability. We do not understand from Meta where that comes from, as it is on Facebook. We can see this as a pipeline of hyper-local private Facebook pages. We saw it during the local and European elections. These pages can spread copycat tactics straight across.

Deputy Ring is talking about canvassing. These are people who have gone out and tried to put cameras in the faces of canvassers or candidates and they copy that going from place to place. We saw that being copycatted across Facebook pages. That leads to the next part of the cycle, which is events and issues. We can see what led to the violence in Coolock on 15 July and to what we saw on 23 November in Dublin almost a year ago. It moves with speed. That creates a chill effect on mainstream politics, almost as though there is a fear of a backlash. People are almost pre-empting a backlash and therefore do not want to speak out. That will temper what needs to be said in politics. We then go on to a hardening of policies and reactionary narratives. That is a rollback on progressive policies, especially toward people seeking asylum. We need to respect the needs of people coming here seeking safety, the processes in place and the systems that protect their dignity and rights. That is not happening with a lot of young men who are on the canals in Dublin city centre at the moment. We then get into media and public debate and see that this disinformation and fear is driving the media and public debate. That is amplifying polarised views and normalising fears.

I want to go back to the work I do in communities. The vast majority of communities right across this country are welcoming and engaging and want to support people. However, that is not the dominant narrative in our media. What is coming through is the negative about immigration. What we want to do is shift the balance back to what is actually happening in our communities, not the amplification that the recommender system is delivering to all our devices day in, day out. That is how we look at the cycle of fear.

I wish to thank all of the witnesses for the work they do. It is of such importance, not just at Government level but also among those who work in and analyse the media. That is hugely important and I thank them for their work in that respect.

Senator Míchéal Carrigy is online and he may have a few questions lined up.

I apologise for coming in late to the meeting. I have a few questions. We did a lot of work on the Future of Media Commission report that came out a couple of years ago. If I am correct, the figures demonstrated over 70% trust in mainstream media and less than 30% in online media. That was because there is a significant amount of disinformation across online media. It is important, therefore, to see the supports we have in the budget, that is, the €6 million set aside for independent broadcasters. We need to support the tried and tested media, be that written, radio or television, that is putting out correct information. I have a few questions about things we have discussed at committee meetings before. We met Coimisiún na Meán and, speaking as a parent, our eldest is 12 years old and starting secondary school, which has opened my eyes to media, phones, etc., and the difficulties parents have. Coimisiún na Meán is looking at having minimum ages for social media accounts and putting the responsibility back on social media companies, which is something I believe in myself. They are making multi-billions, we are in the age of AI, etc. I cannot see putting proper age verification in place to reduce the number of young kids on social media as being a huge difficulty. We have seen statistics of kids as young as eight, nine and ten years of age on the various platforms, be they TikTok or Snapchat, etc. What are our witnesses' views on that?

On an issue that has come up in the budget, I was at Moyne Community School here in Longford on Monday talking to TY and third year students. The big issue that came up with them for me as a public representative was the proposal being put forward for these wallets for their phones in schools. They are totally against it, but the reality is that we have students in schools who, when they have access to their phones, are accessing social media during the day and even during classes, and that is going to have an effect on their education. Do the witnesses have views on that, and what would be an alternative? There were very differing views in the two classes that I met.

One class would have been in favour of phones and not having access between 9 a.m. and 4 p.m. while the other would not accept that and was in favour of the phone being on the teacher's desk during the class but the pupil getting it back afterwards. Regarding the challenges around social media, we need to ingrain this into our education system at primary level to make sure that children are aware and information needs to be sent home to parents because a lot of parents are not aware of what is on social media or actually understand social media or the various platforms. What are the witnesses' views regarding the minimum age and access to phones in schools, etc.?

Ms Aoife Gallagher

There seems to be widespread support for measures to protect children from harmful content, which is very welcome. It is also important to realise that the addictive nature of social media does not just harm children's rights; it also harms adults' rights, so we have to think about it at a cross-society level. A couple of solutions have been put forward around age verification, one of which is that the platforms will ask for some form of Government ID to prove age. We need to be wary of handing over any more information to social media platforms that have proved they do not act responsibly with the personal data handed over to them, so I would be very wary of approaches that adopt that measure.

Issuing blanket bans - banning children from social media - is not realistic at all. I am sure we all remember being children. There is no doubt that they will find a way. However, there are other solutions. Again, it goes back to the algorithms. I am not saying this should be only for children; it should apply to everyone who uses social media. The algorithms are where the harm is coming from. They are amplifying harmful information every day. People think we live in a free speech environment on social media. We do not. It is completely curated. If you go on to TikTok, you are not deciding on the videos you see. An algorithm is deciding that for you, which is extremely problematic. There is more the social media platforms can do around enhancing parental controls. I did research, which we released during the summer, looking at YouTube in particular and the recommendations that come through YouTube accounts set up as those of young teens - 13- and 14-year-olds. It was very clear that YouTube made no distinction between the information it was sharing with a 13-year-old's account and the information it was sharing with a 30-year-old's account, which is really problematic. There is so much more the platforms could be doing in terms of child safety. They all have rules and may have very detailed policies around child safety, but those policies are not being enforced, which is where the issues lie.

Ms Edel McGinley

I echo Ms Gallagher's concerns. We cannot lose sight of the fact that we are talking about big for-profit corporations that have a fiduciary responsibility to their shareholders to build that profit. When we talk about the use of social media, we need to elevate the conversation to the fact that we are talking about big corporations that are making profits. To do that, they will do anything they can to keep a captive audience, so the more they can feed people and the more advertisements people see, the better it is for the company, as it will make more profits. We are talking about a for-profit industry. The recommender system and that amplification are key to reducing the harms being caused right now. We cannot stress that enough to this committee.

Ms Martina Chapman

A really interesting point was raised. There are probably parents across the country who are thinking, "Yes, I'm feeling that too". Children and parents face particular challenges online, particularly around the time of the move to secondary school. There are a lot of benefits in technology for children of that age. There are numerous risks, as have been pointed out already. As for depending on some kind of technological intervention to ensure that children are safe online, I have been in the area of media literacy for a very long time. At the beginning, a protectionist approach was taken to online child safety. There was a belief that we could try to protect them from all of the dangers. Technology changes. As Ms Gallagher said, children will find a way to get to the content they want to get to.

This comes back to media literacy. It is similar to the issue around disinformation. There is no simple answer to this. It is complex, and the solution will probably involve a number of different approaches. One of those is about empowering parents to support their children on this journey. That is not easy because it is a bit like the situation with teachers. Parents need help as well, particularly when we feel our children probably know more about this technology than we do. How on earth do we start this conversation? Webwise.ie has phenomenal resources, not just for young people and teachers but also for parents, that are specifically designed to help parents start to have those conversations with children, to build relationships and talk about technology and its use and, going back to the media literacy piece, to understand the role of recommender systems and algorithms, how content reaches you, whether it is appropriate and what the problems might be around that.

Does Senator Carrigy have any follow-up questions?

The committee did a lot of work on the Online Safety and Media Regulation Act 2022. Are there enough protections in that Act? Do we need to strengthen it? Can Coimisiún na Meán enforce it, or could and should it do more in this area?

Ms Edel McGinley

The online safety code is due to be adopted quite soon. It is early days in terms of enforcement for Coimisiún na Meán. Part of its role is to look at systemic failures. In order to do that, it needs more time to look at those systemic failures over time. It is a new regulatory regime. This all came from the EU. There is a harmonised approach across the EU. Ireland has set up its digital services co-ordinator where other countries have not. We are a bit ahead of the game in some respects but in terms of the algorithms and the recommender system, this is something we need to push at EU level. It is not just a member state responsibility because a more harmonised approach is being looked for at EU level. We have a representative there in the new Commissioner so it will be incumbent on us to put pressure there around his new remit. This has a European dimension that needs to be taken into account.

Is the Senator satisfied with that?

Ms McGinley spoke about moderators and how each of the platforms has had significant job cuts in the area of moderators. Could she expand on that? I think she said Google cut one third of its staff.

Ms Edel McGinley

I will let Ms McDonald answer that question.

Ms Niamh McDonald

These are trust and safety teams. Since 2022, there have been significant cutbacks to trust and safety teams across all platforms.

Has there been a domino effect, as in one did it so they all followed suit?

Ms Niamh McDonald

Yes. It was almost like a spread. There were a lot of job losses in tech companies in Dublin-----

That is where they took the jobs-----

Ms Niamh McDonald

Between 25% and 85% of trust and safety teams have been cut. The Communication Workers Union in the UK announced this week that TikTok is to cut 25% of its 500 UK jobs. We do not know about the levels in Ireland but we have the US figures from across the companies. Twitter cut 43% of its trust and safety teams in 2023.

Twitter also effectively disbanded its ethical AI team last November and laid off all but one of its members. It talks about ethical AI but removed that ethical AI team. It says it is relying on AI to moderate but has removed the ethical AI team. In February Google cut one third of a unit that aims to protect society. We go back to what is happening on YouTube where, whether you are aged 30 or 13, you see the same content. In 2022, Meta reported ending the contracts of approximately 200 content moderators in early January, as well as at least 16 members of Instagram's well-being group. Amazon and Microsoft have also downsized their AI teams, so it is right across the board.

As I stated, trust and safety is one arm of it. We have got amplification. There is trust and safety, but the content is already up. We have seen numerous politicians talk about dangerous content being left up for a long time and not being removed.

I go back to the context of working in different localities, whether in Ireland, the UK or Europe, and understanding that at different points in time different issues can become volatile. We had the local and European elections. We had the riots. A lot of that disinformation begins and gets amplified online. We do not have trust that content is being removed right now, whether it is the Garda, the Taoiseach or small organisations like ours calling for its removal.

Would it be fair to conclude that the comfort the platforms felt about removing jobs from the moderation and trust and safety arms of their organisations was because enforcement is not there, and they feel it is soft-touch oversight?

Ms Niamh McDonald

Yes, and I think it is about their sudden use of AI technology because they see it as a way to reduce moderation costs for the organisation. We also see that, within that, there has been a massive increase in disinformation and fear-----

Has that been since the moderation teams were reduced?

Ms Niamh McDonald

Since November 2022, we have seen significant anti-migrant violence, arson attacks and riots in Ireland. That has coincided with the drop in trust and safety teams across all social media platforms. In our work doing rapid response, the nub of what we find normally starts online. Then it is the amplification. It comes back to the recommender system. Trust and safety teams are extraordinarily important for removing content as it is happening, but the recommender system produces that content in the first place, spreads it, and it lands on your phone. There is a two-pronged approach. We cannot rely on social media companies regulating themselves. We have had these companies for the past 20 years and they have not created that sense of comfort for us to date. Otherwise, we would not need to be sitting in this room right now.

All of the witnesses are extremely passionate about this, and it is wonderful to see. This Government has proposed hate speech legislation. There was push-back on that from certain quarters. My view was that this was one area where it would have been hugely helpful. However, the witnesses are the experts. I ask each of them if they feel that could have had a positive impact, or any impact at all, on the disinformation and hate speech online. Have they any thoughts or views on it?

Ms Niamh McDonald

I will keep going and say it goes back to amplification and removing the content. People can say-----

Would it have added more pressure to the platforms?

Ms Niamh McDonald

It depends on whether you are able to hold the platforms accountable. What is the legislation that allows you to call the companies in to get content removed? Are you relying on the Garda to get it removed? What will the process be?

Even the Garda gets the runaround when it comes to taking information down.

Ms Niamh McDonald

Yes. We have to put the responsibility back on these multibillion profit-making organisations as my colleague has said. They have the tools, the energy and the resources to be able to remove and moderate the content. Small organisations like ours and the Government do not. We have to keep our focus on the people who have the power to remove it and how we can regulate that in a way that ensures all our communities, no matter who they are, where they come from, or how they identify, feel safe.

Mr. Ciarán O'Connor

I will comment on the recommender and trust and safety systems, but also on wider teams at social media platforms. There is an infamous example. In the lead-up to the 2020 US election, Facebook enacted what it called news ecosystem quality scores. This was essentially a break-glass measure whereby it weighted its algorithms to offer up more content from reputable news organisations to users. It turned this off not long after the election, but one of the findings was that, because of these changes to the algorithm, people were spending less time online. That was a fundamental issue for social media platforms. The implementation of these guidelines has never been shared, which comes back to an issue of transparency with platforms, their internal awareness, and their potential culpability in knowing how these algorithms help to promote and expose people to potentially harmful, hateful and false content online.

Ms Aoife Gallagher

I will speak to the hate legislation. It is clear that our current legislation is not working. It is either not being enforced properly or it is not suitable for the environment we are now in. We released some research a couple of weeks ago where we documented incidents of anti-migrant violence over three and a half months. We documented more than 60 incidents. These are not just people saying bad words online. These are all offline incidents and are really horrific and horrible. We are only scratching the surface because we are only seeing what is posted online. There is a playbook here. The people who do this will go out on the street and film the confrontation. The most horrific confrontation they find they will post online. That gets fed into an online ecosystem that is also shared by far-right figures internationally. All of these videos are consistently inciting hate and violence towards migrants and asylum seekers. These people consistently post these videos online under their own names. They are posting evidence of themselves committing these crimes and nothing is being done about it. There is obviously something missing there. Either we enforce the legislation that we have or we need something else.

Ms Edel McGinley

I will come in on the legislation because it is also important how An Garda Síochána addresses this issue, and how people who experience hate online are empowered to report harassment and threats. Just because it happens online does not mean it cannot be reported. There is a body of work that needs to be done there to support people who are affected by threats and harassment online so there is greater understanding of their rights. There is a lack of understanding that there are some laws in place to support them. Gardaí also need to upskill in that area because there is an inconsistency across the country in how things are being dealt with. It is very serious.

I thank the witnesses very much for the work they do. While Ms McGinley did not get to finish her statement, we have noted that the last piece was the ask. All of that is documented and it will be included.

Ms Edel McGinley

I will make one last point about the new Appeals Centre Europe. We have some questions about it because this new appeals centre seems to be an attempt by the platforms, in particular Meta, to self-regulate. We are confused about this parallel framework that is being set up. We would welcome some clarity about it, as I am sure the committee would as well. We are not convinced there is enough Government oversight of it. We are worried about the different levels of appeals and that we are again going back to an era of self-regulation. It is something we would be concerned about. We did not include it in our opening statement because it is a rolling issue that was announced only in the past week or so.

I have a real worry about that self-regulation. We see that these companies have already taken staff away from moderation and from dealing with these issues. They will not self-regulate. It is all about money with them. Governments have to regulate. That is their job. Governments have to regulate and put agencies in place, whether these are the Garda or Coimisiún na Meán. I worry. To be fair, the Garda does not have the resources to spend hours on the Internet every day. There has to be an agency to deal with that and report it to the Garda. The Garda should then deal with it.

I will ask a question relating to children. A good point was made that they have to prove the age they are. How do they prove that?

Ms Aoife Gallagher

There is no way to prove it. You can say whatever age you want to be.

That is correct. That is exactly what I was going to say. They can say whatever age they want. The question you get is whether you are over 18. Whether you are or are not, you just tick it off. That is the problem.

I will not hold things up with another question, but we have to come back to this issue. It is getting out of hand. Something has to be done. Remember what I am saying; after the next general election, which is not too far away, this will be on the agenda. It will be on the agenda because people will see what will happen in that election. These people are getting more sophisticated and more organised. They are able to get on the Internet and have six or seven accounts, with no name on them. As was said, even having names on the accounts does not seem to bother them because they feel they can say what they like to anybody and put up whatever information they want. This issue will be the most serious issue. It is already the most serious issue facing the world, not alone Ireland. The kind of stuff that is going up online and the misinformation is why we have such discontent in the world. I thank the witnesses for their work.

I thank Deputy Ring. That concludes our public session. Go raibh míle maith agat gach duine. I ask my colleagues to remain in place. We have some private business to attend to. I ask our guests to leave the room. I propose we go into private session to deal with housekeeping matters and correspondence. Is that agreed? Agreed.

The joint committee went into private session at 2.52 p.m. and adjourned at 3.14 p.m. until 1.30 p.m. on Wednesday, 6 November 2024.