
JOINT COMMITTEE ON EDUCATION AND SCIENCE debate -
Thursday, 10 Jul 2003

Vol. 1 No. 17

Literacy Report: Presentation.

I welcome Dr. Carl Ó Dálaigh, Deputy Chief Inspector in the Department of Education and Science, and Dr. Tom Kellaghan, Director of the Educational Research Centre at St. Patrick's College, Drumcondra. Today we will be discussing literacy, with particular reference to the joint OECD-UNESCO report, "Literacy Skills for the World of Tomorrow". I welcome both witnesses to the committee and thank them for agreeing to meet us at short notice.

Before we begin, I draw attention to the fact that members of the committee have absolute privilege but this same privilege does not apply to witnesses appearing before the committee. I also remind members of the committee of the long-standing parliamentary practice to the effect that members should not comment on, criticise or make charges against a person outside the House, or an official, by name or in such a way as to make him or her identifiable.

I now invite Dr. Ó Dálaigh and Dr. Kellaghan to make a 20-minute presentation, after which we will discuss the issues raised.

Dr. Tom Kellaghan

Today we will be talking about literacy in the context of the report issued last week, "Literacy Skills for the World of Tomorrow". It is a joint OECD and UNESCO publication and reports in the press last week were based on it.

I will talk about the context in which this and other studies were carried out, in particular the concern with quality of education. I will then give a brief description of the programme for international student assessment, PISA, on which this report is based. Dr. Ó Dálaigh will then talk about the main findings of PISA.

The need to improve the quality of education is a common refrain throughout the world, in both industrial and developing countries. While a concern with quality cannot be said to be new, there have been developments in the last decade in the way it is thought about.

Traditionally, quality was thought about in terms of inputs to the education system: the resources provided, physical facilities, curricula, teacher training and books. There has been a shift, however, to a focus on the outcomes of education - what students are learning as a result of their experiences in education - a shift from inputs to outputs when considering quality.

Techniques have been developed to appraise the achievements of an educational system as a whole. We think of assessment in terms of student achievements in the leaving certificate, but there are now techniques to aggregate student responses to a level where inferences can be made about how the system is performing. Statements are not made about the whole system, from beginning infant class to leaving certificate; rather, certain ages or grade levels are picked and a representative sample of students at that age or grade level is chosen. Standard assessment instruments are then administered and the responses aggregated to give a picture of the achievements of the system as a whole. While samples are used in such studies, a number of countries test all students. That happens in Britain, France and, increasingly, in the United States at state level. We have had these national assessments at primary school level for 20 years in basic literacy, English reading and, less frequently, in mathematics and Irish.

People were not satisfied with information about only their own systems and began to look for comparative data with other countries to measure how students in their systems are performing relative to students in other systems - such studies have existed since the 1960s.

However, they tended to be sporadic, underfunded and slow to report, and it was really only in the 1990s that Governments became interested in these comparative data. That interest has been expressed through the OECD, and during the late 1990s the OECD began to develop a programme for international assessments - that is, comparisons of students in member states.

This is regarded as very important as an index of human capital because it is believed that the prosperity of countries depends on the development of human capital. If students are not doing well in schools vis-à-vis competitors, that is likely to create not just educational but economic problems.

The last thing to be said about this in terms of context and quality is that the information obtained from these studies is not considered useful merely to describe the achievements of students in the system; it is supposed to act as a basis or lever for reform. That is, when the information comes to policymakers and reveals strengths or weaknesses in student achievements, or indicates differences between students in different countries, gender differences or whatever, it is then up to the policymakers to make decisions about the allocation of resources to address those issues.

The purpose of the programme for international student assessment is, as I have indicated, to obtain comparative data on student achievements, and it has been working in three literacy domains - reading, mathematics and science. We would normally associate the term "literacy" with reading, but PISA talks about mathematical and scientific literacy. That term is used because PISA does not want the assessments to be just a reflection of what is going on in school curricula. It wants them, as was indeed the case in previous international studies, to focus on the usefulness of the skills that students are acquiring for everyday life and for the future. Thus, the generic term "literacy" is used in those three areas.

Page six of my submission includes definitions of the three. Reading literacy is defined as the ability to understand, use and reflect on written texts in order to achieve one's goals, to develop one's knowledge and potential and to participate effectively in society. Mathematical literacy is defined as the capacity to identify, understand and engage in mathematics and to make well founded judgments about the role that mathematics plays in an individual's current and future private life, occupational life, social life and life as a constructive, concerned and reflective citizen. Scientific literacy relates to the ability to think scientifically in a world in which science and technology shape lives. It is something that all citizens require and is not just for those who are going to be scientists.

So far, PISA is a cyclical operation. The first assessment was in 2000, a second one has just been completed in 2003 and there will be a third in 2006. In 2000, reading literacy was the major domain, and they had what they call minor domains in mathematics and science. They did not fully sample those areas and used shorter tests. In the 2003 assessment, just completed, the major domain is mathematics, with minor domains in science and reading. In 2006 science will be the major domain, with the other two as minor domains.

The reason they test in each area every three years is to provide some kind of data for monitoring trends so that we will be able to look at literacy figures from 2000, 2003 and 2006. This gets a bit complicated because a new report has just come out. A report on the 2000 study was issued by the OECD in 2001. Some 28 out of 30 OECD countries participated, with only Slovakia and Turkey not participating. Data from the Netherlands were rejected because the samples were not regarded as satisfactory, so we ended up with 27 OECD countries in 2000 and four non-OECD countries.

In 2001 the study was repeated in ten further non-OECD countries, so the assessment is moving out to include less developed countries, though not in Africa, for example. These countries are mostly in eastern Europe, with some in Asia and Latin America. This new report now incorporates the 2000 data for OECD and non-OECD countries and the 2001 data for non-OECD countries.

The results do not have much implication for Ireland because all the non-OECD countries, with one exception, did very poorly. They all scored below the OECD countries and the only one that came out of the woodwork and knocked Ireland out of its position was Hong Kong-China in 2001. I ask Dr. Ó Dálaigh now to give the main findings.

Dr. Carl Ó Dálaigh

Starting off with reading literacy, there was a combined scale and Ireland finished fifth of the 27 OECD countries and sixth of the 41 overall when the other countries that participated in 2000 and 2001 are included. Finland was significantly ahead of everybody else, while the others were very much grouped together, including Canada, Australia, New Zealand, Korea, the UK, Japan, Sweden and the United States.

Dr. Kellaghan

There are figures at the back of the paper provided. Dr. Ó Dálaigh has only just got those, which is why he has not referred to them. The first of those is multiple comparisons of mean performance on the combined reading literacy scale. All of the countries are listed across the top and down the side.

Dr. Ó Dálaigh

When we looked at how achievement is distributed across five levels, Ireland had more students at the highest level, level 5 - 14.2% - in comparison with the OECD country average of 9.4%. That level involves the most complex tasks, which require the management of information that is difficult to locate in complex texts, the critical evaluation of texts and the ability to draw on specialised information. Some 11% of Irish students, on the other hand, scored at or below level 1, the lowest level of proficiency. That compares with 17.9% across the OECD countries. If one takes the subset below level 1, representing people with severe literacy problems, the figure for Irish students is 3% as against an OECD average of 6%.

In terms of mathematical literacy, the scores were not as good. The mean score - also listed in one of our diagrams - was close to the OECD country mean, with Ireland ranking 16th of the 27 OECD countries and 17th among all participating countries, because students in Hong Kong scored highest in maths overall, thus knocking us down the table. Students in Hong Kong did not score as well as Ireland in reading literacy.

In the third area of scientific literacy, Ireland performed better, coming ninth of the 27 OECD countries on the scientific literacy scale and tenth among all participating countries. Again, our results were significantly better than the OECD average, while students in the same group of countries - Korea, Japan, Finland, the UK, Canada and New Zealand - performed significantly better than Irish students. Students in Hong Kong ranked third on this scale.

Three different areas of reading literacy were assessed - retrieving information from a text, interpreting a text, and reflecting on and evaluating a text - referred to as retrieval, interpretation, and reflection and evaluation. The results were very much in line with overall achievement; there was not much difference in scores across those subsets of literacy. More detailed analysis was then done on how various groups fared. In terms of gender, females had a much higher overall score than males on the reading literacy test, as was common to many other countries. More males than females scored at the lower levels - for example, one-seventh of males were at level 1 compared with less than 10% of females - while proportionately more females scored at the higher levels. That is reflected in literacy figures across almost all other countries. Males performed slightly better than females on the mathematical test, and the gender difference in science was of no statistical significance.

There were several indices of performance based on home background. The socio-economic status of parents, as indicated by their occupation or their level of education, was related to student literacy scores in reading, mathematics and science. The higher the socio-economic status of students' parents, the higher their achievement scores. The largest differences were between students of parents whose combined highest educational level was upper second level, that is, up to leaving certificate level, and students of parents whose combined level was at most a primary level education. While there was a relatively strong association between socio-economic status and performance in Ireland, the association was stronger in other countries, for example, Germany and the UK.

One might expect that a country's economic circumstances would affect students' achievements as they would affect the resources available both in students' homes and schools. PISA examined relationships between students' achievements averaged across the three areas of reading, mathematical and scientific literacy and the per capita GDP of countries in 2000 which was adjusted for the differences in purchasing power between countries. In general, students in countries with higher national incomes tended to have a higher achievement level than students in countries with lower national incomes and that was further reflected in the work done at UNESCO. The performance in Ireland reflected this trend. In terms of GDP we rank fifth after the US, Norway, Switzerland and Denmark, and sixth in average achievement. However, this correlation was far from perfect. None of the countries that ranked above Ireland in achievement had a higher national income, for example, Japan, Korea, Finland, Australia and the UK, based on that 2000 GDP analysis.

If one looks at expenditure on education, the mean level of achievement over the three literacy domains was also related to the amount of public and private money expended per student from the beginning of primary education at six years of age up to the age of 15, the age at which this survey was done. There was a tendency for a country's mean level of achievement on PISA to increase as expenditure per student increased. This, however, was not the case in Ireland, which ranked much lower in terms of expenditure but sixth in terms of overall achievement. The OECD report notes that moderate expenditure per student cannot automatically be equated with poor student performance, pointing out that Irish students perform significantly better than German students although Ireland spent 25% less per student than Germany. There were other factors in the case of Germany.

An analysis was done on performance in PISA and on performance in the international adult literacy survey, IALS, which had been carried out in 1994 and attracted considerable attention. The adult study looked at reading literacy in adults aged from 16 to 65, taking five sub-groups: 16 to 24, 25 to 34, and so on. On one of the literacy scales, in prose, Ireland ranked 14th of the 22 countries. This was a joint OECD-Statistics Canada exercise, reported in 2000. Irish adults performed significantly less well than adults in ten countries, better than adults in five others and about the same as adults in six others. On that scale, one-fifth of the adults across the 16 to 65 sample achieved at the lowest level on the literacy scale.

PISA also included some test items that were used in the adult survey, making it possible to compare adults in the 16 to 65 group with 15 year olds. This result came out last year as it was not part of the original 2000 survey. Irish 15 year olds ranked fifth of the 27 OECD countries on the adult survey items, at about the same level as students in Australia and New Zealand but lower than students in Finland, Japan, Korea and Canada. As Japan and Korea had not participated in the adult literacy survey, Irish 15 year olds fared third best among the countries that had participated in both PISA and IALS. While 22.6% of adults across the broad age range had failed to score higher than level 1, only 8.6% of 15 year olds scored at that level, with about 3% below it - 11% in all. There were fewer Irish 15 year olds in PISA at level 1 on the IALS scale than in all other countries except Korea, Japan, Finland and Canada. IALS showed that adults aged between 45 and 65 had the worst literacy levels. That pattern is not seen in younger adults, such as the 16 to 24 and 25 to 34 age groups, who can be compared with the 15 year olds because the same items were tested.

Thank you both. Your presentation has helped to explain some of the intricacies of this topic. Are the methods and standards of testing the same across all the countries and are the tests administered and dealt with objectively? Can it be taken that the standards are comparable on the basis of the analysis?

Dr. Kellaghan

The same instruments are used and the same standards for sampling apply in all countries. The Netherlands was excluded because it did not meet these criteria. There is also a quality assurance mechanism: during the assessment, a sample of the schools is visited to check that procedures are being observed.

I welcome the delegation, whose submission makes for pleasant reading. It is good to see that these achievements have been recorded internationally. The separate measurement of mathematical literacy is significant. In the past three or four years, as we have converted to the euro, older people have struggled and may suffer from a high degree of mathematical illiteracy. Do the adult literacy figures measure literacy equally on the basis of reading and mathematical literacy, as that is an issue that is causing unease? I understand adult literacy is a matter to which we propose to return. Is there a cohort of people in the much higher age brackets with a greater degree of illiteracy? It appears from these figures, in the light of the PISA findings, that adult illiteracy may be squeezed out in a number of years.

In the presentation, the phrase "human capital" is used. It is not a phrase I like, but the group suggests better education and literacy lead to better economic performance. The report also refers to the correlation between higher national income and better education as though one causes the other and not the other way around. Is this a chicken and egg situation, does one cause the other or does one impact on the other?

The figure of 11% of 15 year old students who performed at level 1 was mentioned and while this is significantly ahead of the international average, does this mean that 11% of 15 year olds in this country are functionally illiterate? If that is the case, while it might compare well internationally, it is totally unacceptable and must be addressed.

I welcome the figures for reading literacy. However, the 11% at level 1 concerns me. The group is aware of the school dropout rate; 800 primary school children fail to complete schooling and 19% of secondary school children do not sit the leaving certificate examination. What percentage of that group is included in the 11% at level 1? Have most of that group left the system already? Presumably the figure was derived from surveys within the school system. Are there others, outside the system, who are not included in this statistic? If that is the case, it means more than 11% are functionally illiterate.

There are literacy studies from 20 years ago. Has the figure for 15 year olds at that time now been included in the adult rates? Up to 25% of Irish adults are at level 1 literacy, but there are another 20% who can perform only basic literacy tasks. Approximately 500,000 adults cannot even carry out simple tasks such as reading instructions on packaging. I spoke to a 33 year old man recently who had left school at 14 years of age. He informed me that he cannot read his daughter's story books to her. In such a scenario, where a child's parents are not in a position to do basic things such as that, what are the Department's plans to tackle this? If this man cannot read a story book, he is in no position to help his daughter with her homework. That child is automatically at a disadvantage compared to other children.

Performance in mathematics is another matter of concern if we are in 16th place. There is a gap here between boys and girls, but it is reversed in the other two cohorts. How does the Department intend to tackle this? The sciences would concern me as well because, as we are told, science is the way of the future. While we are ninth of 27 countries, we still have a problem with the number of qualified science teachers, particularly in physics and chemistry. Does the Department consider that the shortfall in numbers would have an impact on scientific proficiency? Students from wealthier backgrounds are more likely to do well in school. Can the Department comment on that? In Ireland, students in the bottom quarter of the index reflecting their parents' occupational status are scoring significantly lower in the tests on all three cohorts. While this is similar to the OECD average, we need to concern ourselves with what is happening here. I know there are programmes in place but this needs to be tackled. We must also ensure there are teachers who are qualified in these areas. Are other states in the OECD introducing science as a subject at an earlier stage? Has this had an impact on their figures?

With regard to infrastructure and equipment in schools, the report at page 192 states that the condition of the buildings does not have a significant impact on performance - we are talking here about 15 year olds. It also gives positive comments on the condition of secondary schools, though it does not refer to primary schools. However, the provision of equipment such as laboratory materials, computers and library materials is poor in Ireland. What plans are there to address this and to bring Ireland further up the OECD scale?

I welcome our two visitors. It has been very interesting listening to their presentation. Dr. Kellaghan stated that this report should be used as a lever for reform. Our education system is not that bad but there is plenty to learn from this report.

Dr. Kellaghan mentioned that all students are assessed in some countries whereas not all are in Ireland. In June 2002, the Minister for Education and Science, in answers to parliamentary questions on testing in schools, stated that he had asked the Department to consider the best method of extending support to schools for the purpose of age-related literacy tests. He said earlier in his reply that arrangements were being made so that all primary classes from first to sixth class would be given support for the purchase of standardised norm-referenced tests of literacy and that at post-primary level there were grants available of up to 75% of the cost. Have we made progress in this area? If we can do accurate testing at various stages in the school system, then we can identify where the problems are and how to improve. What take-up is there of the grants at post-primary level? If you do not have the information with you, perhaps you would make it available at a later stage.

It is to be expected that the socio-economic position of parents affects the performance of young people. What can be done about that and what can we learn from it? Will the Department be looking into this? We are told that there are more untrained teachers in schools in disadvantaged areas due to the difficulties of attracting teachers to such schools. Can this be addressed because it must have an effect on proficiency in reading, mathematics and science?

Will there be a reassessment of how teachers are trained in teaching mathematics and even of the curriculum at primary level? There is room for improvement there to ensure that we do better than we are doing at present. The adult literacy rate is a cause for concern where a fifth are performing at the lowest level, if I read the figures correctly. I would have the same experience as Deputy Enright. We have people coming to see us in clinics who literally cannot read the letters they receive from local authorities and cannot fill in forms. Those are basic skills that everyone should have. We must reach out to those people, for many of them will not attend adult literacy programmes, though there are some very good ones available all over the country. Many of them do not have the courage to go out. I know that the TV programme was a help, and perhaps I might ask about that too. Deputy Andrews raised a question about the 11% of 15 year olds who leave school. That is of concern to us all. I hope we will do as you suggested and learn from those figures. I also hope they will be a catalyst for examining how we do things. Once again, I am not saying that we do things all that badly. Considering per capita spending, we seem to do quite well. However, we obviously need to learn and particularly to focus on the young people who are being failed by the system for one reason or another, deciding how we can change, how we train our teachers, how we can have more of them, and perhaps how we can change the curriculum.

I spoke to some people last week who were trained outside Ireland and have been taken into the Irish system to try and improve the number of trained teachers. They told me that the Irish examination is very hard. Someone might look at that. In some cases, they were people without honours Irish in their leaving certificate or an Irish A level. Some of them had that standard but still could not pass the exam. I believe they are given a five year probation period and that they must pass the exam by its end. We might be excluding some very good trained teachers. Perhaps we should re-examine that, for at primary level it could be that the level of Irish that teachers need is not as high as the exam expects of them. That is probably a narrow point, but I wished to take the opportunity to raise it in some forum, and I feel that this one is appropriate.

I join previous speakers in welcoming you here today and thanking you very much for your presentation. Others have gone through specifics in the report but I have one or two general questions. While I acknowledge what we have achieved and the positives as reflected in the report, we must also look at the negatives. We must build on the former and improve on the latter. What is the process regarding this report? Is it simply noted and filed away? How does it move down to the various levels of the Department and the different teaching faculties and teacher training? We must be seen to take real action addressing the points, positive and negative, raised in the report. I would also like to comment on the point raised at pages 10 and 11, where the report states that

The OECD report notes that moderate expenditure per student cannot automatically be equated with poor student performance, pointing out that Irish students performed significantly better than German students, though Ireland spent a quarter less per student on education (up to age 15) than Germany.

On that point, if it is not financial, what are we doing right that has got us into that position so that we can build on it? The authors are making one point, and I wonder what we have that is good and that we can work on. Perhaps we should bottle it and sell it.

Welcome. Reading the report, people would say that, because of our ranking and so on, things are going very well. The media were fairly critical of the previous results from this group because of the ranking. Page 7 talks about that ranking. Owing to measurement error, the difference between a rank of three and a rank of eight or nine might not be statistically significant. Are you saying that, while we are doing well, we could be further down the ranking?

Dr. Kellaghan

I will show you how to read the tables at the back.

This goes back to the literacy problems of older adults. Females scored better, and that seems to be the trend everywhere, particularly in Ireland. It is down to age. People say that it is hormonal with young male adults and so on. I take it that there has been no study into that in other countries.

The main focus last time was on reading and literacy, and now they are talking about mathematics and science too. People working in the field say that the difference in someone being able to read is like someone dreaming in black and white or colour. Being able to read affects one's aspirations. Everything changes through it. One can draw both positive and negative lessons from the report. Is it down to the statistics? Can we adapt them whatever way we want? You talk about their being the indicator to bring about change.

A recent report about poverty cited Ireland as one of the worst places outside the US. When one talks to Ministers, they say that the gap between rich and poor has not grown. However, those figures suggest that it has. We may have more millionaires, but we certainly have our fair share of people living in poverty. It makes sense that people not having to worry about bread on the table or going to school hungry achieve more in their schooling. If Ministers deny that there is a gap between rich and poor, will they accept that those figures must be a lever to improve education as suggested?

The last thing is simple. The document talks throughout about "Ireland". I believe that we are talking only about the Twenty-six Counties, or is it done on an all-Ireland basis?

I too welcome the two gentlemen. It has been a very valuable exchange of information from both sides of the room. I welcome the report generally. We must note, in an up-to-date report such as this that we are in a very different society, in Ireland and Europe, because of the impact of technology. There is a different focus of information for young people in schools today. We were very much book centred in the past, and technology has taken over to a great extent. The basic literacy and numeracy skills - reading, writing and arithmetic - must continue to be developed. With the sophistication of technology, I believe that in many ways they could be threatened, but according to this report we are still producing people from a system where education is doing a good job.

There are certainly areas where we must concentrate and improve, but technology could have a negative effect in some areas. We are all familiar with the text messaging on mobile phones that young people engage in so skilfully - and so much better than some of us. It is certainly a whole new mode of spelling and communication and could overtake the traditional learning of spelling and the English language in general. I understand, as Deputy O'Sullivan mentioned, that in some countries all the schools were assessed and that in others that was not the case. In those where schools were selected for assessment, what selection process was undertaken? What were the profiles of the schools? Were they urban or rural, outside the large centres? I understand some members had occasion to visit Finland recently. What is Finland doing to achieve such unique success rates?

The results and reports we have received cover a wide span of learning. While there have rightly been great strides in the development of scientific and mathematical skills, the education system should also develop good communication skills. Many school leavers achieve high grades at leaving certificate and university level, yet will be glad to get training in interviewing and presentation skills. They often pay large sums of money to attend courses in this area. Greater efforts should be made to have this area covered by the education system.

I thank the delegation for their presentation. For Ireland to have reached sixth place out of 41 is a fair ranking. However, the figures can be interpreted in any way one likes and some of them indicate cause for concern. While we rank well in terms of literacy, the sample was taken from a cohort of 15 year olds who are in education. I take it, therefore, that those outside the education system were not included, so the sample is not a true reflection of literacy levels in that age group as it does not cover those outside education. In view of this, are the figures skewed in terms of the bigger picture?

The ranking of sixth out of 41, while a good score, is only a temporary achievement because on the next occasion the weighting will be changed. The major domain for the 2000 survey was reading, while the minor domains were mathematics and science. We were somewhat fortunate that our good subject was weighted highest while the two weaker ones were weighted lower. If the criteria to be used for 2003 were applied to the existing figures, things might not look so good.

Given the economy's dependence on technology, why are we not achieving results in science and mathematics comparable to those in literacy? It is not fair to say it rests with the ability of the students, as it would be hard to justify why they are good in one area but not another. Is the low scoring in science a reflection of participation rates in science? Is the score for mathematics related to the perceived difficulty among students of higher level mathematics at leaving certificate level in terms of obtaining points for access to third level education? Given this, students may opt for subjects other than mathematics and science and, if so, it may skew our position in the rankings. If this is the case, how do we address the problem?

I welcome the delegation. In some countries, but not in Ireland, all students were tested. How did the sampling take place here? The Minister recently indicated that Ireland has one of the highest levels of special needs provision in Europe, with 11% of students receiving special needs help in schools compared with 3% in other countries. Were such students and those in special schools included in the sample?

How do you see reform taking place as a result of this survey? Mention was made of Finland and a number of committee members recently visited the country. We asked about their higher literacy ratings and were told that a major factor for them was that while most of their television programmes are in English, they are subtitled in Finnish. They are not dubbed as they are elsewhere in Europe. It means that children must read the subtitles to understand what is happening while learning English at the same time.

Many children in Ireland begin their formal education in school at the age of four, four and a half or five years. Many commentators have expressed concern that at that age they are not ready to learn formally, although they can learn through play and so on. In Finland, formal education starts at the age of seven years, but children engage in pre-school and kindergarten while other European countries have good kindergarten provision. Does that have an impact, especially with regard to mathematics and literacy levels among boys?

What is Hong Kong doing to score so well in mathematics? Perhaps we should visit to find out. I am concerned that a number of countries - our competitors - scored significantly better in science. How much better are they and what should be done here to address the problem? Is the lack of science teaching at primary level a problem and should it be encouraged?

What are the reasons for the differences between males and females, especially regarding literacy levels, even allowing for the mixed and single sex schools? The association between socio-economic status and performance is not as strong in Ireland as in other countries, such as the United Kingdom and Germany. Why is that the case? Is it because Ireland does not have such class differences?

Were cultural differences taken into account? We are moving to a multi-cultural society and other countries, such as the UK and Germany, have multi-cultural populations. You mentioned in your introduction that other factors were involved in Germany, especially with regard to expenditure on education. Perhaps you would expand on these in explaining why Germany is not as good as it should be.

It is curious that 15 year olds are performing better than adults in terms of literacy levels. Have any studies been done on the impact of the growth of television, video and other forms of entertainment on the provision of public libraries and books in the home? Do we have a responsibility to try and help some of the non-OECD countries which are not performing as well as their OECD counterparts? What are we doing in this regard? Perhaps we should help them, even from a selfish point of view. It was said at one stage that we should try to improve the economies of other countries so that we have more markets to which to sell our products. Perhaps that could be done where countries are underdeveloped.

I welcome the presentation. Most points have been raised but there are some to which I wish to refer, namely, the survey of schools and the teaching system. How do we compare with other countries in terms of the pupil-teacher ratio? I thought that might have been spelled out. Perhaps it is in the small print at the back of the documentation. I cannot see it.

The age bracket of seven to 15 is quite different from what I am familiar with, which is four or four and a half up to 12 for primary schools. How do attendance levels in our schools compare with those in other countries? I welcome the report. We must learn from it and tackle the problems highlighted in it. I compliment both witnesses on their presentation.

If during responses something strikes members as being relevant, they should feel free to intervene.

On the issue of the difference between boys and girls in literacy and science, does the fact that nine out of ten primary school teachers are female have a negative impact on boys? Has this been investigated? It may have nothing to do with it.

Dr. Ó Dálaigh

I do not know if we can answer all the questions in the timescale, but we will do our best. Dr. Kellaghan is looking through the documentation to answer one or two items and I will take some of the others.

Another feature I discovered about Finland in addition to what was said about the subtitling of television programmes is that they have a history of parents reading to children at an early age in their homes on a Sunday afternoon in accordance with religious beliefs, etc. This is something that has been paramount and was put to me on a visit two years ago as a contributory factor. Early reading is essential.

Regarding the difference with Germany, the socio-economic factor does not bear as strongly on performance in Ireland as it does in Germany and other countries, and one feature in particular accounts for this. Our second level education programme is much more homogeneous and less selective. In Germany there are three different types of schooling and it is very difficult to move from one to another. The results in the lowest tier of schools in Germany, to which children are transferred at the age of 11, are very poor. This has caused a huge problem and a national debate in Germany on the education system.

There are variations between Länder - states - in Germany. Bavaria is the best and would approximate to countries that rank more highly. However, other Länder, especially in northern Germany, fared very badly. They are critically examining their education system as a result. It was a huge shock to them. There is a great deal of difficulty in moving or making changes in the system.

Following the process, we had a national dissemination seminar in which we discussed the results with the different partners in education, such as teachers, teacher bodies, management, etc., and we produced our own national report, which was referred to earlier and of which Dr. Kellaghan has copies to give to committee members. It is our own national survey and we are considering sending a condensed 40 page version of it to schools and teachers, especially literacy teachers. There are messages in it. It will be available in a few weeks and we will give copies of it to committee members. We have been engaged substantially in the process. The message is that there is a need for a focus in each school and for individual programmes in second level schools targeted to meet the needs of students with serious learning difficulties.

Regarding Deputy Enright's question about 15 year olds with a reading difficulty, the report states that the reading and literacy levels were much lower in that group of 15 year olds who had not left school but who had indicated in our national survey that they would leave school. That calls for more focused intervention in the second level system to target these people to enable them to stay. There is no doubt that the leaving certificate applied has increased retention. The evidence suggests that the tendency has been to stay a little longer. It has been difficult to increase the 82% figure. Reading and literacy levels of 15 year olds who intended to leave school, which would be about 14%, 4% having left already, were much lower, and that calls for earlier intervention.

It should be remembered that this survey is not based on our actual syllabus. It must be neutral and outside the syllabus to some extent, if one considers how a survey covering the syllabus in both Ireland and Japan or Korea might be framed. That has positive and negative points. One of the reasons our literacy results were better is that the framework of the tests was broadly comparable with our recently revised junior certificate examination in English. We ask questions in it about video and other forms of communication beyond the written word.

We have never scored highly in other tests in mathematics, so our results are broadly similar to those in other tests conducted internationally in the early 1990s. It could be argued that some of the items were not on the junior certificate syllabus, which is true. Half the items were on measurement and geometry, which seem to be a problem for Irish people in general, as shown in earlier tests.

On the other hand, we look forward to the results that will emerge from the 2003 examinations in mathematics because we have a new syllabus in the subject at junior certificate level which is being examined for the first time. We have engaged in a significant in-service programme over the past three years to assist in the teaching of mathematics.

There is a difference between what some call "real world" mathematics and what others call theoretical mathematics. The Australians and the Dutch have been the leaders in "real world" mathematics which is more problem solving and solution driven. We have had a tendency, as have other countries - we are not unique - to give ten different examples of the same procedure which means if a person is asked to perform it but in a different format, he or she is stumped. We must get around that and that is what we have been trying to do in the teaching of mathematics. In answer to Deputy Curran, I do not know what the outcome will be this time, but we have the new junior certificate mathematics and more in service training for teachers. We did not allow calculators to be used in the junior certificate at one stage, but they can be now. Some students used them and the results were slightly better for them.

Turning to science, which is close to my heart as a scientist, 10% of students do not do science, which obviously has an effect. Other countries do not have compulsory science in the junior cycle. We are looking at this issue. In some areas it is felt we should have compulsory science while in others it is felt we should be attracting people de facto and not de jure to science. We are introducing the new junior science syllabus in September and there will be considerable in-service support for that. It will be interesting to see what the results will be in 2006. We have introduced quite a lot of science and technology in society and the appreciation of science into our leaving certificate syllabi, which is significant for chemistry and physics. It is not all doom and gloom, with the percentages of those taking chemistry and physics increasing slightly; they are not decreasing. The preliminary figures for this year will show a further slight increase in the numbers taking physics and chemistry. That is a percentage, so the numbers may go down slightly in absolute terms as the numbers taking the leaving certificate are going down. We are putting in place a programme of resources to equip schools better for the new junior certificate science syllabus, but that does not figure largely in this report.

In relation to testing and so on, to which Deputy O'Sullivan referred, there has been some movement in that area in the last year. I am not sure of the exact nature of that work but it is in the area of developing standardised tests which can be used in sixth class; perhaps Tom could talk about that. We have been looking at various areas and trying to suggest that in school development planning and the school completion programme, as well as different schemes for the disadvantaged, we should be putting greater emphasis not on the structures but on the quality of teaching in the classrooms. That is where greater emphasis is being put at present.

We have been looking at a project called reading recovery, which is being trialled among pupils of five and six years, in their first year in primary school. It was trialled first in Monaghan and is now being trialled in 15 schools in Dublin. It is probably too early to be definitive about it, but it is a very sustained one-on-one teaching programme for children who have very low reading levels when they start. It seems to raise the reading level of struggling pupils to the average level in a short period and it enables pupils with special needs and learning difficulties to make accelerated progress in learning to read, obviating the need for support and remedial programmes. Generally, as teachers have become more used to it the practice has rubbed off on other teachers, so there is a backwash effect and teachers have developed different approaches to the quality of classroom teaching. However, it is at an early stage and the second trial in the Dublin area has not yet been fully evaluated.

As for sampling, pupils with special needs were specifically excluded from this sample. That was a bone of contention among countries, but they were excluded, which would have amounted to perhaps 2% to 3% of the possible sample. They were identified by school authorities and excluded on that basis.

Dr. Kellaghan

Thank you for the interesting and wide ranging comments. I will probably be incoherent but I will address some points.

Generally speaking, this is not scripture; it is not the gospel. We should not start doing things on the basis of one report. Measuring achievement is not like measuring the length or weight of something. There is error involved and one gets out what one puts in, in some ways. Choices have to be made as to which areas to measure and, as Dr. Ó Dálaigh was saying, there are some aspects of this, particularly mathematics, which would have been less compatible with our curricula in schools than, say, English reading. The 2003 administration has taken place in mathematics, so it will be interesting to see with the full range of mathematics assessment where we stand.

International studies done in the early 1990s produced somewhat different figures. In 1991, 14 year olds did not do well in reading literacy in an IEA study, while in 1995 they did much better in mathematics in another study. One obviously considers these reports carefully and interprets them, but they are not gospel and have to be looked at in that context.

There is quite a range in teacher-pupil ratios. The lowest seems to be in Denmark and Liechtenstein, at 17 to 1, while Ireland is 24 to 1 and Hong Kong is 38 to 1. What does that say about achievement? In Hong Kong there is another issue, as in Finland, in that I am told the language is easy to read because its written form closely matches its sounds; it is not like English, which is very difficult because one has to match sounds to written forms. I spent some time in Hong Kong and the students there are very focused and disciplined. There is enormous pressure on them in an exam-dominated system but they are highly motivated. They are disciplined, work very hard and that probably has something to do with it. It is not the size of class, as they have a ratio of 38 to 1.

Deputy Andrews asked whether the 11% of students at the lowest level could be viewed as functionally illiterate. No, I do not believe they are functionally illiterate; they have basic literacy skills. It was the same with the international survey of adult literacy, with newspapers and the media taking those at level 1 as being illiterate and saying time after time that one quarter of the population was illiterate. If one was illiterate one did not get into the survey. One had to answer questions to show one could read. The survey was not about illiteracy but about levels of literacy. Obviously we should not be happy about people at level 1, as they would have some problems, but they have basic skills.

Deputy O'Sullivan referred to a parliamentary question. I am not aware of the question but I suspect it had something to do with a proposal that all students would be tested and information on their performance would be supplied to the Department. I gather that did not happen due mainly to teacher opposition.

Deputies Curran, Stanton and Crowe referred to sampling. One must first define the population. At that stage there are exclusions, such as special needs students. This does not vary much from country to country, but a percentage would have been excluded at that stage. Once one has defined the population, a probability sample is taken. Each pupil in the country has the same probability of selection into the sample. It is a two-stage sample. Schools are selected with probability proportional to their size and, within schools, a class is chosen at random. It is very important to retain this idea of randomness and probability, otherwise schools might be under-represented or over-represented. If the design produces an over-representation or under-representation of some group - which one might deliberately want in some areas - that is weighted back when one gets the results. I have no problem with the sampling being representative of the population as defined.
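The two-stage design just described can be sketched in a few lines of Python. The school names and enrolments below are invented for illustration, and `random.choices` is used as a stand-in for the systematic probability-proportional-to-size draw a real survey frame would use:

```python
import random

# Hypothetical school frame: (name, enrolment). A real frame would
# list every eligible school in the country.
schools = [("School A", 800), ("School B", 400),
           ("School C", 200), ("School D", 100)]

def pps_sample_schools(schools, n, seed=0):
    """Stage 1: draw n schools with probability proportional to size."""
    rng = random.Random(seed)
    names = [name for name, _ in schools]
    sizes = [size for _, size in schools]
    # Weighted draw: a school with twice the enrolment is twice as
    # likely to be selected. (Real designs sample without replacement.)
    return rng.choices(names, weights=sizes, k=n)

def sample_class(classes, seed=0):
    """Stage 2: within a selected school, pick one intact class at random."""
    rng = random.Random(seed)
    return rng.choice(classes)

selected = pps_sample_schools(schools, n=2)
print(selected)  # larger schools appear more often across repeated draws
```

Selecting schools by size and then a whole class at random is what gives every pupil roughly the same chance of inclusion, which is the property the witness emphasises.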

In terms of error of measurement, if one were to take a different sample a week later one would not get exactly the same result. From a single administration it is possible to calculate the range one would expect. Depending on the distribution in the sample, we can estimate the probable true score.
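The point that repeated samples scatter around the true score can be illustrated with a small simulation. The scores below are invented, not PISA data; the spread of the sample means is the standard error from which the expected range is calculated:

```python
import random
import statistics

rng = random.Random(42)
# Invented "population" of 10,000 scores - purely illustrative.
population = [rng.gauss(500, 100) for _ in range(10_000)]

# Draw many independent samples of 400 pupils and record each mean.
sample_means = [statistics.mean(rng.sample(population, 400))
                for _ in range(200)]

# The spread of those means is the standard error: a sample taken a
# week later would typically differ by about this much.
mean = statistics.mean(sample_means)
se = statistics.stdev(sample_means)
print(f"mean of sample means: {mean:.1f}, standard error: {se:.1f}")
# A rough range for the true score is mean plus or minus 2 * se.
```

This is why a single administration can still report a plausible range for the population score rather than a bare point estimate.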

The audio-visual display shows a significant variation for Israel, at 9.3%. Why is that?

Dr. Kellaghan

That means there is a huge variation in the scores in Israel, which is unusual.

Deputy Stanton asked about cultural differences. The answer is, "No". However, some data were collected on whether students were born in the country. The only explanation for Luxembourg being so far down is that a large immigrant population took the test.

Dr. Ó Dálaigh

The Turkish population in Germany scored very badly. This is because the test was administered in German rather than in Turkish.

Dr. Kellaghan

We appear to rank fairly low in regard to expenditure on education. The OECD uses expenditure as a percentage of GDP, and we tend to do badly on estimates based on GDP. In most countries the difference between GDP and GNP is not great. It is larger in Ireland than in any other country because of the large multinational presence here, and a lot of the money goes out of the country. We would probably come out better if expenditure was expressed as a percentage of GNP rather than GDP. The most recent figures put GNP at approximately 87% of GDP.
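The GDP/GNP point is simple arithmetic: the same spending expressed as a share of a smaller denominator comes out larger. In this sketch the 4.5% spending figure is invented for illustration and is not an official statistic; only the 87% ratio comes from the discussion above:

```python
# Illustrative only: suppose education spending were 4.5% of GDP,
# and GNP is roughly 87% of GDP, as mentioned above.
spend_pct_gdp = 4.5
gnp_over_gdp = 0.87

# The same spending as a share of GNP is larger by a factor of 1/0.87.
spend_pct_gnp = spend_pct_gdp / gnp_over_gdp
print(f"{spend_pct_gnp:.2f}% of GNP")  # about 5.17% of GNP
```

Any GDP-based figure for Ireland is therefore understated by roughly 15% relative to a GNP-based comparison, which is the witness's point about international league tables.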

It is a question of how one spends the money. For better or worse, we spend most of our money on teachers' salaries. This may mean we are getting better teachers into the system. It might be more important to have very good teachers than a lot of equipment, charts and so on, which we do not have in schools. Teachers' salaries here are the highest among OECD countries. It is a question of how we spend the money in terms of how it might affect achievement.

Is everyone happy? I thank the delegation for a very interesting session. It is wonderful to delve into these international matters and the mysteries of the systems in operation.

Dr. Kellaghan

I have a report of the Irish study which I will leave for members. It is not the full report, just a summary.

The joint committee adjourned at 1 p.m. until 11 a.m. on Thursday, 17 July 2003.