
COMMITTEE FOR EDUCATION

OFFICIAL REPORT
(Hansard)

Inquiry into Successful Post-primary Schools Serving Disadvantaged Communities: 2009 PISA Results

2 March 2011
Members present for all or part of the proceedings:
Mr Mervyn Storey (Chairperson)
Mr David Hilditch (Deputy Chairperson)
Mr Jonathan Craig
Mr Trevor Lunn
Mr Basil McCrea
Miss Michelle McIlveen
Mr John O’Dowd
Mrs Michelle O’Neill

Witnesses:
Mrs Katrina Godfrey )  
Dr Chris Hughes ) Department of Education
Ms Karen McCullough )  
The Chairperson of the Committee for Education (Mr Storey):

I welcome to the Committee Katrina Godfrey, director of curriculum, qualifications and standards; Karen McCullough, head of the statistics and research team; and Chris Hughes, head of the standards and improvement team. Members have been given a copy of the relevant papers.

Mrs Katrina Godfrey (Department of Education):

Thank you, Chairperson. You have completed my first task and introduced my colleagues for me. You know that this is a game of two halves, and that I will be joined by Roger McCune for the next session on examination grading.

The Chairperson:

I hope that it is better than the second half of a certain football match last night.

Mrs Godfrey:

I am not a United fan, so I will not comment.

The Chairperson:

Which team do you support?

Mrs Godfrey:

I will say nothing, except that success is not something that has been known to me for some time. [Laughter.]

Would it be helpful to pick up on some of the points on value-added measures?

The Chairperson:

Yes, please.

Mrs Godfrey:

It is an issue that we have been looking at. You may recall that value added was a particular feature of the new assessment arrangements that we came to talk to you about. Those assessment arrangements were being introduced in support of the cross-curricular skills of communication, using mathematics and using ICT, because we were conscious that, if the Key Stage 1 assessments were robust, teacher-moderated and applied with confidence and consistency, they would provide one element of a baseline from which value added could be measured.

Additionally, we took the view that, rather than simply setting targets of expected levels for a percentage of young people to reach at the end of every key stage, we would set a target that every pupil be expected to progress at least one level, except in cases where there were good reasons. We thought that that would be a useful way of capturing those pupils who leave primary school with a very high level of attainment. Why would we not set them a stretching target for the first year of post-primary education? It would also be a good way of capturing young people who achieve level 1 at Key Stage 1 and level 3 at Key Stage 2 and who are, therefore, not at the expected level at either key stage but are making a significant amount of progress that we think should be captured.
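
A minimal sketch, assuming invented pupil records and field names, of how the "at least one level of progress" expectation described above might be checked in practice:

```python
# Hypothetical illustration only: names, fields and levels are invented.
pupils = [
    {"name": "A", "ks1_level": 1, "ks2_level": 3},  # below the expected level at both stages, yet progressing
    {"name": "B", "ks1_level": 3, "ks2_level": 3},  # high baseline, no measured progress
]

for p in pupils:
    if p["ks2_level"] - p["ks1_level"] >= 1:
        print(p["name"], "met the progress expectation")
    else:
        print(p["name"], "flagged for review (there may be good reasons)")
```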

Along with colleagues, a year or two ago I was involved in an Organisation for Economic Co-operation and Development (OECD) working group. The interesting thing there was that nobody had cracked contextual value-added measures in the way in which most of us, as parents, would instinctively understand them. The point that was made about getting the baseline right and measuring progress from the baseline is key, and one that is a feature of the new assessment arrangements.

With regard to the commercial tools, the Department’s key focus, beyond the requirement of key stage assessments and the use of InCAS in primary schools as a diagnostic assessment, is on schools deciding what is right for them. The key factor for us is how they use the data that they have, not that they are required to acquire and gather particular data. In that context, schools will tell you some interesting things.

Not long ago, I was at a presentation at which a school’s head of maths was taking a group of other schools through their approach to value added and the effective use of data. Being a head of maths and good at such things, she had tracked the correlation between the commercial test that they had been using and the outcomes of the pupils. She found that there was a very weak correlation. She then tracked the correlation using the Key Stage 3 tests that CCEA still provide in maths, English and science, and she found a stronger correlation there than she did with the commercial tool. Subsequently, she took it a bit further and looked at the teachers’ internal tests in years 8, 9 and 10, and, perhaps unsurprisingly, the strongest correlation was between the performance of pupils in those internal tests and the GCSE outcomes. That suggested that the teachers had a pretty good grasp of what they were doing in that school.
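
A minimal sketch of the kind of comparison the head of maths described, assuming invented scores; the Pearson correlation is computed with the standard library's statistics.correlation (available from Python 3.10):

```python
from statistics import correlation

# Invented data: GCSE points for eight pupils and three candidate predictors.
gcse_points = [58, 46, 71, 52, 64, 39, 75, 60]

predictors = {
    "commercial test": [55, 60, 62, 48, 50, 58, 66, 53],
    "KS3 test":        [54, 44, 69, 55, 60, 42, 70, 57],
    "internal test":   [57, 45, 70, 53, 63, 40, 74, 59],
}

# The predictor with the highest r tracks GCSE outcomes most closely.
for name, scores in predictors.items():
    print(f"{name}: r = {correlation(scores, gcse_points):.2f}")
```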

She also made the point that it is important not to let those predictors become a self-fulfilling prophecy. We are always conscious that that is important. If a child is performing at a low level, for instance, you could easily get into a mindset whereby you would say that he or she is predicted an x and that you would be glad if he or she gets an x. However, the child could be capable of much more.

One of the other issues that schools feed back to us is the real importance of making sure that any predictions from those sorts of tools are looked at and reviewed constantly, rather than being benchmarked and the capacity of the pupil being forgotten about.

Those are some of the issues. It is for that reason that we do not dictate to schools what they should and should not use, beyond the statutory requirements. The Department and our colleagues in the inspectorate say that it is about how the schools use the data that they have. The most powerful data is usually that which is gained from the teacher’s assessment, testing and monitoring of the pupil’s performance.

The Chairperson:

Are we any the wiser after that? I mean no disrespect to Katrina for the way in which she presented it. I am asking whether members are any the wiser.

Mr B McCrea:

The answer is no. That is more to do with me, Katrina, than with you.

The Chairperson:

Is the bogeyman in the room not the continual reluctance to use a standardised assessment, just in case those bad schools might use it as a means of selection?

Mrs Godfrey:

I do not think so. For example, a large number of primary schools make use of commercial NFER tests, such as Progress in Maths. The key thing is how they make use of them. Do they just administer them, or do they use them to identify pupils’ strengths and weaknesses and to plan their teaching and learning around those strengths and weaknesses? That is the key thing — not the test itself, but what people do with it.

Mr B McCrea:

Is that not the key issue? People want to use those tests for educational purposes and, therefore, you do not want to disturb them by teaching to a test. We, with our public oversight, have something different that we would like to do, which is to find out how we get the most effective teaching. Certain sectors are not performing well. I have seen the scattergrams that show that 30% of the pupils in certain schools have free school meals and, therefore, one should expect the same outcome in each school, yet it is not the same. In order to inform public decision-making, can we not find some way of getting baseline figures that show what is going on? It is not for teaching; it is to inform public policy.

Mrs Godfrey:

That is why we have taken the approach that we adopted with regard to the assessment arrangements that will come in next year. We have accepted that the central responsibility of any school is assessment for learning, and assessment of learning. We have said that we want to give schools the maximum possible flexibility and autonomy, but we have also said that, in the case of literacy, numeracy and ICT, the system needs reassurance and the public and the taxpayer need reassurance that the system is performing at an acceptable level. Therefore, the arrangements to be put in place for literacy, numeracy and ICT will be of a slightly different order. They will be consistent; they will be robust; they will be moderated; they will be reported on; and they will provide us with a more robust baseline from which to measure progress at system level for us, at school level and at pupil level.

Mr B McCrea:

Will they be contextual value-added measures?

Mrs Godfrey:

They will be value added in the sense that they will give the baseline for each pupil and they will allow for measuring the pupil’s progress.

Mr B McCrea:

The point is that we were looking for a contextual value-added measurement. However, the point was made that if you have people coming from different backgrounds and with different support at home, you would expect different outcomes that have little to do with teaching in school. We need some baseline figures that take those things into account.

Mrs Godfrey:

The baseline at Key Stage 1 will give us a sense of where pupils are, whether at individual level, at school level, at managing-authority level or at system level. However, you also must be careful to ensure that you are not creating self-fulfilling prophecies by assuming that children cannot achieve at the expected level just because English is not their first language or because they come from a disadvantaged background. The scattergrams are interesting, but they do not actually provide any answers. They simply pose questions as to why a certain school is there; the make-up of its pupils; whether they are achieving to their full potential; and whether everything that can be done is being done to make sure that they have the requisite quality of teaching and learning support and, therefore, attain at the highest level possible.

Mr B McCrea:

That gets to the nub of it. Surely the Department should have some tool that is able to drill into the value-added issue. Without wishing to upset anybody, the point was made that you could have a school where 30% of pupils have free school meals and come from an area where there is endemic and systemic underachievement. However, there could be another school in a different place, where, regrettably, a factory has closed down and, suddenly, a lot of children are entitled to free school meals. Those children are not in the same learning environment as other children.

There is a suspicion that, in the particular case of working-class children from a Protestant background, there are cultural and other factors that we are not able to pick up on and address. I am not sure that to simply take a baseline and conclude that such children have improved, but not by much, is where we are at. We need to know why they did not improve, or what other factors pertained. We keep having studies to get to the root of the problem, but I do not know that we have the information to help us do that.

Mrs Godfrey:

I would be less pessimistic than that about the new assessment arrangements that will be introduced. We also know, from those schools that perform above the levels that some may expect, that it is to do with the quality of teaching and learning, as well as leadership. Importantly, one of the differences that arises time and again is that someone, perhaps in the absence of anyone else, decides to have high aspirations and expectations for pupils. Members will have seen examples of that during your inquiry. That is the aspect that we cannot measure in our scatter plots, but if you read the inspection reports and talk to schools, that is the ingredient, as well as high-quality leadership and good teaching, that you will see time and again. That is where the real challenge is.

Mr Lunn:

I am interested that, at the end of all this, Katrina, you tell us that the teachers’ own in-class assessments are more valuable than those expensive tools, which, apparently, come out at around £500 a pupil. I think that that was what Basil was getting at. The only one of those tools that we had explained to us here, which introduced the concept of contextual issues outside of the school’s control, was rubbished by the Government and phased out. I would have thought that that is the one that we were after. Do any of the other widely used tools find any favour with you, or do you think that it should be left within the classroom?

Mrs Godfrey:

No. They all have uses, and if teachers find them useful, we would say that that is fine. However, a teacher’s professional judgement is often as useful as, if not more useful than, a commercial tool that will not have been designed with any one particular school or set of children in mind, but against a statistical norm. The tools can be very useful in supporting, confirming or affirming a judgement; schools make very good use of them in many cases. Nevertheless, even those schools will tell you that the tools will not be a substitute, nor would they be used as a replacement, for a teacher’s professional judgement.

Your researcher picked up on the point that the schools White Paper in England signals a move away from some of the contextual value-added measures. If anything, that points up the difficulties, and, as I said earlier, in talking about this with the Canadians and the folk from New Zealand and Finland at the OECD, it was clear to me that every country is struggling to find a perfect contextual value-added measure. Karen may want to say something about the research perspective.

Mrs Karen McCullough (Department of Education):

I know that the OECD has published a report on various value-added measures. One of my colleagues is looking at that and is drafting something up to have a look at what kind of conclusions are being reached across OECD countries. That may be helpful in informing the debate.

Mr Lunn:

Could it be that the contextual value-added measures are being phased out because, as far as I can see, they provide a league table of schools, which, perhaps, the Government would not be too keen on?

Mrs McCullough:

They can be accessed through the Westminster Department for Education or BBC websites, but the way in which they are presented makes them difficult to read and understand. They are not being used and understood by people; there are just lists of figures that are quite hard to follow. That is one of the problems, because the public are having difficulty in interpreting them.

Mr Lunn:

The Yellis tool purports to provide a predictor of performance at GCSE level, yet it is not related to the curriculum. I just find that odd.

Mrs Godfrey:

I do not have the material that the Committee has.

Mr Lunn:

That is exactly what it says: it is not related to the curriculum, but it provides a prediction of performance at GCSE. Am I missing something?

The Chairperson:

We will make the research paper available to the witnesses.

Mrs Godfrey:

That would be useful, and if there are any points that you want us to cover from a departmental perspective, we can do that.

The Chairperson:

I am trying to get some sense from this. I get the sense from everything that is going on that there is very diverse provision, which is left to the professional judgement of the teachers. As a result, we end up with a patchwork quilt on the very important question of how we assess and contextualise value added.

Trevor, were you finished?

Mr Lunn:

I am finished. That was what I was trying to get to, but you said it beautifully.

The Chairperson:

Thank you very much. That has set me up for the day.

Mr B McCrea:

Another electoral pact.

Mr Craig:

The Chairperson hit the nail on the head. Katrina, one of the things that became abundantly clear when I looked at the tools was that they are a mixed bag. I can understand where Basil was coming from earlier, and, to be quite honest, you would need a master’s degree to understand what the tools actually tell you.

Parents make little or no use of the tools. They look like something that came out of a shotgun — they are ridiculous. However, when you get in there and analyse it, there is some fascinating information that schools and teachers can use to adjust their learning techniques with individuals. I have seen that used to good effect, and I know what you mean when you say that teachers have their own ideas about why students are or are not performing. However, I have also seen the tools highlight instances where teachers got it wrong.

I do not know how you find the amazing balance between what the tools and the teachers tell you, but there is a contradiction at times. The tools are very complex and they almost give you too much information. Perhaps simpler tools are needed that will provide the same information.

Mrs Godfrey:

That is very much the idea behind trying to get the final assessment arrangements in place, so that they are common to all schools and provide a consistent baseline. The tools will never remove either the need for, or the wish by, schools to do something else to test their own judgement.

Teachers often tell me that, sometimes, it is the developed ability part of the tool that flags up the most useful feedback. It may show a pupil who is underperforming in either reading or maths, but whose developed ability is very high. That gives teachers a clue that that performance may be due to the way in which the child is learning or being taught, and that something is blocking them from achieving their full potential. I expect that that is the reason why different schools use different tools, because they find the one that works best for them. In the vast majority of cases, they find tools that are designed to support rather than replace their own judgement, and that helps them to find out whether they missed something or whether there is something that they need to home in on.

It was telling that, during the presentation that I attended, it was pointed out that the school’s own judgement, as measured through its class tests, was the most accurate tool, and that that had given the school a huge amount of confidence in its own professional judgement. That seems to be where you would want to be.

Mr Craig:

It is. The only thing that worried me about that approach was that, no matter where I saw it used, it highlighted instances, and perhaps even individuals, where the judgement was clearly wrong. I found it extremely useful —

Mrs Godfrey:

— to have that objective overlay.

Mr Craig:

Yes.

The Chairperson:

We will take a follow-up question from Trevor, and then move on to PISA.

Mr Lunn:

I just want to make a suggestion. Jonathan has obviously seen some of these things in action, and I have not. Would it be possible for the Committee to see a sample of a Yellis report to see whether we could make any sense of it?

Mr Craig:

Best of luck to you, Trevor.

The Chairperson:

I think that we may have one in the office. We will make that available.

Mr Craig:

It took the headmaster half an hour to explain it to me.

The Chairperson:

OK. We will not go there. Will the Key Stage 2 assessment be moderated?

Mrs Godfrey:

Yes. The decision is that Key Stages 1, 2 and 3 will be moderated teacher assessment.

The Chairperson:

Thank you. We will move on to PISA — another assessment.

Mrs Godfrey:

Yes; I am conscious that time is marching on. You have a paper that tells you what PISA is. We consider PISA, as do many other jurisdictions, including England, Scotland, Wales and the South, to provide a useful international benchmark of our 15-year-olds’ skills in reading, maths and science.

We consider it wholly appropriate to look at the performance of our system compared with others, particularly because, as part of our and my Minister’s involvement in developing the Executive’s new economic strategy, we are conscious that young people leaving our school system here will be going out into a workplace that is increasingly operating on a global basis. Therefore, it is right that we would want to benchmark ourselves against the best in the world and not look at ourselves in the context of just these islands. That is why we attach particular importance to PISA as an international benchmark, and, as I say, so do others. Karen, who is the head of our statistics and research team, and our lead guru on PISA, will talk you through some of the key findings.

Mrs McCullough:

The most frequently quoted analysis in the reporting of PISA is the average score achieved by students in a country. Those are ranked to produce tables of countries’ performances and categorised as being above the OECD average, at the mean, or below the OECD average. That opportunity for real international comparison is the strength of PISA. In 2009, the assessments were administered in 65 education systems, including all 34 OECD member states and 24 members of the EU. So, how did our students do? Overall, we were placed among the average performing countries in reading and maths, and above average in science.

In each round of the PISA survey, students are assessed in those three areas of literacy: reading, maths and science. Each round has a main focus. The focus in 2009 was reading, so I will start with that, say where we were, and highlight factors that affected the scores. Students here achieved a reading score of 499, which was not significantly different from the OECD average. Scotland, Ireland and England were similar to Northern Ireland. Wales, however, was significantly lower.

Nine countries performed significantly higher than Northern Ireland. Five are Asian countries: Shanghai-China, Korea, Hong Kong, Singapore and Japan. The other four are Australia, Canada, New Zealand and Finland. Finland was the only EU country with a mean reading score significantly higher than Northern Ireland.

In all participating countries, there was a significant difference in reading in favour of girls. In Northern Ireland, that was 29 points. That is lower than the OECD average, and Northern Ireland had one of the lowest differences between girls and boys in reading. One item also reported on is the range between the highest and lowest performers. That is measured by looking at the difference between the fifth and ninety-fifth percentile. By doing that, we take the outliers — the extremes — out of the data.

In Northern Ireland, the difference between the highest and lowest in reading was 315 scale points, which is slightly wider than the OECD average. However, the spread in 14 countries exceeded ours. That group includes some of the top-performing countries, such as New Zealand, Japan, Australia and Singapore.
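
A minimal sketch of that spread measure, assuming invented scores, taking the gap between the 5th and 95th percentiles to trim the extremes:

```python
import numpy as np

# Hypothetical reading scores for a cohort (invented, roughly centred on 499).
scores = np.random.default_rng(0).normal(loc=499, scale=95, size=3000)

p5, p95 = np.percentile(scores, [5, 95])
print(f"spread between 5th and 95th percentiles: {p95 - p5:.0f} scale points")
```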

That spread of achievement relates to another very useful measure: the percentage of students at each of the levels of proficiency in reading. The definitions for what is required at each level are covered in the report. However, as a guide, pupils are expected to reach at least level 2, which is considered to be the level at which students demonstrate the reading skills that will enable them to participate in society and future learning. Therefore, we are looking for them to perform to at least that level. Just under 18% of pupils here were below level 2 in reading, which is similar to the OECD average. That means that one in six students here is estimated to have poor reading skills. There was, however, a notable variation by gender, with 11% of girls and 23% of boys, which is more than one in five, not reaching the level of literacy considered necessary to participate effectively in society.

All countries, including top-performing ones such as Shanghai, have a spread of performance, with some students performing at the lowest levels of proficiency and others performing at the highest levels. The difference between countries is the proportion of students at each level. In Northern Ireland, 18% of pupils were below level 2, whereas just 4% in Shanghai were. Similarly, 31% of pupils here were above level 4, whereas 54% of pupils in Shanghai were.
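
A minimal sketch of how such proficiency proportions are tabulated, assuming invented scores and illustrative cut-offs; the thresholds below stand in for the published PISA reading boundaries and are not authoritative:

```python
import numpy as np

LEVEL2_LOWER = 407  # assumed lower boundary of level 2 (illustrative, not official)
LEVEL4_UPPER = 553  # assumed upper boundary of level 4 (illustrative, not official)

# Hypothetical cohort of reading scores.
scores = np.random.default_rng(1).normal(loc=499, scale=95, size=3000)

print(f"below level 2: {(scores < LEVEL2_LOWER).mean():.0%}")
print(f"above level 4: {(scores > LEVEL4_UPPER).mean():.0%}")
```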

Having described where we stand, I will examine the kinds of factors that contribute to improving scores in reading. What makes pupils in a certain country achieve a higher score? The first and, perhaps, foremost factor is socio-economic status. PISA reports on that through the economic, social and cultural status (ESCS) index, which is based on pupils’ responses to questions about their parents’ educational backgrounds and the resources available to them at home. As members would expect, there is a positive correlation between reading scores and socio-economic status, with pupils’ average scores increasing through each quartile of that ESCS index.

In Northern Ireland, the change in score for each unit of that index was 48 points, which is relatively high and means that socio-economic background has a larger effect in Northern Ireland than it does in other OECD countries in general. However, that does not mean that pupils from the lowest socio-economic groups cannot achieve. In fact, PISA results show that it is possible for students who are in the lowest socio-economic group to achieve high scores. It is just not as likely for that to happen here as it is in some other countries, including, interestingly, Canada, Norway, Finland and Japan, where pupils are more successful at overcoming the effects of social disadvantage.
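
A minimal sketch of the "points per unit of the index" figure, assuming invented data: the least-squares slope of reading score on the ESCS index, with a slope of 48 built into the simulated data for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
escs = rng.normal(0, 1, 500)                       # hypothetical ESCS index values
score = 499 + 48 * escs + rng.normal(0, 70, 500)   # 48-point effect built in for illustration

slope, intercept = np.polyfit(escs, score, 1)      # ordinary least-squares fit
print(f"estimated change in score per unit of ESCS: {slope:.0f} points")
```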

A second factor is reading for enjoyment, which has, as you might expect, a positive relationship with reading achievement. Unfortunately, 43% of our students — 35% of girls and 53% of boys — reported that they do not enjoy reading. However, the impact of reading for enjoyment is really marked. Reading for at least half an hour a day raises a girl’s score by, on average, 60 points and a boy’s by 93 points.

If we look at the group that does not read for enjoyment and split that into socio-economic groups, we see that those in the lowest socio-economic group score 428 and those in the highest score 511. So there is a difference there. That suggests, as it did in Ireland, that socio-economic status mediates, to some extent, the association between reading for enjoyment and reading achievement. So, again, socio-economic status is really important.
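
A minimal sketch of that kind of split, assuming invented records: group the pupils who do not read for enjoyment by socio-economic quartile and compare mean scores:

```python
from collections import defaultdict
from statistics import mean

# Invented records: reading score and ESCS quartile for pupils who reported
# not reading for enjoyment.
pupils = [
    {"score": 430, "escs_quartile": "lowest"},
    {"score": 425, "escs_quartile": "lowest"},
    {"score": 505, "escs_quartile": "highest"},
    {"score": 515, "escs_quartile": "highest"},
]

by_quartile = defaultdict(list)
for p in pupils:
    by_quartile[p["escs_quartile"]].append(p["score"])

for quartile, scores in by_quartile.items():
    print(quartile, round(mean(scores)))
```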

Three other factors that are linked to socio-economic status also impact on reading scores. Those factors are: having a large number of books at home; the number of computers at home; and family structures, with students from two-parent families scoring higher than those from single-parent families. The effect of computers and ICT on scores is quite complex. There is no clear correlation between scores and the frequency of using ICT. However, what students do on the computer appears to impact on scores. For example, pupils who chat online several times a day score lower than those who never chat online. However, those who use online dictionaries score significantly higher than those who do not. As a mum of three girls, I will have to keep them off Facebook and MSN.

School characteristics are also important but, again, this is closely tied up with socio-economic status. Given the variation in academic ability of their intake, it is not surprising that grammar school pupils significantly outperform non-grammar school pupils, scoring 572 and 451 respectively. That is a difference of roughly 120 points on average. Analysis of proficiency levels also shows that less than 1% of pupils in grammar schools performed below level 2 in reading, while in non-grammar schools the figure was 28%, which is a big variation.

However, the average socio-economic status of the school is important. Pupils who attend schools whose average socio-economic score is in the lowest quartile of the ESCS index have an average score 140 points below that of schools in the highest quartile.

What the paper highlights is the considerable variation between the socio-economic status of our grammar and non-grammar schools. Analysis of the average score of pupils from each of those socio-economic quartiles shows that there were so few non-grammar schools with pupils from the highest socio-economic quartile that an average school score for that category could not be calculated. Similarly, there were so few grammar schools with children from the lowest socio-economic quartile that average school scores for those could not be calculated. That is very important.

Grammar school pupils significantly outperform their non-grammar school peers, regardless of socio-economic background, and that relates to the point that we made earlier that pupils from the lowest quartile can achieve at high levels. However, having said that, even within grammar schools, those from the top quartile significantly outperform those from the bottom. Therefore, you can do well if you get into a grammar school, but you are not going to do as well as those from higher socio-economic groups.

Gender intake of a school also impacts on the scores, and that varies according to whether it is a grammar or non-grammar setting. Generally, boys and girls do better in mixed-gender schools, except boys in a non-grammar setting, who do better in an all-boys school. There is also a positive correlation between parents’ expectations and pupils’ scores, especially in non-grammars. Schools whose principals reported that the majority of parents have high expectations of academic standards score significantly above those that report that parental pressure for academic success is largely absent.

Those are the kinds of issues that we have looked at locally so far. We got the data in December and we have looked at the Northern Ireland context. The OECD has been looking at some other emerging trends. The things that are emerging from other analyses are: the effects of pre-school education, resources, class sizes, the classroom environment, assessment, and accountability. There is more detail in the large OECD reports, but it is interesting stuff.

I will just briefly outline the performance in maths and science. With respect to our overall score and position in maths, students here achieved a score of 492, which is not significantly different from the OECD average. Scotland, Ireland and England were similar to Northern Ireland. Wales was significantly lower. Twenty countries performed better than Northern Ireland, and Shanghai topped the poll at 600 points. Again, there is variation by gender, but unlike reading, boys’ performance in maths here was significantly better than that of girls. That is a pattern repeated in 34 other countries, including Scotland, England and Wales, and just five countries reported a significant difference in favour of girls.

Let us turn to the spread of attainment in maths. Northern Ireland had a difference between our highest and lowest achievers of 289 points, which is lower than the OECD average. The difference between us and the top-performing countries is, again, the variation in the proportion at each level of proficiency, and particularly the percentage achieving at the highest levels. We had 29% of pupils at level 4 or above, but 27 countries had higher proportions. East Asian countries featured strongly, with over half of their pupils achieving level 4 or above; in Shanghai, the figure was as high as 71%. One of the comments in the report is that we seem to lack high achievers.

The final area is science literacy. As in previous rounds of the study, Northern Ireland’s score was significantly higher than the OECD average. England, Scotland and the South did not perform significantly differently to Northern Ireland. However, Wales’s score was, again, significantly lower. Ten countries performed better than Northern Ireland. They included two EU countries: Finland, again; and Estonia.

In science, there is no difference between girls and boys. That seems to be a trend in certain countries, although it varies very much between countries. There were gender differences in 32 countries, with 11 favouring boys, and 21 favouring girls. However, in the top-performing countries, there seemed to be no difference or the difference favoured girls.

The spread of scores between the fifth and ninety-fifth percentile showed a difference of 335 points. That is larger than the OECD average. In fact, only 11 countries had a wider distribution of high and low performers. Again, although the top-performing countries also have students who perform at the lowest levels of proficiency, their proportions are lower than ours, and their proportions of students achieving at the highest levels are higher. In Northern Ireland, one third of pupils performed at level 4 or above. In Australia, it was 40%; Canada, 38%; Finland, 50%; and 60% in Shanghai.

That is a brief run-through of what we have found so far. As I said, analysis is still ongoing.

The Chairperson:

I worry about how PISA assessments are used. It seems that they are used to talk down the education system in Northern Ireland. It came as a surprise to some of us when we discovered that PISA itself, in the structure that it uses, does not claim to test how well a student has mastered a school’s specific curriculum. Take, for example, its definition of mathematical literacy. Certain people will tell you that you cannot actually measure or define mathematical literacy. In fact, I found a quote from one academic. This is where you get into the whole academic war that goes on. The mathematician Tony Gardiner said:

“The neutral observers I know who have tried to make an honest assessment of PISA have all come to the uncomfortable conclusion that there is something seriously amiss at almost all levels of the PISA program.”

Another observer stated that pupils who have not studied science often do better on PISA science tests than pupils who have.

Sometimes, I wonder about the rationale for our using PISA when its criteria and methods are not objective. In fact, between 1998 and 2003, when the Department spent £40 million on numeracy and literacy, there was improvement at levels 5, 6 and 7 in Key Stage 3 English. However, the PISA score did not improve. Therefore, what are we trying to achieve?

The issue has been going around and around in education for a long time. Every time a PISA assessment comes out, it has another go at the system. Finland is always used as a great example. People say, “Boy, if only we had a system like Finland’s.” I can only report on an assessment of the Finnish system, which stated:

“In the eyes of the researchers, Finnish school teaching and learning seemed to be very traditional, mainly involving frontal teaching of the whole group of students. Observations of individualized and student-centred forms of instruction were scarce. Given the enormous similarity between the schools, the observers were convinced of the high level of pedagogical discipline and order.”

That is completely different to what we are trying to achieve in Northern Ireland’s education system through the revised curriculum, which is all about moving away from the formal and traditional. So, where does that leave us?

Mrs Godfrey:

We and our Minister take the view, as do the education systems everywhere else in these islands and in most of the OECD countries, that a benchmark is important. Of course, we cannot measure the curriculum as delivered by individual schools because, by its nature, the survey is designed to be taken by 15-year-olds in 65 countries. However, the fact that it is applied consistently to 15-year-olds across 65 countries gives us a very good indication of how our 15-year-olds compare to others. That is an important factor.

The other thing that is valuable about PISA — its officials make this point — is that the focus on, for example, maths, tends to be on testing pupils’ ability to apply their knowledge in a given context. It is the same with science and, as you have said, is very much in line with the revised curriculum. If you take cross-curricular skills, we do not, in the new levels of progression, talk about numeracy; we talk about using mathematics. That is the concept of not just knowing something, but being able to take it out of the textbook and apply it in a real-life situation. Those are the skills that our employers tell us time and again that they want to see. They want literate and numerate young people, but they want young people to be literate in the sense of not just being able to read and write, but being able to communicate, articulate, persuade, negotiate, listen, and so on.

So the focus in PISA on reading and on application of mathematical and scientific skills is useful. The consistency of approach across 60-odd countries gives us an indicator of how we compare to those countries, and that indicator is very useful in the context of sending young people out into, as I said at the outset, an increasingly global economic climate. It is useful for us to know how we compare to others. We think that that is important, and other countries think that that it is important too.

You may have seen the Welsh Minister’s speech a couple of weeks ago. Wales, as Karen has said, did not do particularly well in its PISA outcomes. It is bitterly disappointed and is putting in place a programme of action to try to address that, because it thinks that this is an important survey that demands a response. That is very much our position. No survey will ever be perfect, but this is a useful benchmark for us to determine how Northern Ireland plc performs educationally compared to all the countries with which we compete in an economic context.

Mr Craig:

The Chairperson has raised an important point, Katrina. The top achievers in the performance tables are quite remarkable. They are Shanghai in China, Singapore, Hong Kong, Korea and Chinese Taipei. We then drop into the first European one, which is Finland, and the Chairperson pointed to that. It continues with Liechtenstein, Switzerland, Japan and Canada. What form of teaching takes place in all those countries? It is not the open and liberal type of teaching that we have in this country. In fact, I suspect that most of those countries, with the exception of one or two, probably still use the cane or use something worse. What does that tell us about this open and liberal system of teaching? Is it as good as those countries’ systems?

Mrs Godfrey:

You are right in that there will inevitably be cultural differences in how education is structured. However, we also know from the McKinsey research that the critical factor in the countries that perform well is the standard of teaching and learning and the quality of the teaching workforce. In that regard, we have very clear advantages through the quality of teachers that come through our system. The esteem in which teaching, as a profession, is held by society is also a critical factor. You are right: some of the countries — not necessarily European countries — will have very different cultural and societal characteristics to ours. However, others will not.

If you look at countries such as New Zealand, which wrestles with issues around social disadvantage and still pulls off high attainment, and Canada, which perhaps shares more cultural similarities with us, you will see the same characteristics. The OECD will also point to the flexibility that schools have, the autonomy that they have over what is taught, and how students are assessed. Countries that have that increased flexibility tend to perform better.

Dr Chris Hughes (Department of Education):

Analysis of the PISA results identified a number of other characteristics. It looked at what made schools successful, and some of those characteristics are outside the classroom. Successful school systems, those that perform above average, are better at giving students similar opportunities to learn, regardless of their socio-economic backgrounds. Here, socio-economic background is a very strong determinant of outcome in the PISA scores.

As Katrina said, flexibility is also built in. More successful school systems tended to prioritise teachers’ pay over smaller class sizes. Schools that performed well as systems tended to have a lot of preschool provision available for their pupils, and schools with better disciplinary climates tended to score better, so there is a cultural aspect.

Mr Craig:

That is a huge aspect. Discipline will not be an issue in those countries, because it will be very severely dealt with by both the schools and the state.

Mrs McCullough:

I attended a briefing recently that was given by Professor Ted Melhuish from the Institute of Education about effective preschool provision in respect of PISA. He commented on those countries that you mentioned: the top performers. He said that, a generation ago, they made the decision to invest in preschool work and build from there because they knew that it was going to take that long to turn these things around. Those places are now emerging with fantastic performances among students, and hold top positions in the global economy as well. That was his observation, and it is his job to research all of this.

Mr Craig:

I have a deep suspicion that preschool facilities are not the only thing contributing to their success.

Mrs McCullough:

That is a valid point, but one can only pick up so much from one survey. This is an assessment, and the information being picked up relates to education and schools. There are factors outside of that, but there is quite a lot of good attitudinal information there. The strength of the study is that it asks the same questions of all the countries. Furthermore, it is put together by an international consortium; it is not just one person. It is an international group that decides what is included, and it will have listened to all of the academic debate.

Mr B McCrea:

I accept that you think that there is some value in this, but there is some criticism. I think that Katrina mentioned that. One could try to use vocational principles, such as the application of data. One question may be whether someone can read a train timetable, and there are certain countries where reading a train timetable is second nature because people use them all the time. However, given that we have just cancelled our trains —

The Chairperson:

At least the ones going south.

Mr B McCrea:

I just wonder if there is some difficulty in the questions that we ask. We also had quite an interesting analysis from the Finnish people. They are quite pleased that they are at the top, but they are a little bemused about why. They suggested that it may be to do with the type of language that they have. Finnish is particularly phonetic, and I think that could be an issue.

I wonder whether we are trying to do too much with this particular study. The area we are really interested in is why there is such a divergence between our lower performing cadres and our higher performing ones. Does PISA actually shed any light on that?

Mrs Godfrey:

Yes; that is part of wider work done by the OECD, which looks at aspects of equity in education through PISA and other data sources. That is probably more relevant to the point that you are making than the core survey. It looked at the characteristics of schools and education systems where there is not just high performance but a high degree of equity. Research and reports point up the key characteristics in the systems that perform well for all their pupils.

Dr Hughes:

On the issue of reading timetables, Northern Ireland pupils reported that 78% of their lessons included text with tables or graphs, compared to an OECD average of 59%. In our classrooms, we are using those elements noticeably more frequently than in other OECD countries.

Mr B McCrea:

I still cannot read a Northern Ireland Railways timetable. I do not know what that says about me.

Mrs Godfrey:

I am sure that there is a means of ensuring that we are not biased, because we do not have as big a railway timetable as others.

Mr B McCrea:

I want to pick up on Jonathan’s point from a slightly different angle. Looking at Shanghai, Korea and Singapore, I can see that those places are extremely disciplined. Therefore, the results depend on the survey sample. I suspect that a fair chunk of young people in those places do not go to school at all. It is about selecting the survey sample. That is where I get worried about these things.

Jonathan made the point that everybody tells us that we have a modern, enriched curriculum — he used the word “liberal” — and the entitlement framework, the whole idea of which is a broad-based educational experience. That is the direction of travel that you put to us. However, if you really wanted to do well on a test such as this, you may have to adopt a more narrowly focused, disciplinarian approach. This survey seems to be pointing us in a direction in which we do not want to go.

Mrs Godfrey:

The evidence behind the characteristics bears out the opposite point. PISA focuses on the application of skills. However, the characteristics are around the flexibility that schools have, and the measurement is around pupils’ ability to apply their learning, not just to know stuff. That comes out very clearly.

The other thing that we have to remember is that the cohort of children who went through the survey in 2009 would not have known the revised curriculum, because the timing would not have been right. Those children would not have been in the initial roll-out of the new curriculum.

Mr B McCrea:

That begs a question.

Mrs Godfrey:

How much better will we do next time?

Mr B McCrea:

Yes. Will it be better? Do you expect PISA to be sufficiently finely grained that you will be able to pick up that difference?

Mrs McCullough:

There is a lot of detail, which we have not looked at locally, on the breakdown of all the elements that the survey checked in each area of literacy. Therefore, we would be able to see where the changes are happening and could look at where we expect them to happen.

Mr B McCrea:

We spent £40 million on a literacy and numeracy campaign from 1998, and those students did not improve their PISA results.

Mrs Godfrey:

The other thing that you have to bear in mind is that we will be improving, but so too will others. That is the nature of a survey such as this. For example, if the Estonians invest a huge amount in education, they may improve at a faster rate than us. There is that dimension as well. That is why we look at performance against a number of layers.

I looked at the survey findings in respect of discipline. One of the questions asked of school leaders was whether they paid attention to and tackled disruptive behaviour in classrooms. Interestingly, our response rate to that was 97% yes, and the OECD average was 90% yes.

Mr B McCrea:

I am going to ask just one more question, because I am sure that others want to come in. As you said yourself, Katrina, in response to an earlier question, the key is having high expectations of attainment. When we went out and talked to schools, the thing that was new to me was that, where a school is embedded in the community, educational achievement seems to go well. I do not know whether the survey encapsulated that. It is not just about school leaders or the socio-economic background. If the school manages to convince the community that this is worth doing, standards will rise. Is that captured anywhere in PISA?

Mrs Godfrey:

It is. When it looked at excellence and equity, that was one of the points. The point that it started off with was homework. We have to recognise that some children do not have the parental support needed to get the most out of homework. However, it said that encouraging parental involvement, working with children at home and actively participating in school activities does improve results. Schools that foster participation by parents and help parents to support their children in their schoolwork tend to have better outcomes. That parental involvement is inextricably linked with parental expectations. That is borne out by the findings.

Mr O’Dowd:

I will follow on from Jonathan’s comment. There is something in the argument about cultural attitudes to society and how societies govern themselves, although I am not necessarily arguing for communism.

Mr B McCrea:

Hansard is here; we have got that.

The Chairperson:

I can picture a Minister in every classroom.

Mr O’Dowd:

There needs to be appreciation of the gift that is education or an education service. In many of the societies that are still emerging from poverty, access to education is still highly appreciated and perhaps better understood in broader society. As a modern Western society, does our view on education factor in how we measure success or failure? How do we, as an Assembly and an Executive, promote education? It appears to me that many in society think that education is a case of bringing children to school, kissing them on the head on the way in — until they are a certain age, and they ignore you — picking them up in the afternoon, and education is finished. I know that that is a wee bit convoluted, but it is about society’s attitude to education.

Mrs Godfrey:

That comes out in the point about parental involvement. It comes out in the McKinsey point about the esteem in which teachers are held, because that is also linked to society’s view of education. It comes out in the schools that are bucking the trend and are performing above the average level for their type of school. As I said before, it is generally because of good teaching and good leadership but also because somebody has decided to attach importance to getting a good education. In the ideal situation, that will happen through a family. However, schools in disadvantaged communities have often been particularly successful when they have decided, in the absence of anybody else, that they will have those sets of expectations for a pupil, a class or a school. I suspect that CCMS colleagues will pick that up later. It seems to us that the same principle of having expectations ought to apply to us as a system. Why would we not want to have high expectations for our system in an international context? If we tell schools, families and pupils that they ought to have aspirations, it seems to follow that we, as a system, should want to perform at a high level.

Mr O’Dowd:

If politicians, as a collective, praise our education system, there may be a view in the education sector that it does not need to improve. If we over-criticise our education system, morale will fall, so it is about getting the balance right. I have a comment about survey results: if a survey does not suit your argument, you can use the Barry McElduff approach and say that it did not suit, so you were not going to use it. Occasionally, a statistic or survey will come out that supports your argument, whereas another will go against it. That is the safest way to deal with it.

The Chairperson:

I cannot get away from the assessment of the Finnish system, which is a very traditional system. I will read another bit from that assessment:

“Whole classes following line by line what is written in a textbook, at a pace determined by the teacher. Rows and rows of children all doing the same thing in the same way whether it be art, mathematics or geography. We have moved from school to school and have seen almost identical lessons, you could have swapped the teachers over and the children would never have noticed…in both the lower and the upper comprehensive school, we did not see much evidence of, for example, student-centred learning or independent learning”.

However, our system is telling us that if we use student-centred, independent learning and all that is in the revised curriculum, it will give us better outcomes. Yet that is totally contrary to what is happening in Finland, which, according to the OECD and PISA, has a far better performance.

Mrs Godfrey:

You have quoted one piece of research, a lot of the OECD —

The Chairperson:

And PISA is only another piece of research.

Mrs Godfrey:

Yes, it is, but a lot of the OECD research would also point up equity, autonomy and flexibility. We had a presentation recently by an educationalist from Finland who was visiting an education and library board here. She spoke about the importance that Finland attaches to literacy. Jonathan Craig made a point about societal issues, and she traced attitudes to literacy back to when Finland was occupied by Russia and they were not allowed to speak Finnish. When that became possible, they took real national pride in being able to speak and write in their own language. That was one issue that she pointed to with regard to embedding the importance of high levels of literacy in society.

The important issue for us is that this is a benchmark. We are not saying that it tells the entire story, but it does give us an incredibly useful indicator of how our pupils perform. We think that that is valuable and others clearly think that it is valuable because other countries participate in it; take it seriously; look, as we do, to what we can learn from it; and look to reflect some of the findings in the work that we are doing. That is the right thing to do with a survey of that nature.

The Chairperson:

The report by the education advisory group set up by the four parties concluded that PISA shows that there is no causal link between the type of education system and underachievement. Do you agree with that?

Mrs McCullough:

Reference was made to a series of OECD reports, and there is a whole volume that looks at what makes a school system successful. By “successful”, the OECD means performing above average, having a high level of equity in the system, and providing all students with similar opportunities to learn. The executive summary of the PISA 2009 results continues:

“They tend to be comprehensive, requiring teachers and schools to embrace diverse student populations through personalised educational pathways.”

In addition, such systems:

“grant greater autonomy to individual schools to design curricula and establish assessment policies”.

They also spend large amounts of money on education and prioritise teachers’ pay. The students attend preschool, and they are systems with more positive behaviour among teachers and better teacher-student relations. That is what came out at the OECD level.

Mr B McCrea:

Nowhere has embraced the OECD and PISA more than Scotland. The Scottish brought in the OECD to tell them what to do. However, the differential has still not changed despite all the investment. It is hard to think of a more egalitarian system or a place where education is held in higher esteem than Scotland. You still have about 25% of the population underperforming, and that is localised in places such as Glasgow West.

Mrs McCullough:

It comes back to that point that Ted Melhuish made: this takes a generation. It starts before children go to school. When we talk about parental involvement, it is not about parental involvement in just education. It is about their involvement in children’s reading right from an early age and reading to them when they are young. It takes that long.

Ireland did badly in the latest survey. However, it did not ignore that and brought in Statistics Canada to see why its performance went down. That is one of those things that it is unwise to ignore, and that is why the likes of Scotland bring them in. What is it that makes a system good? It starts right back at preschool, and with parental involvement.

Mr B McCrea:

With respect, what you are saying is that we just have not seen the change yet. I take the point about generational issues and preschool involvement, but you would have thought that the Scots, having had pretty good results, would be particularly keen to address the pool of underachievement. They took every action: every school has similar levels of resources and similar teaching — the whole thing. Nevertheless, we still have not got to the cause and effect of why there is still underachievement. That is what our inquiry is about. We have pockets of deprivation that the schooling system does not appear to be able to address. If you look at GCSE performance — and it is obviously a statistical issue — were we able to improve the bottom, then, overall, we would rise quite dramatically.

Mrs McCullough:

Yes, we would.

Mr B McCrea:

All that I am saying is that we are seeking the Holy Grail as to how to fix the issue. We are all convinced about early-years intervention and going down that route. However, it has not been shown to work.

Mrs McCullough:

I think that we want to do that because, rather than seeing the average go up in one massive leap, you will start to see changes at that proficiency level first, and that has been observed. This time, the OECD has pointed out that it is countries that are shifting people up a little from the bottom-performing levels, or those that are moving at a larger —

Mr B McCrea:

If they are not going to the very top, at least they are moving up.

Mrs McCullough:

Yes; that is where you start to see a small shift. It is a test every three years, and it will take longer than that to see a significant change in the overall average score.

Mr Lunn:

Is it fair to say that Finland has been at the top, in European terms, of all four surveys carried out since 2000?

Mrs McCullough:

I think that it has.

Mr Lunn:

I imagine that Estonia, if it is not already close to the top, has been coming through and improving.

Mrs Godfrey:

Some of the eastern European countries have been coming through. If one goes back to the surveys of 10 years ago, some of them would not even have participated. However, that is where some of the marked improvement has been seen, and that is related to points made already about investment in education, valuing education and recognising the link between education and economic growth.

Mr Lunn:

Some people think that there is not much value in PISA and others think that there is. However, it seems to me that, if there is value in it, surely it is in pointing out the best systems that consistently perform well. Finland always comes up, and I fancy that Estonia will from now on too, because the Russians cleared out of there as well. I am nearly half expecting someone to say something like it is time that the English cleared out of here so that we can instil some national pride.

Mr B McCrea:

And we were having such a nice Committee meeting. What morbid aggression.

Mr Lunn:

Am I right in thinking that the Finnish system has a much later starting age for pupils at primary level? I do not really buy into the idea that its train timetables are more user-friendly or its language simpler. Does Finland use class testing of the type that we were talking about earlier? I am not expecting you to know that, but that is the sort of thing that we should be looking at. It is hard to get away from the fact that Finland uses a comprehensive system. Do we look at these results and analyse all aspects to see whether there is something that we could fasten onto that would have the potential to improve our system?

Mrs Godfrey:

That is very much the value of it. A lot of the analysis that we do through Karen’s team, and that done at OECD level, points up the characteristics of the top-performing systems. It will not be a case of just looking at Finland. It will be a case of looking to see whether there are common characteristics that exist across most or all of the top-performing systems and setting out what those are, and that will provide us with the policy evidence base that we need to factor into the work that we do.

That is the real benefit of the analysis. It provides a snapshot survey of how Northern Ireland does, but the real information beneath that is what the common characteristics of a good system are. Our initial focus is always on the achievement of our own students, but we also go through what characteristics form part of the good systems and which characteristics are the most important. We also discuss whether we could adopt those characteristics and whether there are points that we can start to apply. That is evidenced in the increasing focus on the importance of preschool, and one of the key findings was, as Karen mentioned, that the basic structure of an education system relates to the equity of its socio-economic structure. The OECD is also very clear about the need to question the benefits of selecting, tracking and streaming, because school systems that use such measures are not among the higher performers.

Some of the key findings are contained in the OECD’s ‘Ten Steps to Equity in Education’, in which it looked at the most equitable, highest-performing systems. In that document, the OECD alludes to the need to limit early tracking and streaming; postpone academic selection; manage school choice to contain the risks to equity; provide attractive alternatives in upper secondary education; remove dead ends and prevent drop-outs; and systematically help those who fall behind. For example, the OECD did not see any evidence of a child repeating a year being a positive experience. We capture all of that and try to ensure that that —

Mr Lunn:

Is that a snapshot of the good features of the best performing countries?

Mrs Godfrey:

Yes. They are the features of those countries with the best-performing schools and the highest equity.

Mr Lunn:

I take the point that Jonathan made about China and Far Eastern countries, where enormous discipline is instilled in the population from an early age. Is that also true of Finland? I would not have thought that that was the case.

Mrs Godfrey:

Again, the focus tends to be on the analysis of the characteristics of the systems. Therefore, I do not have that level of in-depth knowledge. However, if it would be helpful, we could certainly look into that.

Dr Hughes:

We have the evidence from the principals’ reporting that Katrina referred to earlier. For example, 10% of students here are recorded as having skipped classes, as opposed to an OECD average of 33%. Our reports tend to be much more positive on the whole issue of student behaviour than the OECD reports. Respect towards teachers, behaviour in classes and bullying of other pupils were all lower here.

Mr Lunn:

They were lower?

Mrs Godfrey:

Better.

Dr Hughes:

The instances of those were lower. We had a better behavioural and discipline environment in the classroom.

Mr Lunn:

I am surprised by that, but it must be right if it is in the statistics.

The Chairperson:

Trevor made the comment that it seems that the comprehensive schools are the ones that are successful. The one caveat that I would add is that there is a vast array of schools that are performing below schools in Northern Ireland, including those in the Irish Republic and, on some occasions, in England. Again, it is the Barry McElduff syndrome, and we need to be very careful how we use the statistics.

Mr B McCrea:

Would you mind if I came back on that point?

The Chairperson:

Just one last comment. I want to draw the session to a close.

Mr B McCrea:

There is a widespread perception that the school system in England does not work terribly well, and that its form of comprehensive — and I am not saying that comprehensives per se do not work —

Mr Lunn:

Neither am I.

Mr B McCrea:

There is a perception among people about grey comprehensives, yet the scores of the English comprehensives are remarkably similar to ours. Does that mean that the perceptions of the failures of the English education system are wrong?

Mrs Godfrey:

Other statistics, for example those that deal with GCSE performance, point to a marked improvement in the performance of English schools in recent years. Therefore, the gap is closing markedly in areas such as GCSEs, in which they would have lagged behind us. However, I do not have an in-depth handy guide to the —

Mr B McCrea:

The Chairperson wants to close, so I will not go back on this. However, it struck me that there is a truism about what is good and what is bad, but we are confronted with evidence that does not support the conjecture.

The Chairperson:

It would be interesting to look at the preschool provision in the countries that perform well and see how much it resembles, or differs from, our preschool provision. If it is unlike our provision, how will we assess that, given the long debate that we have had around early-years provision? Have we put in place in early years a reflection of what is provided in those other countries that seem, according to PISA, to do better than we do? That would be an interesting comparison. Karen talked about seeing this over the passage of time, but we have not got a lot of time to wait.

Mrs Godfrey:

The findings in relation to preschool are being fed directly into the team that is working on the early-years strategy, to make sure that that evidence is available to them.

The Chairperson:

Thank you. That concludes the first session of the meeting.