Official Report (Hansard)
Date: 15 January 2014
Committee for Education
2012-13 Year 12 GCSE and Year 14 A-level Results: DE Briefing
The Chairperson: David, Dale and Gayle, you are very welcome. Thanks for taking the time to come to us. David, we ask you to make your comments on the paper, and we will take it from there.
Dr David Hughes (Department of Education): We are presenting a briefing on the statistics from the survey of annual examination results. Gayle will be able to talk briefly about the headline figures and messages from that. I remind the Committee that this is a survey of exam results of pupils in year 12 and year 14, and that is to distinguish it from the school leavers' results, which are also collected and on which system-wide measurement is made of the education system as a whole. I now pass to Gayle on the key messages from the results.
Dr Gayle Kennedy (Department of Education): Thank you for the opportunity to brief the Committee on the findings in the statistical press release 'Year 12 and Year 14 Examination Performance at Post-Primary Schools in Northern Ireland 2012-2013', which was published on 12 December 2013.
The data are taken from the summary of annual examination results (SAER) process, which collates summary school level examinations data and validates it with the schools. The 2012-13 figures reported in the publication, as presented here today, are based on information as at 9 December 2013. This year, for the first time, summary data on pupils with free school meal entitlement were also collected, validated and published in the statistical press release. Therefore, this database provides a rich source of information, and I will present the headline figures. One point to note is that when I refer to GCSEs and A levels, I also include their equivalent qualifications.
Firstly, we will look at the year 12 pupils. There were 22,580 pupils in year 12 who were eligible for entry to GCSE examinations in 2012-13, as included in the returns made by the schools. The majority of these pupils — about 60% — were in non-grammar schools.
Over time, there has been an increase in the proportion of pupils achieving five or more GCSEs at grades A* to C, from 71% in 2008-09 to 80% in 2012-13. During the same period, there has also been an increase, but at a lower rate, of the proportion of year 12 pupils achieving five or more GCSEs at grades A* to C that include GCSE English and GCSE maths. That has risen from 57% in 2008-09 to 61%, as reported in the most recent publication.
In 2012-13, the achievement gap between the percentage of pupils achieving five or more GCSEs at grades A* to C and those with the same level of qualifications but also including A* to C in English and maths was 18·7 percentage points. This gap has increased by one percentage point since last year, 2011-12, and it is 4·8 percentage points wider than the 13·9 percentage point gap reported in 2008-09.
Looking at gender differences in year 12 pupils, we see that females generally perform better than their male counterparts — 65% of female pupils in year 12 achieved five or more GCSEs, including English and maths, at grades A* to C, compared with 56% of males, which is a gap of nine percentage points.
Analysis by school type shows that, in terms of achievement at the end of Key Stage 4, grammar schools have higher attainment in all key performance indicators. In 2012-13, 97% of grammar school pupils in year 12 achieved five or more GCSEs at grades A* to C, compared with 67% of non-grammar school pupils. In recent years, there have been signs of that performance gap narrowing, from 53·2 percentage points in 2005-06 to 30·1 percentage points in 2012-13. This is due to the higher rate of increase in the percentage of pupils achieving five or more GCSEs at grades A* to C in non-grammar schools, where there has been an increase of 24·4 percentage points, compared with a 1·4 percentage point increase in grammar schools.
As I mentioned, changes to the data collection methodology this year have enabled the Department to collect summary data that relates specifically to pupils in year 12 and in year 14 who were entitled to free school meals. At 83%, a higher proportion of non-free school meal pupils achieved five or more GCSEs at grades A* to C compared with pupils entitled to free school meals, for whom the figure was 63%. That shows a 20·7 percentage point gap. There is a 32·8 percentage point gap if we look at the achievement of five or more GCSEs at grades A* to C, including English and maths; 34% of year 12 free school meal pupils achieved that level, compared with 67% of non-free-school-meal pupils. That concludes the overview of the year 12 results.
I will now turn to the year 14 results, which refer to A levels and their equivalents. Almost 13,000 pupils in year 14 were entered for A-level examinations, and almost all of those in the final year of an A-level course of study — 98% — achieved two or more A levels at grades A* to E. There has been no change in this indicator since 2011-12. Ninety-five per cent of A-level free school meal-entitled pupils achieved this indicator. Almost two thirds of year 14 pupils — 65% — achieved three or more A levels at grades A* to C. This is slightly higher than the figure in 2011-12, by 0·4 percentage points. Just over half — 51% — of A-level free school meal-entitled pupils achieved this indicator.
Females in year 14, as in year 12, generally performed better than their male counterparts. Sixty-eight per cent of female pupils in year 14 achieved three or more A levels at grades A* to C, compared with 62% of males.
A greater proportion of year 14 pupils — 62% — attend grammar schools than non-grammar schools, which account for 38% of year 14 pupils. This contrasts with the year 12 cohort, in which 41% attend grammar schools and 59% attend non-grammar schools. Grammar schools had a higher percentage of pupils gaining three or more A levels at grades A* to C than non-grammar schools. In 2012-13, 77% of grammar school pupils in year 14 achieved this standard, compared with 45% of non-grammar school pupils. This performance gap has shown signs of decreasing in recent years. In 2009-10, the gap between grammar and non-grammar achievement of three or more A levels at grades A* to C was 34·7 percentage points. In 2012-13, the gap is 31·8 percentage points, the same as in 2011-12. This reduction in the performance gap is due to an increase in achievement in non-grammar schools of 2·4 percentage points and a decrease of 0·5 percentage points in achievement in grammar schools between 2009-10 and 2012-13.
This presentation has provided an overview of the headline information that is contained in the statistical release, 'Year 12 and Year 14 Examination Performance at Post-Primary Schools in Northern Ireland 2012-2013'. There is a lot more information included in the report's text, charts and tables, which is readily available on the Department's website.
The Chairperson: I have a couple of questions, after which we can get into the minutiae of statistics. If you are a statistician, this is the world that you love to be in, but I am not a statistician.
There seems to be an issue at table 11 in the Northern Ireland Statistics and Research Agency (NISRA) report, and I wonder whether you could shed any light on it. Non-grammar schools in the Southern, Western and North Eastern Education and Library Board areas do better in five GCSEs including English and maths and in three A levels than those in the Belfast and South Eastern Education and Library Board areas. Is there any explanation as to why that is the case? Is it an urban/rural split?
Dr Hughes: I am not sure. I certainly have not seen any analysis that indicates any reason, but it is a question that is worth exploring.
The Chairperson: This goes back to what we were discussing earlier in the previous presentation, which was to do with the policy mix that there is at the moment. We have the Department, statistics, area planning, sustainable schools and all sorts of things going on; it is a real cauldron. There seems to be a variation between the Southern, Western and North Eastern Education and Library Board areas and the Belfast and South Eastern Board areas. I hear what you are saying, David, and, although not now, will you give that some consideration and try to indicate to us your view as to why that is?
Dr Hughes: Certainly, yes; it is a question that is worth exploring, and there may be a number of different directions in which that can be taken.
The Chairperson: The data also show a large attainment gap between free-school-meal pupils who attend grammar schools and those who do not. We have seen that reported. However, can the Department explain why eligibility for free school meals is taken to be a good proxy measure for educational underachievement? NISRA's data appear to show that the best indicator for educational underachievement is prior attainment. Why does the Department differ on that?
Dr Hughes: It is important to make the distinction between measuring pupils who are entitled to free school meals and measuring those who are underachieving. The fact is that figures show that children who are free-school-meal-entitled are more likely to be underachieving. That is not to say that they are necessarily underachieving. Likewise, there may well be pupils who are underachieving relative to their potential even though their attainment is not low and they are not in any particular position of social deprivation; nevertheless, they are underachieving. That is the kind of thing that a school will be aware of by looking at prior attainment and expectation for that child. So, we do not make an absolute correlation between free school meal entitlement and underachievement, but there is a strong connection between the two because pupils who are free-school-meal-entitled are more likely to underachieve.
The Chairperson: Is that exemplified when you look at the narrowing attainment gap for free-school-meal pupils at A level? That raises the issue of why we continually bound the statistics by age, at very fixed points in the age process, whereas some pupils, as indicated by these figures, achieve better later.
Dr Hughes: Going back to what I said early on, that is why we have these figures, which are about examination results at year 12 and year 14, but the actual measurement of the system is based on school-leaver results. That, I think, responds exactly to the point that you make. It is far fairer to look at how the system has served the individual at the point at which they leave it. To be honest, I think that that is only fair.
The Chairperson: My next question may be a bit unfair because it requires a policy context and that is a political decision. It is always a bit like when the police try to change the rules for reporting crime and you wonder whether they do so to make the figures look good but there still may be as many crimes committed and criminals out there. However, what could be done to give us a fairer and more accurate reflection of how the system has served the individual and not how the individual has served the institution? What could be done to have it in that right way? Is the issue about moving the time when examinations are taken by some pupils? Is it about having that flexibility in the system? Have you any thoughts?
Dr Hughes: There are a number of different ways in which performance can be measured. Some of them are relatively easy. When you take one performance measure in isolation from others, you inevitably get only part of the story. I certainly hear from head teachers that what many schools want to see measured is the progress from a child's performance on arrival at the school to when they leave. That is a desire that is generally held, and that should become possible once the assessment at the end of each Key Stage is embedded and we can say that a child performed at a certain level at the end of Key Stage 1, the end of Key Stage 2 and the end of Key Stage 3, so that you can see the progression. It would show the child's performance at GCSE or the equivalent qualifications at 16 and the performance later. We are saying that we have already got the end point measurement, which is performance on leaving school, but having that progression would be a very valuable element of it.
Of course, the performance measurement also has set points. So, in these figures, we are often looking at five or more GCSEs or equivalent at A* to C, including English and maths. There is also a measurement of seven or more. At A level, it is two or three. There are a number of other different places at which one could draw a line. It is very interesting to see how they are doing at the very top end — we could look at what the A and A* figures are, for example. Also, what about figures on performance in any qualification at level 2, or even at level 1, for those who have got a qualification that they may not otherwise have got if they had not been served so well by the school that they are in? There is a whole range of points at which you can measure, and, to be perfectly candid, the more points at which you measure, the less likely you are to distort the way in which the school thinks about the performance of pupils.
The Chairperson: The problem that we face with the current system is that, ultimately, a school is judged by its performance in its pupils gaining five or more GCSEs. This is where the challenge comes to decipher and to separate the school from the individual. They are inseparably and inextricably linked. Whether or not the school will survive is based on that measure, even though there will be pupils who progress very well from when they entered the school and have got three GCSEs, not five. In some cases, they may only get two, but, for that individual pupil, two GCSEs is a huge achievement. I have been to a school in my constituency that is deemed by the Department to be a failing school, and a pupil there has achieved seven GCSEs at A* to C. That is an outstanding achievement for that pupil.
Dr Hughes: We need to be quite careful about how we say that schools are being evaluated and by whom, because, of course, the principal method of school evaluation is inspection, which looks at the entire range of the provision in the school and not just, as it were, a single performance measure based on GCSE or equivalent qualifications. So, that is a critical element of this. An inspector going into a school will be able to say, "We look at the school's intake and the quality of its teaching and its leadership. If we were to be so callous as to look only at its examination results, we could be very critical, but, in fact, we know that the school is doing a magnificent job". The value of the inspection process is that that wide evaluation of school performance is possible.
Mr Hazzard: Figure 7 shows improvement in the gap between the grammar and the non-grammar sectors. It highlights the very commendable success in the non-grammar sector over the past number of years. Maybe it is a loaded term, but does the coasting of the grammar sector at that level reflect a coasting per se, or can that be taken to 100%? Is there a certain level of achievement at which it is accepted that it is nearly impossible to go above it? What can be done to take that higher and to continue the growth in the non-grammar sector? I will not ask too many loaded questions.
Dr Hughes: It is worth looking at that figure alongside — of course, I will not be able to find it now — the one that demonstrates that the size of the grammar sector has remained stable even though the school-aged population has dipped. To characterise it crudely, the grammars fill up first, so the population of the non-selective sector is growing smaller. The logic of that is that the population of the grammar sector is having its academic and educational potential broadened. The table at figure 7 is a tremendous endorsement of what the non-selective sector is doing in improving results with a population of children, the best of whom have been creamed off already, yet their results are being improved. That is very impressive.
At the same time, it is saying something about the grammar sector. Not in all grammar schools, I am sure, but in many, the breadth of the educational ability of the pupils in the school is growing, and they are sustaining their performance. There will always be those who criticise the Department and others for going on and on about standards, results and performance and so on, but that is because the public expects performance and results. Here we see that there is a positive impact in both sectors, with both seeking to do their best for the children in their sector.
Mr Rogers: Thank you. You are very welcome.
To go back to a point that the Chair made earlier, you talk about evaluating schools, but parents evaluate schools by results. They do not look into prior attainment, and so on. How can this more accurately reflect the performance of the school? You talked about the levels of progression. How can they at the end of Key Stage 2 be a more accurate instrument as a baseline for Key Stage 3? What I am getting at is seeing the value added in schools, because I think that we all agree that many non-selective schools make fantastic progress to get these statistics, which was Chris's point.
The other point that was made by someone in a presentation a couple of weeks ago was how we measure holistic education, by which I mean all the other aspects that make a school a good school.
Dr Hughes: I do not deny that, to the public at large, the easiest and quickest way of assessing the performance of a school is by looking at its exam results. I am sure that most people looking at exam results will also be filtering what those exam results really mean. They mean that, yes, this grammar school has got tremendous results, but the other message is that of course it should have such results, given its intake. I think that an increasingly sophisticated assessment of schools, and the encouraging and enabling of that, is enormously valuable.
To be completely topical, these are the weeks in which parents of P7 and P6 pupils are going into schools and looking at them. They are listening to head teachers, teachers and pupils, and they are evaluating far more than just the exam results by looking closely at the school. That is a very telling exercise.
On the levels of progression, I think that there is a real necessity for allowing and enabling the assessment at the end of each Key Stage to take place and to be embedded in the practice of schools. A very important element of that is for post-primary schools to understand how it is done at the end of Key Stage 2, and, to be perfectly frank, what the P7 teacher is doing giving a level 4 to so-and-so, so that the year 8 teacher understands precisely what that means when he or she sees that so-and-so has a level 4. That communication and understanding of how the levels of progression are assessed, and what they mean for teachers at the different stages in the different phases, is absolutely critical to ensuring that teachers are able to measure progression so that a child moves from level to level either at the expected rate or, indeed, at a faster rate but not at a slower rate. That is perfectly possible. However, it is quite a shift in practice and emphasis, which needs to be encouraged and supported.
Mr Rogers: There are two issues there. Level 4 in school A and level 4 in school B may not necessarily be the same thing. There is large variation in level 4, particularly in English, across the band. Do you not feel that post-primary schools need a lot more information than the levels alone?
Dr Hughes: You have put your finger on a very important point. The point has certainly been made to me on a number of occasions that the levels are very broad. There is an argument that they may be of more value for different purposes if there is a greater degree of differentiation between them. That is worth bearing in mind. At the same time, a more granular approach to levels may provide more information than is strictly necessary for the other purposes for which the data are collected. For individual children, it may be very important to know whether they are solidly doing the best in level 4 or just scraping into level 4 because they have demonstrated just about enough. I can see that, at the level of individual children, that may be enormously valuable for the child, their parents and the teachers who take them the following year. There may be an argument that it is valuable at school level. The argument certainly falls away at system level when looking at broadly what proportion of children reach level 3, 4, 5 or whatever. You make a very valid point.
Mr Rogers: On tracking children on free school meals, you said that there is a different method of collecting or organising data now. Can you track the achievement of students on free school meals irrespective of the school that they attend? For example, are there tables on the achievement of free-school-meals children who attended non-selective schools as opposed to free school meals children who attended grammar schools?
Dr G Kennedy: The information that we have collected this year for the first time allows us to identify the qualifications of pupils who are on free school meals. We cannot say that pupil A got these qualifications and pupil B got another set of qualifications, but we can say, in summary, that all free school meals-entitled pupils received these qualifications. From that perspective, we can report on the qualifications of those pupils in a particular school who are entitled to free school meals.
Mr Rogers: Therefore, it can be broken down depending on the school that they attended.
Dr G Kennedy: Yes, depending on the school type. We are able to provide that information only because of the change to the methodology of collection this year.
Mr Rogers: That will be useful. I would like to see that.
The words "or equivalent" are stated in brackets in the title of figure 7. What percentage of GCSE subject passes are equivalent subjects rather than GCSEs? Can you elaborate what the main qualifications are when you talk about equivalents?
Mr Dale Heaney (Department of Education): I am not sure that we have the percentages for you, although we can try to come back to the Committee with that information. There is an increasing trend towards those equivalent qualifications, particularly given the value that some schools put on them for further progression opportunities. I would still estimate, based on the figures that I have seen, that GCSEs and A levels are predominantly the main factor. However, the equivalent qualifications that seem to be increasing in popularity tend to be BTECs, for example, as one of the vocational routes. There are also level 1 qualifications that lead to level 2 qualifications that ultimately lead on to a GCSE. Those instances would be relatively high in number. We can try to break that down for you, if that would be useful, and get you some percentages.
Mr Rogers: Thank you.
Mrs Dobson: How do you feel that the increasing proportion of year 12 pupils — 7% in 2012-13 — who were ineligible for inclusion in the figures impacts on the overall picture? How many did that 7% represent in total?
Dr Hughes: Sorry, will you point to the relevant place?
Mrs Dobson: I am just trying to find out why those 7% of pupils were excluded. I am trying to find the page that refers to the 7% being "deemed ineligible" for inclusion in the year 12 figures.
Dr Hughes: The exemptions from inclusion.
Mrs Dobson: Under "Overall year 12 performance", it states:
"Approximately 7% of the overall year 12 cohort were deemed ineligible for inclusion in the summary of annual examination results returns in 2012/13."
How many pupils is 7%?
Dr G Kennedy: I do not have that information with me today, but it can be provided.
Mr Heaney: There are eight exemptions in total for which schools could determine to exempt a pupil from the statistics. Those exemptions include a pupil dying, being pregnant, being ill, having special educational needs (SEN) or being considered best managed through education other than being at school (EOTAS). We can certainly provide the Committee with a list of the exemptions.
Mrs Dobson: I am just trying to explore how the exclusions affected the full picture. Did you say that there are eight categories for exclusion?
Mr Heaney: Yes.
Mrs Dobson: It would be useful, Chair, for us to have the 7% broken down into numbers, but you do not have those figures.
Dr G Kennedy: Not with me today.
Mrs Dobson: OK.
The Chairperson: Has that percentage increased in recent years?
Mr Heaney: I believe that it has, yes.
The Chairperson: Have we any idea why? Again, that would be useful, because it has an impact, and I think that where you are going, Jo-Anne —
Mrs Dobson: We could consider the impact if we had the exact figures.
Mr Heaney: It is of increasing concern to us that the capacity to exempt certain pupils varies from school to school. We are keen to try to identify the reasons for that. You are right: the trend has increased over recent years, from 4·5% to 7%, I think.
Mrs Dobson: It would be useful for us to know the reasons that they were excluded. That would help us to form a broader picture.
Mr Heaney: I am pretty sure that I can get that information for you.
The Chairperson: A look at the reasons for exclusions shows that SEN plays a part. If you take it that there has been an increase in post-primary provision for SEN pupils in selective and non-selective schools, is there a correlation? Do some schools find it convenient to be able to exclude some pupils because they happen to be on the SEN register? Are they being particularly penalised?
Mr Heaney: There is a possibility of that.
Dr Hughes: I am wading into an area in which I am not absolutely confident, but I think that it is quite —
The Chairperson: I am keen that the Department give some detail, and it would be interesting to get it. I think that Dale has an idea of where I am going with this, which is that it may be in some schools' interests to have better outcomes if they can — I do not want to say "manipulate the figures", because that may be too blunt.
Mr Newton: I will not say that you said it, Chairman. [Laughter.]
The Chairperson: However, if the process allows exemptions, are such exemptions validated enough to underpin the legitimacy of the reported outcomes? More importantly, where does it leave those young people? That, for me, has always been an issue for schools, whatever sector they are in, that determine that some pupils would be better not doing A, B and C. Why? Is that best for the pupil or a better outcome for the institution? It will be interesting to see how much more light you can shed on that, David.
Dr Hughes: I will correct myself if I have got this wrong, but people with special educational needs are not necessarily exempted. However, there are those with special educational needs whom it would be fair to exempt.
The Chairperson: Yes, but it should be determined on the basis of objective criteria, not just on the basis of a get-out clause that schools can use. Ultimately, this is about trying to protect pupils. That is what we want to be sure of.
Mrs Dobson: I definitely want clarity around those eight categories, particularly on SEN, because we are not getting an accurate picture. It would certainly help to have that information. I have perhaps opened a can of worms with that, but I am glad that I did.
I want to ask about the 2012-13 figures in your briefing, which are based on information up to 9 December 2013. Is that correct? Are those figures likely to have changed since and, if they have, by what margin do you expect them to have changed?
Dr G Kennedy: That was the date on which the database closed. We have to close the database at some point in order to run the analysis so that we can write the report.
Mrs Dobson: Has there been any information on changes since then?
Dr G Kennedy: No, not at this point. We do not have any information.
Mrs Dobson: Do you envisage margins of change? Do you have any idea, based on those statistical patterns?
Dr G Kennedy: No, not unless a school were to inform us that there has been a change. The data collection methodology is that we request the information from the school at the start of the academic year, and the school provides us with a return. We then draft a set of summary tables. We send those tables to the school and ask it to confirm that the figures are accurate. When that is signed off and returned by the principal, those are the figures that we use. That will be the most up-to-date information that we have received from the school.
Mrs Dobson: I was hoping that you would have even a rough idea of whether they had changed. I am just trying to explore whether the figures are likely to change more with GCSE or A-level results. I am curious to find out whether there have been changes since then. Is there any way of getting us that information?
Dr Hughes: The survey is not updated to reflect re-marks or anything like that; that is the point. Do stop me if I am getting this wrong, but the school leavers' data record the qualifications with which the pupil leaves school. That comes much later in the year, because it captures the final set of qualifications.
Mrs Dobson: It would be useful if the survey were ongoing and we could get projections of how things are changing. I understand that the cut-off point was 9 December, but it would be useful for the Committee to explore how the figures are changing as time goes on, but I guess that you do not have that information.
Dr G Kennedy: No.
Dr Hughes: No. It would have to be sought again specifically to have a second survey, presumably.
The Chairperson: There are no other questions. Thank you very much, David, Dale and Gayle, for your attendance, your assessment and your commitment to providing some additional information, which is much appreciated.