Official Report (Hansard)
Date: Wednesday, 24 October 2012
Committee for Education
Computer Based Assessment (NINA and NILA)
The Chairperson: We will move to the subject of the briefing, which is computer-based assessments. I thank David, Carl and Ruth for coming.
You will be aware of the issues that have arisen recently with the administration of the computer-based assessment programme for numeracy and literacy, the so-called NINAs and NILAs. The National Deaf Children's Society has written to the Committee to highlight concerns regarding the suitability of those tests, particularly for deaf children.
Members will find the Committee Clerk's briefing paper and the relevant correspondence from the Minister and the National Deaf Children's Society in today's pack. The Department has also submitted a paper, and it is in the tabled papers. I think that you will find those papers useful as we go through this matter this morning.
Thank you for the information that you have provided for us, which has been useful. Perhaps you could speak on the issue, and then members will ask questions.
Mrs Katrina Godfrey (Department of Education): Yes, indeed, Chair. I will start by introducing Ruth Kennedy from the Council for the Curriculum, Examinations and Assessment (CCEA). Carl Savage has been with me at the Committee during the past three weeks. In particular, I am introducing David Hughes, who, from this week, is the director who is taking over responsibility for curriculum, standards and qualifications issues. Therefore, in future, it will be David's face that you see on those issues, not mine.
The Chairperson: You are very welcome, David.
Mrs Godfrey: We are very conscious that the Committee will have heard concerns from primary schools about intermittent technical issues that have arisen with the new computer-based assessment programmes in literacy and numeracy.
By way of background, it may be useful to remind the Committee that there is actually a legislative requirement in force that applies to children in primary-school years 4, 5, 6 and 7. It is a requirement to carry out diagnostic computer-based assessment in the autumn term and to report outcomes as part of a parents' meeting, which, importantly, also takes place in the autumn term. I say that because we know that engagement between schools and parents is far more effective when it takes place at the start of the school year, because that allows parents to become much more engaged in their children's learning and in supporting the work of the school.
A couple of things need to be stressed. First, the computer-based assessments are not tests; they are not high-stakes tests, and the results are not collected anywhere outside the classroom. The primary focus of the assessments is diagnostic: helping to inform teaching and learning and engagement with parents early in the school year. Another point worth making is that people will ask why we cannot have pencil-and-paper tests. Beyond the legislative requirement, one critical feature of computer-based assessment is its adaptive nature, which allows children to demonstrate what they can do, not what they cannot do, and, in doing so, to identify where their strengths are and where their areas for improvement are.
Clearly, Chair, if the requirement in law is that those assessments have to be carried out in primary schools, then we have a particular responsibility to ensure that the systems that we provide through procurement are working. In my Minister's view, it is not a tenable position for us to require something to happen and then not to do everything in our power to ensure that it works smoothly at the school end. Therefore, the reported technical glitches that you will have heard about and that we have heard about are a matter of real concern to my Minister. He is absolutely determined to see them resolved quickly.
The paper that the Committee has received sets out some of the actions that have been taken in the week or so since the problems first came to the Department's attention. In looking at that, it is fair to say that the intermittent nature of the technical issues that have been reported by some schools makes this a very difficult issue to resolve. However, be assured that we are determined to resolve it and to do so quickly.
What we know is that the majority of pupils in the majority of schools have completed their assessments. What we also know is that there is a series of steps that schools have to take — essentially, computer-hygiene steps — before they start the assessment process. When schools have consistently taken those steps correctly, the issues that we have been hearing about with regard to screen freezes are much less likely to happen.
Therefore, in summing up, the key issue for us — the Minister, the Department and our colleagues in CCEA and the Western Education and Library Board, which is responsible for the C2k system, working with the three private-sector providers that support us in all of that — is to get to the root cause of the screen freezes and the other issues that have been happening, and to do so quickly, so that we get back to a position where schools can confidently get their pupils to complete the assessments, safe in the knowledge that the problems and their root causes have been identified and sorted out.
Beyond that, we are very happy to take questions that the Committee might have on the issues that have arisen, either from the departmental policy perspective or, in Ruth's case, on the operational issues and the ongoing work that CCEA is leading to try to make sure that the issues are resolved.
The Chairperson: Thank you, Katrina. If members indicate that they have questions, we will make sure that those questions are taken.
I say this only as an opening comment, but there is an irony in that, in these papers, the Department is at great pains to stress that these are not tests and that they are not standardised, yet there is a requirement in law to carry them out — and that from a Department whose stated position is opposition to the very high-stakes test, namely academic assessment. Given this crisis, I think that all schools would be glad to get back to one test. My worry is that this is the Department of Education's version of the Ulster Bank situation.
I note that the phrase "technical glitches" was used repeatedly in the papers and in this morning's presentation. These are more than technical glitches. I got a letter from a group of principals in my constituency. I declare an interest that they are from my constituency. They wrote to the Minister, myself and Paul Wright of CCEA. I will give you a flavour of what they said. It proves that this issue is about more than just technical glitches. The comment that really concerns me is this:
"Parents are being presented with a report that does not reflect their child's ability."
We would never want to test children in any way other than for diagnostic purposes, because, according to the Department, that would be an awful thing to do. However, we can test them for diagnostic purposes. If that diagnostic test is not able to give us accurate information, you have to seriously question why we are doing it.
I went down through the list of issues that the principals raised. It also worries me that they say:
"A number of schools were involved in the CCEA pilot during the 2011-12 academic year. They expressed their dissatisfaction at the outcome of the process in this cohort of schools and advised CCEA of a number of concerns with the computer-based assessment, including technical issues, and requested that another pilot year be considered before making the assessment statutory. Disappointingly, the views of our members and the others in the pilot cohort were not acted upon, resulting in similar issues being experienced by a large number of schools in our association as they endeavour to operate the new statutory computer-based assessment."
They then list the issues:
"The computers freezing. Pupils taking one and a half hours to complete an assessment that was expected to last 20 minutes. Pupils being required to use an extensive range of IT skills to complete their tasks, leading some schools to query if the computer-based assessment assesses fine motor skills and IT competency rather than communication and using mathematics, as was intended."
Then, there is another issue that really should have been sorted out between the two organisations that won the tender.
"The colour-coding system of NILA and NINA and the reports will result in confusion to parents. The colour purple in NILA represents below what is expected in the child's curriculum year, whereas the colour purple in NINA represents that the child was able to do it."
Katrina, you will remember that the Committee was mystified when you brought us the levels of progression and all the colour-coding. We now have that replicated in the NINAs and NILAs. This is an issue for parents. Purple means one thing in one test and something completely different in another. The nub of it is this: what are we going to do to rectify this very costly process?
Mrs Godfrey: I will address a couple of those points, and then I will ask Ruth to pick up on the detail of the pilot.
As I said in my opening remarks, the Minister is absolutely committed to making sure that action is taken to identify the source of the issues around screen freezing and to get them sorted. That work is happening with a significant degree of intensity at the moment. It involves not just the Department but colleagues in CCEA, who are leading the process, supported by C2k, the two private-sector providers of the NINA and NILA assessment tools and Northgate as the C2k provider, all working together to make sure that the technical issues are identified and sorted. That is absolutely the top priority.
I also wanted to pick up on the longer-term value of the project. After the end of the autumn term, we will ask our colleagues in the inspectorate to carry out a report to look at the value to schools of the information that comes out of the system and how schools have used that information, along with other information, to inform their judgements on pupils' strengths and areas for improvement and to engage with parents. We will not have that information during the autumn term, because we have to allow schools to complete the process, but, after it, we will be clearly focused on the educational benefit.
This is not a case of handing a parent a sheet of paper with purple or any other colour on it. This is a parents' meeting, where the teacher will use all his or her professional knowledge and judgement to give a parent the best information that he or she can about a child's progress. This is one piece of information that is made available consistently and, therefore, there is a real value in that, but it is a meeting between parent and teacher that is set in a much wider context of everything that the teacher knows about the child, not just this one piece of information. That is important as well.
We had absolute assurance that the issues that were identified during the trial were picked up, categorised and acted on before the roll-out; otherwise, we would not have specified those two computer assessment tools as the legislation requires. We had absolute assurance that the pilot ran, that principals were listened to when they told us that there were issues and that action was taken.
Ruth might want to say a wee bit about how the pilot informed the end product.
Ms Ruth Kennedy (Council for the Curriculum, Examinations and Assessment): Some of the recommendations from the pilot and the actions taken are up on the NI curriculum website. I will pick up on some of the examples that were raised in the letter.
Schools reported in the March trial that assessments were taking too long. In response, we aimed to ensure that the assessments would be about 35 minutes, so the length of the assessment was reduced. The latest feedback that we have on the assessments running in the autumn term is that the average time for a literacy assessment is 36 minutes. The numeracy assessment is in two parts, and most pupils will complete each part within 15 to 20 minutes. We also introduced a pause function so that pupils can take a break if they need to and can come back after a break or, indeed, the next day.
Schools also reported issues with the functionality of some of the drag-and-drop questions, so improvements were made to that. They reported that they were not keen on the voiceover in the numeracy assessment, so that was changed. A number of actions were taken.
In the autumn term we want to continue to listen to schools and to get feedback to see what further refinements can be made. We acted on the feedback that we got in the trials.
The colour purple issue was raised in relation to the parents' report, which is one of the report formats that are available to schools. I am reliably informed that they are two different colours but that, when some printers print the reports, the colours come out too similar. Again, we want to look at that and make sure that we address it so that there is no confusion due to the similarity of the colours.
Mr Savage: It may be worth saying as well that the parental reports are optional.
Ms R Kennedy: Yes. Different reports are available to schools. There is an "objectives report", which gives feedback on every question that the child has answered, what it was assessing and whether the child got the answer right or wrong. In numeracy, there is a "diagnostics report" as well.
What the "parent report" aims to do is to pull out some of the main messages for parents. We have said to schools that if they find that the other formats of report are more meaningful for schools and parents, those can be used as well. There is a range of reports that can be used.
The Chairperson: Is it not an issue, Katrina — and I speak as a parent as much as Chair of the Committee and an MLA — that a parent only wants to know whether John or Ruth is at the top, middle or bottom of their class?
I accept that such assessment places huge pressure on teachers. We need to guard against that, so that the teacher can be objective and that his professional judgement is not in any way damaged. However, we have in place a very costly, complicated system, which, it seems, does not have the confidence of a large proportion of the school population. They have no confidence that it can deliver what the taxpayer expects.
That begs the question: was the final version of NILA and NINA trialled? As I understand it, the final version is not what was trialled; what was trialled was a variation of what became the final version. Now, the contract has been signed. Perhaps you could clarify the cost of those two contracts, because it is no small amount of money. I spoke to the Minister this morning on that issue, and it is confirmed in the papers that he has met the providers. He has met you and CCEA, and he has been very clear that this issue has to be dealt with and addressed. I welcomed that. He was very forthright about it.
What was the cost; and was the final version what was trialled, or was it a variation of it?
Mrs Godfrey: As to the cost, the contract costs for the two tools are approximately £400,000 per year; that works out at less than £5 per pupil to carry out the assessment. In the first year, clearly, the cost is higher. I think that the total cost in the first year is in the region of £900,000, but that reflects the development work that goes in at the start of any process of this nature. Those are the costs that we are talking about for a system that is rolled out and applicable to every child in years 4, 5, 6 and 7; that is, about 80,000 pupils.
The Chairperson: You said that it is less than £5 per pupil; that is interesting. I have spoken to principals in schools who have gone online to another provider — ironically, it is GL Assessment, which, I understand, provides another test that the Department has a bit of an issue with at year 11 — and for £5 per pupil, they can get a report that is more robust and reliable than that provided by the Department. Why have we had to have all this process for the last number of years?
Mrs Godfrey: It is important to remember that this is a procurement process, so that any of the companies that you mentioned could have bid, and many did. Therefore it would be inappropriate of me to say that one service was better than another. This went through a procurement process supervised by the Central Procurement Directorate (CPD). Any of the companies in this business could have tendered. Many did, and the tender process was gone through robustly, correctly and in line with the rules. The outcome is that it is not one of the companies that you mentioned; it is different companies. However, that is the outcome of the procurement processes that we have.
It makes much more sense for an assessment of this nature to be procured at a regional level than to expect every school to go through the bureaucratic and administrative burden of sourcing their own information. However, as I said at the outset, and as the Minister has probably already reflected to you, when we do that, the responsibility to make sure that it is glitch-free is even greater. That is the focus of our attention at the moment. It is very important to make sure that the thing works effectively, smoothly and correctly when schools arrange to have their pupils take the assessment. That is a key point.
The key differences between the product that was trialled and the product that schools are now seeing will have come about as a result of the trial. Of course, things will not look exactly the same, but the reason for that is that when schools piloted them, they pointed out issues, as Ruth explained, that did not work for them. There will, of course, have been adjustments between the trial and the roll-out because otherwise, arguably, there is no point in having the trial. That is a key point as well.
I had not picked up on your earlier point about ICT skills. You are right: these are assessments of literacy and numeracy, but we have to remember that — it is relevant to the conversation that we had at the outset about the levels of progression — there are three cross-curricular skills and that ICT is a skill. The assessments should not require pupils to demonstrate any level of ICT that they are not already required to be taught and to learn through the revised curriculum. That is also a key point.
The Chairperson: OK. I want to hand over to members, but I will conclude by reading to you the view of a group of principals — professional teachers — from across the sectors in Ballymena:
"However, introducing an assessment tool that is unworkable, unreliable and not fit for purpose undermines the confidence of teachers, pupils and parents in the computer-based assessment. The training and implementation phases overlap too closely, the process of administering the assessments has worked out to be time-consuming and schools do not have the level of IT hardware and internet that is required to operate CBA. This unreliability will unfairly penalise pupils."
The way in which this issue is seen by teachers and schools is of the magnitude of the Ulster Bank situation. The question that has not yet been answered is how it will be addressed. We have been given the figures, and we know that 663 schools have started the assessments, which amounts to 75% of the total number of schools registered, and that 167 schools have completed them, which means that 55,000 pupils have completed the assessments.
There is a concern about the inspectorate doing an assessment of this. It is no reflection on the chief inspector, who will be here after this presentation, but my concern is that we should be absolutely sure that what is told to the inspector reflects what is actually happening and that we get a fair, independent and no-holds-barred assessment.
There is a concern because of the position that the inspectorate has in the Department's management structures, not that I would ever accuse the Department of trying to manipulate any report, but I would want to be absolutely sure that it was a fair, open, honest and transparent assessment of the process.
Mr Savage: That is exactly what we have asked for, not just of these tools but of the use of InCAS computer-based assessment data. It is about looking at the use of data in schools from the broader perspective.
The Chairperson: What was the assessment of InCAS? It was a contract process. I take Katrina's point. I worried about that, because there was not capacity in the CoPE to put out a procurement exercise of this size. I understand that that was why central procurement was used. Some of us have concerns about that whole bureaucratic process.
We have a situation where there had been InCAS. What assessment was done of InCAS that informed a process whereby we had type A and we wanted to go to type B? When we were writing the framework for type B, what assessment had we of InCAS as to its reliability and functionality before we set the new contracts?
Mrs Godfrey: That is a fair question, Chair, because you will recall that there was a year in which we had difficulty with InCAS as well. It is one of the challenges around procurement, because, if you are in this process, you would, ideally, love to be able to give schools certainty over a much longer period so that they could not only get familiar with an assessment tool but allow it to embed and settle in over a much longer period. That was not possible within the rules that we were operating under. An awful lot of learning was captured from the experience of InCAS, which was our first computer-based diagnostic assessment tool. That was reflected in two ways, first, in the drafting of the specification, and, secondly, when it came to the pilot and the rolling out. Ruth might want to add to that.
Ms R Kennedy: While InCAS was in operation, it was evaluated every year in order to make sure that refinements were fed into it. For example, refinements were made to the reporting to parents and the sort of information that was coming out of InCAS. That was an ongoing process. As Katrina said, it then fed into the drawing up of the specification in two ways. One was that we talked to principals and teachers and issued questionnaires to ask what they wanted, based on their experience of InCAS and computer-based assessments. We also drew on our own experience of InCAS — the sort of functionality that was required and the things that were needed — to feed into the specification. That became the very detailed specification that was used for the procurement process. So it informed that.
As Katrina said, one aspect that we learned from InCAS related to the font type and size and the colour of background that was most accessible for children. What we learned during the trial was fed into the development of the new assessments. There has been a learning process.
Mrs Godfrey: Chair, I want to pick up on a couple of the points that you read from the Ballymena letter. The specification was absolutely designed to respond to the level of ICT hardware and internet connection that are in the schools, so there could be no question, nor should there be, of our asking schools to deliver a computer-based assessment programme that they did not have the computer resources to deliver. That is absolutely clear.
The penalising of pupils ought not to be possible, because this information is used only by the teacher and the school and, as I said at the outset, is used alongside all the other information that a teacher has about a pupil. It is not a test score that is used for anything else; it is designed to help the teacher, pupil and parent to work together in the best interests of the child achieving his or her full potential. Therefore that should not be an issue.
You made a point about parents preferring to know Johnnie or Jilly's class average. There is a difficulty there immediately, because, by the nature of an average, half the class, no matter how good they are, will be below the average. The sort of information that a computer-based assessment gives you is much more tailored to your child — what they are good at, where they need help and where they need encouragement to build on their strengths — and is much more personalised. As I have said frequently before at the Committee, that allows for much more meaningful engagement with parents.
The other piece of information that will become available after standardisation is the standardised score. Ruth, it might be helpful if you explain briefly what will be available to schools in January.
Ms R Kennedy: Once all the children have taken the assessment, providers will generate the age-related outcomes for pupils. That will be a gauge of where they are in relation to their age. There will also be a standardised score, which shows where the child is in relation to the population.
During the trials, we looked very closely at the assessment to make sure that it was accurate and that it reflected ability. The decision was taken that the best thing to do around age-related scores and standardised scores was to wait until all the children had taken their assessment. That means that it would be based on the entire population and not on only 20% of it, which would have been the case from the trial.
The Chairperson: Yes; I think that the cohort used for InCAS was 10,000.
Mrs Godfrey: One of the key things was to make sure that the standardised information is available. That information is far more useful than the class information, which could be skewed for a number of reasons. Unless you take it from a very small sample, as Ruth said, or from a sample in England or somewhere else that may not be representative, the only way that you will get that standardised information is through our own children taking the assessment.
Perhaps there is a lesson for us in making sure that we communicate with schools in which there are no issues as well as those in which there are. There certainly should not be issues of pupils being penalised; that should not be possible. Nor are there issues of schools not having the kit to deliver it: this was planned on the basis of the kit that we know to be in every school.
The Chairperson: Could members please switch off their mobile phones if they have not already done so? BBC's 'Democracy Live' is having problems. If you want your comments to be reported accurately, please switch off your phones.
Mr Kinahan: As I said before, when this was brought to my notice, I rang 22 different schools. Apart from three or four that had not yet started, all of them were having problems. The letter that the Chair read out outlined virtually the same problems that we have all heard.
The chart tells us, for example, that 848 had uploaded to the information system for NINA and that 847 had uploaded to it for NILA. However, the figures for those schools that completed assessments are only 167 for NINA and 174 for NILA, which is about one fifth. Does that mean that many schools are nearly there?
Mrs Godfrey: Absolutely.
Mr Kinahan: What worries me is that there is a huge cost in getting from 90% to 100%. When will we know the technical, training or other costs? Have you any idea of what they will be?
Mrs Godfrey: The first key thing is that there is no requirement on schools to complete this until the end of the autumn term. It is fair to say that most tend to do it in October. Either side of the half-term break are the two big blocks in which you see vast numbers.
You are absolutely right, Danny. I asked the same question when I saw the figures. The table gives you part of the answer; it shows that more than 400 schools are more than 90% of the way there. It may be that those schools had children absent on a day or have one small class. That explains the variation between 847 and 174 and the 65%. However, some 57,000 pupils have already completed the assessments.
As with anything of this nature, and quite understandably, we tend to hear only about those schools where it has not worked well. We tend not to hear from those that have run through the assessment without any significant issues. That is the nature of things, and we would not expect —
Mr Kinahan: That is why I rang; I wanted to find out. I rang and heard that they were all having the problems. My concern is that hidden in here is a problem that will get bigger and bigger as we try to get to 100%.
Mrs Godfrey: We also know that technical issues are much less likely to occur where schools have correctly completed all the computer-hygiene steps. That is where they have followed all the pre-assessment instructions, such as making sure that their computers are on, that the log-in has been done correctly and that they are using the right web browser, as specified in the instructions. In fact, one of the cause-and-effect factors that we have noticed is that problems do not happen where we put C2k, CCEA or even Northgate or the other providers into schools. That may be related to the fact that having somebody in makes you more careful in carrying out the pre-test checks.
However, there have still been screen freezes in some areas. We absolutely have to get to the bottom of why that is happening. We know — because we have been able to test it — that it is not because of bandwidth or internet connections. Schools thought that the problem may have been caused by overloading the internet connection, when they were actually only at something like 26% capacity.
At the moment, we know much more about what is not the cause. We have experts looking at the small number of cases where the problem is not linked to schools failing to follow the right pre-test procedures, so that we can pinpoint the exact causes. That is how you fix them.
Mr Kinahan: It was designed to take 35 minutes, and some teachers said that it was taking two or three hours.
Mrs Godfrey: That should not happen.
Mr Kinahan: You may have completed 50%; however, if it is taking three hours instead of 35 minutes, we still have a huge problem with the system. Are you sure that we are getting the three hours down to 35 minutes in every case, whether through the machines being right or the teachers being trained properly?
Mrs Godfrey: That is a key issue. In response to the training point, we have simplified and reissued the pre-test instructions in the last few days. It is a bit like the conversion from analogue to digital: we have set out the things that you have to do before you start. We have reissued the instructions, through CCEA, in a much simpler and more user-friendly format; they are literally on either side of an A4 sheet. We think that that was a sensible thing to do to address an element of this. Training and the clarity of the instructions are not the only element, but they are almost certainly related to the problems. We have already addressed that.
Mr Savage: We can also monitor the average time that it takes to do assessments.
Ms R Kennedy: Yes. I reported earlier that they have fed the figures from the literacy assessment through to us. The average time is 36 minutes. They are obviously looking at pupils who take longer. Part of the reason that it is hard to monitor is the pause function. Pupils can walk away from the assessment, leave it and come back an hour, a day or a week later. That is why it may appear longer, and we are looking very closely at that. However, the average time for the literacy assessment is 36 minutes.
Mr Kinahan: We look forward to hearing more. Thank you.
Mrs Dobson: Katrina, computer-based literacy and numeracy tests have been around for more than 10 years. Why could the Department not have used one of those tests rather than trying to reinvent the wheel?
Mrs Godfrey: It is for the reasons that I explained to the Chair. You are right: there is a large number of providers of computer-adaptive tests across the world, not just in the UK. A procurement process was run. Any of those providers could submit a tender. As I explained, a process was gone through for those that did. Others may have decided, for whatever reason, that this is not a market that they want to be in. The critical thing is that providers had an opportunity to bid to be the contracted provider of these tests.
Mrs Dobson: So the Department had to over-complicate it because of the procurement rules.
Mrs Godfrey: It is not an over-complication. The simple fact is that if you are in this business, you had an opportunity to tender. It was a commercial decision for businesses whether they wanted to take that opportunity or not.
Mrs Dobson: Was the adaptation to fit the local context a reason for many of the glitches that occurred in schools?
Mrs Godfrey: One of our areas of learning has always been that you have to make sure that anything that you operate in schools is tailored to our audience. That applies to any form of assessment. A key consideration of the pilot was that it had to look and feel as normal as possible for children in our school system.
Ms R Kennedy: We wanted to ensure that whatever criteria were being assessed were aligned to our curriculum and that questions were aligned to our context. If you are asking whether that was a root cause of the technical issues, I would have to say no. The issues were technical.
Mrs Dobson: Was it a significant job for the software designer to fit it to the local context? Can you describe how you made the test fit the local context? Can you guarantee that that did not cause the glitches?
Ms R Kennedy: There are two providers. It has been subcontracted to a local provider, so one provider was local to begin with. With the other provider, many of the adaptations related to the content, and it is really just a question of what is being assessed. As Katrina pointed out, part of the specification was that the providers were given details of the hardware infrastructure that schools had in place, to make sure that the assessments would run there. I cannot answer for whether the adaptations have caused the glitch. We are trying to find out what is causing the technical glitches, and we are looking at the entire picture. However, they will not have been the result of adaptation.
Mrs Godfrey: That would have been a requirement of the specification. Those tendering would have had to assure CCEA and CPD that the product that they proposed to sell us was fit for purpose. That is a critical dimension of any procurement process.
Mrs Dobson: Was there extensive consultation with C2K and Northgate on the local contexts?
Mrs Godfrey: Absolutely, and another dimension of that was the point that I made earlier: we made sure that those tendering for the business understood the capacity of schools and their ICT — their connectivity, hardware and ICT kit — so that it was absolutely tailored to the ICT in our primary schools right now. Ensuring that those expressing an interest provided assurance that their product could respond to the requirements of the specification was a critical part of the evaluation.
Mrs Dobson: Why did you take the decision to roll out the assessment to all schools, despite the trials showing up issues with crashing computers, displays and sound? You said that schools need to take computer hygiene steps. It seems that the Department is placing quite a lot of blame on the schools.
The Chair touched on this earlier: why did you blatantly ignore the views of the schools?
Mrs Godfrey: As we explained earlier, we absolutely did not. The process was tendered for in line with the specification, which was drawn up with input and involvement from principals and teachers. At the very outset, when we went out to tender, the products that we sought to buy were the subject of consultation. Then there was trialling and piloting to make sure that what we were going to buy was fit for purpose and worked. At that point, there was extensive consultation with schools, as Ruth outlined, on what they liked, what they did not like and what they thought could be improved on. That information was then incorporated in the revised specification for the full roll-out.
It was rolled out fully because it is a requirement on all schools. If we require it of all schools, we have to make it available to all schools. It would have been untenable for us to have a requirement in law without giving schools the tools to comply with it. That is why it was rolled out to all schools. At every point, CCEA sought feedback, which was categorised and responded to in the best way possible in the relationship between the specification, the design of the pilot, the learning from the pilot and the carrying through of that to the go-live product.
Mrs Dobson: Do you recognise that this process has led to considerable stress and disruption for all pupils across Northern Ireland and that it is affecting their educational experience? What is the next step for schools, now that this test has proven to be a total failure?
Mrs Godfrey: First, this has not disrupted pupils' education. The very fact that 56,000 or 57,000 pupils have already completed a programme that they do not have to complete until December is evidence of that. Absolutely, we sought and took assurance, from the trial to the roll-out. It is a real concern that, despite all that, there are still issues such as screen-freezes, and that is not acceptable to us.
As I made clear, the Minister has had a series of meetings, not just with CCEA and the Western Board, which operates C2K, but with the private sector providers to try to make sure that all who have a role to play are working together to identify the root of a problem that has arisen, not in every school but in some. We are determined to get to the bottom of that problem. This, however, is a system designed in a very low-stakes way, so it should be working properly — absolutely. There is no doubt that it should be working smoothly in every school. In schools where it is not working, we are working with the experts to find out why and determine how we will fix it.
As I said, we looked again at the training instructions for the pre-tests and prerequisites to make sure that they are explained as clearly as possible. The real experts are on the ground; they are in schools, daily. They are in schools that have told us that they have problems and in schools that have not told us that they have problems, just to see whether they can put all their expertise into pinpointing the one, two, three — we do not know yet how many — issues at the root of this so that they can be sorted, and sorted quickly.
In the meantime, the Minister has made it clear to schools that, if they do not want to run the risk of the programmes not working, there is no pressure to run them at this point. He has already written to every single primary school and said that they need not complete the assessment until he can tell them that he is satisfied that the issues have been run to ground and fixed. So there are a number of steps and, as the Chair mentioned, the Minister is absolutely determined that this will be resolved, and resolved quickly.
Mr Craig: Katrina, I listened with interest when you stated on several occasions that not every school was having difficulties or problems with implementation. There was an extensive pilot scheme — 10,000 is a large sample of the school population — and issues and difficulties were raised.
I have to be honest and say that I am starting to take exception to what you are saying. I get a feeling that the Department does not realise the scale of the difficulties or is choosing to talk down or bury the problems. Yesterday and this morning, I spoke to my local principals' association, which represents 52 local primary schools. I asked a simple question: is any one of the 52 schools not having difficulties? The simple answer was no: every single one was having problems and difficulties. The Department needs to sit back and think about that. For a start, the scale of the problem is much wider than anybody wants to admit. I want to know whether the Department was provided with accurate information about the difficulties in the pilot scheme that was inherited by CCEA. If the information was accurate, given that there are still widespread difficulties, somebody here needs to ask the obvious question: why was it rolled out?
Mrs Godfrey: There is absolutely no question of the Department ignoring this. As the Minister said, one school experiencing problems is one school too many. I go back to what I said at the start: if we require schools to carry out this assessment, we have an absolute responsibility to make sure that the tools that we give to them are fit for purpose. There is no doubt about that, and the Minister is absolutely clear on that.
We can go only on the evidence that we have, which tells us that, as you rightly said, a significant number of schools have experienced and reported problems; significant numbers have not. Where problems have been reported, we have made sure, through CCEA and C2K, that they were followed up on every occasion. That gives us a pattern of the nature of the problems and allows us to work out which of them may be due to something that has nothing to do with the assessment system. Some might be to do with schools making sure that they have the right login information or something like that. Others, which seem to be the critical ones, relate to, for example, screen freezes, which many schools have reported.
There is no question of our not being absolutely robust. If the programme does not work for a single school, that is not good enough for us — it needs to work. For the pilot, we sought and received absolute assurances that it was appropriate for the Department to specify the new NINA and NILA tools as the statutory assessment tools from September. Had we not received those assurances, or had we spotted anything that would have given us cause for concern, we would not have done that. So it is very clear that, going on the best information available to us, we took the best decision that we could. The fact that there are now issues coming to light is, as I said, a matter of great concern and one that the Minister is determined that we get to the bottom of quickly.
It is not a case of not listening; it is a case of listening to all the feedback from all the schools that contact the Department, CCEA or, in many cases, the C2K helpline. It is a case of categorising in significant detail what they tell us about the nature of the problem and then being able to look at that to identify whether there is a different source for that problem or whether something else is happening. Then, to address those problems that appear to be related to the two assessment tools, we put every expert available to us, including those from private companies, on to the ground to make sure that the exact source of the issue can be identified and sorted. I am happy to give the Committee a commitment that that will continue to happen until all problems are identified and sorted.
You are right that it is a difficult problem because it is intermittent. We can send 20 experts to a school that has told us that it has problems but they do not see any problems, and that presents challenges for us.
As you said, there was a significant trial. Ruth may want to pick up on this, but it is clear that CCEA was very careful to pick up and respond to feedback.
Ms R Kennedy: As I outlined earlier, when picking up on points in the principals' letter, schools involved in the trial were telling us about problems, so we took action to address them — very much so. That included any technical issues raised, on which we worked closely with C2K to monitor the situation to make sure that schools had adequate bandwidth and internet provision to enable the assessments to run smoothly. There was a lot of work on that, which is why we felt, as Katrina said, that we had done everything possible within the timescale to address anything raised in the trials that would have caused concern.
Mr Craig: Katrina, I am sorry, but I listened with interest to everything that you said. I know that you will not like this, but I spent 20 years after university working in IT in the aircraft industry, and some of the things that you said about needing a specific browser to a certain standard, the IT industry would, quite frankly, find laughable. I mean that — your statement this morning was laughable.
Any trial should have picked up the problems, and they should have been rectified. Frankly, it is not a case of whether a school has browser A, B or C. The programme should have been made more flexible so that it could have been used by anybody. Those are all basic, simple IT issues that should have been detected by the pilot scheme before the programme ever hit the schools for mainstream use.
Much more worrying than the technical issues is that principals are complaining to me about the accuracy of what comes out the other end. Some teachers have been in the profession for 30 or 40 years and are good judges of what pupils are capable of. I have never denied that they are very good at their job. However, when they see a pupil who is academically very good take a test that says that he or she is not, I know who I would choose to believe. If we cannot get to the bottom of that fundamental problem with the system, we should not be using it.
Mrs Godfrey: You are absolutely right. We would say exactly the same thing. The key voice to hear in all of this is the overall assessment of the teacher. This is an important piece of information, but it is only one piece of information. Teachers, as professionals, are amazingly good at taking the whole set of information they have about a child and using that to give parents the best possible advice on how they can support the child's learning.
We said exactly the same about InCAS: the critical thing is to set the information from this assessment in a wider context. We have heard most consistently that teachers, and particularly parents, like a consistent form of reporting. They find that useful, and they find that it helps them to have the sort of conversations that might be more difficult without it. So we would not want to lose a means of engagement with parents that we know is very effective. We also know, and it is something that Ruth might pick up from the trials, that one of the key factors was the nature of the reports and how those would be designed to be of most use for schools.
Ms R Kennedy: In the trials, part of the May quality assurance was that both providers engaged with focus groups of teachers to ask them whether the outcomes from the assessments were consistent with their teacher judgement. The message from the May quality assurance was that the majority of teachers — 79% — said that they were. For the remainder, the outcomes were either a bit higher or a bit lower, but there was no significant difference. So the feedback from the trials and the QA is that the assessment is consistent with teacher judgement.
Reporting to parents is vital, and it is a matter of determining how best that can be communicated to parents and what formats of report are most useful in supporting teachers in their conversations with parents. Again, part of the trials involved looking at the objectives report to see how those looked and felt. Obviously, we maintain our focus on the parent reports to see whether there is anything that could make them clearer and more helpful for teachers to use.
Mr Craig: Everything in my experience, 20 years in the IT industry, tells me that the pilot should have been extended and that we should not be in this position.
Miss M McIlveen: I think that you will accept that there is, unfortunately, a cost to the reputation of the Department, CCEA and everyone associated with the problem. I also assume that there is a monetary cost associated with some of the issues raised by the schools. Who has liability for what? It goes back to what Jonathan said about the software, and so on. What elements are the responsibility of the two providers, and what costs will there be to CCEA, the Department and, possibly, schools?
Mrs Godfrey: That is a very good point. It takes me back to the critical point of identifying and pinpointing exactly what causes issues such as screen freezes. There are a number of possible scenarios, and if there is any issue related to the contracts not being delivered to the specification to which the providers signed up, clear procedures will apply, including, if appropriate, clawback, sanctions or penalty clauses. However, we are not at the point of having evidence that this is an issue with software, hardware, training or accessibility. Michelle, should there be anything related to under-delivery against the contract, the Committee can be absolutely assured that that will be followed up relentlessly by CCEA, C2K and, if necessary, the Department. Before getting to that point, we have to be absolutely sure what the issue is. On the basis of the reports, it seems that there could be more than one issue. Some problems may well be to do with the clarity of advice and instructions sent to schools. For others, we are reliant on the experts to try to pinpoint the issues quickly. Should any issues relate to the contracts, that would automatically trigger a series of actions on behalf of CCEA, which is the signatory to the contracts.
Miss M McIlveen: The initial cost for the year was £900,000. Was that to cover some of those issues?
Mrs Godfrey: That was to cover some of the development costs in the first year and making the software fit for purpose. The cost will fall in future years as the programme is rolled out over the period of the contract.
There is also a potential hidden cost, which is the cost of schools' time. If schools are wasting time that they should not have to waste, dealing with screen freezes or something that is not working properly, that will not show up on a budget sheet, but it is a real opportunity cost to schools. That is why it cannot be allowed to continue. We have to get to the root of the problem and get it fixed. It is not tenable for schools to be wasting time; the system should operate in a way that allows children to sit the assessment in the timescales that Ruth mentioned so that the information can be gathered and used for planning purposes. I am worried about the unquantifiable costs related to schools' frustration with the assessments. We cannot lose sight of that, because it is important from a cost perspective, from a teaching and learning perspective and, as you said, from a reputational perspective as well.
Miss M McIlveen: Was there a risk analysis?
Mrs Godfrey: There was a full risk analysis, which was reviewed and updated at every stage in the process. As with any case of this nature, risks were identified and were being managed at every point. Ruth might want to say something about the risk process, but there was a detailed risk register.
Ms R Kennedy: We have risk registers at project, operational and corporate level. This was one of the risks reflected in CCEA's overall corporate risk register. In addition, a gateway review was carried out during the procurement process to identify any possible risks. There has been a very thorough risk overview.
Miss M McIlveen: On a different subject, there were issues for hearing impaired children. I note that you hope to schedule a meeting shortly. Were those groups spoken to in advance of implementation, and were they part of the trial?
Ms R Kennedy: The trial involved children with moderate learning difficulties (MLD) and children with special educational needs. Accessibility to computer-based assessments is a challenge for hearing impaired and visually impaired children. We drew on the experience of InCAS to ensure that we were doing what we could, for example, on font size, background and the images used to make sure that the screens were as clear as possible. We also introduced a pause function in the assessment so that pupils could take a break. The numeracy assessments include a voice-over that children with a visual impairment or reading difficulty can listen to in order to help them with the questions.
We want to make sure that the assessments are as accessible as possible to as broad a range of children as possible, which is why, in the autumn term, we were looking in more depth at issues such as hearing and visual impairment and special educational needs to see what adaptations could be made in the assessments or in the access arrangements to the assessments. Some pupils will have significant access issues, no matter what, so we have a close focus on that to see what can be done as we move forward with the development. To that end, we issued an invitation to heads of service to discuss hearing impairment issues to see what else can be done to address accessibility. We are trying to set up that meeting.
Mr Savage: Accessibility was not an afterthought. The companies had to demonstrate equality of access for pupils who have visual or hearing impairment. That was in the specification. We had built that in at the start of the process and flagged it up as something that providers had to take into account and demonstrate that they could do.
Mr Rogers: You are very welcome. I do not want to repeat everything that has been said. However, I agree with the comments expressed by other members.
Let us go back to the reason for the system. It was about improving teaching and learning, and it was about using data to inform teaching and improve learning. You need to take on board teachers saying that this is a waste of time and that they should go back to planning and teaching.
I also agree with the Chair that this is not just a technical glitch. There are other deep-rooted problems, and I have heard people comment that the questions are inappropriately worded. The practice questions for primary 4 pupils, for example, are the same as those for primary 7 pupils. Also, although it is a computer-based assessment, pupils might have to complete some work on paper before going to the screen.
Katrina, you said that you were sure that the specification matched the schools' hardware, broadband provision, and so on. Broadband can be a problem, and the further into rural areas you go, the worse it gets.
I spoke to a particular principal, whose school has grown greatly over the past number of years. He told me that, even if this system did work, to carry out the assessment, they would have to unplug every laptop in the school and put them all into one room. In other words, for children in other classes, there would be no more whiteboard or anything, not just for 45 minutes but for the two or three hours that the assessment would take. The level of hardware just is not there.
Some of the questions seem more appropriate for testing ICT skills than numeracy. My other point is that, if a child does not have good IT skills, will the assessment fairly reflect his or her numeracy or literacy?
Ruth, you mentioned that the assessment is 79% consistent with teachers' judgement. My point, and what teachers are telling us, is that it tells them nothing more than they know already. So what is the point?
Frankly, I believe that there should be no reporting until the scores are validated and verified in some other way. A larger pilot must happen before this becomes statutory.
Mrs Godfrey: I will pick up on a couple of those points. The figure of 79% was mentioned by Ruth. Obviously, there were teachers who said that it was very consistent with their judgement. However, we have said to teachers all along that, if their response is that the assessment does not tell them anything that they do not already know, that is what we would expect a good teacher to say. This is extra information that validates a teacher's judgement, but in a format that makes engagement with parents in the autumn term much easier. We know that from the feedback.
So if you are saying, "Yes, that tells me that Johnny is exactly where I thought Johnny would be", that is exactly what we would expect a good teacher to say. In the same way, teachers doing any other form of assessment will have a sense of how they expect pupils to perform because of their professional judgement and knowledge. So I would never see that as a problem, Sean. That is what we hope would be the position for the vast majority of pupils. Teachers should be saying, "Yes, that is exactly where I see Johnny, Jane or whoever." If teachers were not finding that or were finding the opposite — the assessment was not telling them exactly what they expected — that would be a far bigger problem. That is one of the reasons why we particularly wanted our colleagues in the inspectorate to look at how good schools used the data from this and other sources to inform their teaching and learning. That is the critical point of the evaluation that we want to take forward in the spring so that we can learn from it and see how it works in practice.
Mr Rogers: I suppose that the real challenge is for you to find schools in which it is working well.
Mr Hazzard: I have a couple of issues. First, when you look at the issues around the trialling, it is quite obvious that serious questions need to be asked around the functionality of the trial. What is your assessment of the functionality of the trial? We have various principals saying that the same problems have come up in the trial as are coming up now. If that is the case, do we have any idea about whether those principals raised those problems during the trial? Did the trial work in that way?
Also, in the tabled items, a group of principals have described the computer-based assessments as unnecessary, meaningless tests and an utter waste of money. That is very worrying to me, for various reasons. Do we have a section of principals who are not buying into this? Is there a responsibility on the Department, as well as on principals, to secure buy-in to the benefits of these tests? Would that ease the progression of this and the understanding of why we need to advance it? I am aware — at least I think I am, reading through this — that the Minister wrote to the schools to say that they could stop if there were any problems at all. Is it the case that schools are persisting, even though they have been told that they can stop until a solution is found?
Mrs Godfrey: You are right. That was very quickly communicated to schools by the Minister in response to feedback. To pick up on Michelle's point, the last thing we want is, where there are issues, schools wasting time that they do not have to waste. That letter went out to schools, probably because of the very point that we were talking about earlier. Principals and teachers are so professional and, in many cases, they will want to get this done because they will see the value of it. In some cases, they may have wished that a different tool was provided because there might be ones that they are more familiar with than others, but that is not within our gift in the procurement process. However, you are right; there is a point about going back to the central purpose of this and looking again at how we communicate it to schools. That is incredibly difficult to do while you are having technical difficulties. That is why the absolute first priority has to be to sort the technical problems and to make sure that when a school gets its children to undertake the assessment, the system is working for that school, and then to look at how valuable it is. This is a new product, and we have to keep it under review and look at not just how valuable it is on the day, but how valuable it is in the longer term in planning the teaching and learning to effect improvements and outcomes for pupils. Therefore, we have to keep all of that to the forefront. However, we cannot credibly engage in conversations about the second and third issues until we absolutely fix the first one. That is the order of priority on this.
Ms R Kennedy: To pick up on one of your other questions, of the 206 schools that contacted CCEA or C2K about technical issues, in the phone-backs and follow-ups, over half — 119 schools — have indicated that they are continuing with the assessments. Looking at the trials, one thing that we have done now is to look back over all our records. During the trials, schools did not contact the CCEA help desk with technical issues in the way that they have been contacting us during the full roll-out. However, part of the trials, particularly the big March trial and the May quality assurance, involved an evaluation process in which we sought feedback from schools. That fed into the recommendations, and I have outlined some of the actions that were taken on those recommendations. That informed the actions that were taken before we moved to full roll-out.
So, where schools reported that the assessment was too long, we worked with providers to cut its length. Where they talked about the drag-and-drop functionality, we worked with providers to improve it. We also looked at where schools reported technical issues, such as the assessments running slowly or freezing. Part of the trial was to work with the hard-to-reach schools — the schools with low bandwidth — to see what the experience was in those schools and to ensure that we were working with C2K and the Department to make sure that those schools were prioritised for any remedial action and given advice on how to take it forward. That fed into the checklist of prerequisites for schools.
So, we tried to pre-empt some of the issues that schools might have. There was a process of listening to feedback from schools in the trials and acting on it before moving to full roll-out.
Mr Lunn: I have the same concerns as everybody else. I am the last to speak, but I will try not to duplicate what everybody else has said. However, I will go back to Michelle's point about the cost of this. It is not unknown — in fact, it is almost normal — for a new computer product or software application not to work when it is introduced to the public service. You only have to go back over the minutes of Public Accounts Committee meetings in the past few years to see plenty of examples of enormous amounts of money having been spent — this is actually a relatively small amount — but the product not being fit for purpose and having to go back to the drawing board. What normally happens is that the computer companies will then run rings round the various Departments and, instead of saving or clawing back money, which is the thrust of what Michelle said, it will cost you more money. I will watch with interest to see how much more money the Department has to expend on this. My opinion, based on previous experience, is that the contract was not properly specified to start with. The computer company, Northgate or whoever, will just say, "We did it exactly according to the spec. If it does not work, we will have to do more work on it. It will cost you several more hundreds of thousands of pounds", which is a bit depressing, but it is a fact.
Beyond that — again, not for the first time — the Department and the people out there at the chalk face appear to be saying completely different things. We have heard it right around the table again today. Danny heard from 22 schools, and Jonathan had 52. I could name several dozen, although, to be fair, they are probably the same ones as Jonathan mentioned. We have a list the length of your arm of schools from north Down and Ballymena, all of which have confirmed that there have been serious problems.
I will take a basic example. You said that this is not particularly disruptive to schools. At the same time, you have said that there is a cost implication to the disruption that has been caused to schools. I think that I heard one of you say that the new system was tailored to existing equipment. However, the schools say that the old computers and laptops, which they are still using, are not up to the job. So, I have a question about the standard of computer equipment currently in schools. Is there some truth in the comment that some of that stuff is so old that it just cannot cope? I heard Jonathan's point about browsers and so on, but it is not my area of expertise.
The whole thing seems completely odd. It has just gone off at half-cock, as things tend to do. I heard questions about the trial not being up to scratch. Jo-Anne has suggested that there should have been a much bigger and more extensive trial. I really do not know. Having said all that, it will probably work out. These things generally work out eventually, but only after a lot of heartache and disruption, with children being demoralised, bored, and uninspired. I am reading this from comments from the primary principals' group. Answer my question about the age of the computer equipment. Are you satisfied that, across the board, that equipment is fit to run the system?
Mrs Godfrey: What I do know is that the IT equipment in primary schools here is possibly better than in most other countries in the world. It is incredibly well-developed. What I can also tell you, Trevor, is that the specification was designed to respond to the kit that is in schools at the moment; not just the computers and laptops but the connectivity. The roll-out is designed to respond to the equipment that schools have at their disposal.
It would be wrong for us to set a requirement and then not give schools the ICT equipment that they need. So, it is the other way round. The requirement reflects the ICT equipment in the schools. That is a critical part of the specification, and it is why C2k has been so closely involved in this from the outset. They are the people who know what is in schools and who made sure that we were not designing something, through CCEA, that did not match the ICT provision in the schools. That absolutely should be the case.
Mr Lunn: Here we go again. Schools are saying one thing, and the Department is saying something else. We do not know who is right, but we can listen to both and make our own assessment; and that is not to question anybody's judgement. One of the comments was that a lot of primary schools do not have colour printers. Adults are spending time colouring in these wee charts, presumably with crayons. What is going on here? To put it bluntly, is the system so bereft of finance that a primary school cannot have a colour printer?
Mrs Godfrey: Those are the issues that would be absolutely designed into the spec — the availability of kit in the schools.
I want to pick up on your point about disruption. We know that there has been disruption in a significant number of schools, and that is simply not tenable for us. We absolutely have to get to the bottom of what is causing the issues in the first place. It is not a case of not listening. As I said before, we hear the reports. We categorise the calls that come into the various help desks. We categorise the issues that are raised. We are using experts on the ground to identify the difficulties. The sooner we do that the better. We cannot ask schools to do things when some part of the system is not working. That is the absolute priority at the moment.
Mr Lunn: I forget who it was — it may have been Sean — who asked if you could produce a school that is satisfied. All that we can produce are schools that are quite clearly not satisfied. I do not know how you resolve that problem. Perhaps we are talking to the wrong schools. There must be a cohort of schools out there that we do not know about. They must be in Brigadoon or somewhere, because we do not know about them.
Mrs Godfrey: Do you want to pick up on the feedback from schools?
Mr Lunn: Yes.
Ms R Kennedy: We have had schools and pupils contact us recently. We have had schools contact us to say that everything has gone fine. Pupils have e-mailed us to say how much they have enjoyed the assessments. There are such schools and pupils out there. Another part of the trial was to get pupil feedback. Pupils were given questionnaires to gauge their experience of the trials, whether they were enjoying the assessments and what their experience as pupils was. So they do exist.
Mr Lunn: I am not being facetious, but I find that the average 12-year-old is far more computer literate than the average 50-year-old. So, there are problems if pupils are having trouble with this. Were those schools and pupils asked to give feedback? They have not voluntarily phoned the Department to say, "We think this is great."
Ms R Kennedy: We have had voluntary feedback as well.
Mrs Dobson: From which constituency? We are very curious.
Ms R Kennedy: We would need to check.
Mr Lunn: Did you get calls from Finland or somewhere?
Ms R Kennedy: As part of the trial, pupils completed a short questionnaire. It asked pupils about their experience of the assessment, and that fed into the trial as well.
Mr Lunn: I will not rant on, Chairman, because I do not really have a question. However, you can probably tell that I am extremely —
The Chairperson: You are doing reasonably well for someone who does not have a question. [Laughter.]
Mr Lunn: I am mighty sceptical and cynical about the whole thing. I have no doubt that you will get it sorted out eventually. It is just a pity that it could not, for once, have been sorted out before it started.
The Chairperson: I want to follow on from the point about cost that Michelle raised and Trevor picked up on. According to the report of the inspector, who will be here next, £470 million was invested in C2k between 2000 and 2011. You have CCEA, which has an element of involvement in this. The Western Education and Library Board looks after the C2k contract. You then have a contract with the two organisations, which we are told is a £900,000 contract. Is it a five-year contract?
Mr Savage: It is three years, with an option for a further two.
The Chairperson: That is a huge amount of money. However, Trevor's comments are very reflective of comments that are made in other places about printers not working, computers freezing and all of that. Have we ever had a procurement process that was successful? Four hundred and seventy million pounds is no small amount of money. We have ended up in a situation in which it seems that all we do in the Committee is concentrate on the real problems that are out there. It is the point that Trevor made. Are we living in some world where all we are getting is the negatives, and we are not picking up any of the positives? I would love to see those positives and know that that £470 million is being properly spent.
At least three organisations are involved in the delivery of all of this, and we still cannot get it right. I am glad that the Department was not responsible for the digital switchover. Nobody would have had any TVs working tonight, or we would be huddled together in one room, looking, as we did in 1953 — well, I did not, because I was not about in 1953 — when there was the Coronation, because it was a new thing to have a TV. I really do despair.
I talked to one teacher in my constituency, and the way she put it to me was that it just adds to her frustration. We all know what it is like to go into our office and then something takes ages to load up, or you are waiting and the wee circle keeps going round and round. You know that you have 101 other things to do, but a principal or a teacher has 1,001 other things to do. It is that frustration. That problem is out there, and I am very concerned that there is an issue around it.
The inspectorate goes on to say that there is concern because the use of ICT to support learning and teaching in around 50% of all post-primary schools inspected was evaluated as being less than good.
Mrs Godfrey: We have invested very significantly over the past 10 to 15 years in ICT in schools. It is absolutely right that schools should have access to modern ICT services. What the inspectorate has identified is not dissimilar, I suspect, to what most of us could identify in relation to our own computers or mobile phones, to use a different analogy. We do not make enough use of the full range of things that the technology can do for us. That is a critical issue that has come out of the chief inspector's report. There is investment which, we know, aids teaching and learning; allows for innovative use of ICT to inspire pupils; and allows for ICT to be used in ways that are particularly effective at encouraging and raising the expectations of pupils who face barriers to learning. However, we are not yet harnessing the full potential of the ICT, particularly in post-primary schools.
In primary schools, you will see wonderful examples. To go back to the levels of progression, the ICT levels of progression will come in a year after literacy and numeracy, but we know that primary schools, in their droves, already take part voluntarily in assessment and accreditation in ICT for their pupils, and that pupils leave primary schools with very high levels of competence in ICT. We have not yet managed to get to a position where we maximise the full potential of ICT that is available and at the disposal of the teacher and the school, and the inspectorate report is clear on that. That remains a real challenge for us. However, the solution to that is certainly not to invest less. Our children are going out into a digital age; they are going out into a world where employers identify ICT capability as the third essential skill, as we were talking about earlier. One thing we absolutely need to ensure is that when pupils leave school, they leave with the technological skills, as well as the other skills that we need. In the longer term, the Committee may well decide to come back to that point of how we are getting the full potential from the ICT investment that has been made. It is a critical priority for us as well.
The Chairperson: The criticism of the Department in relation to all of that is that, having known that in terms of the Education and Training Inspectorate, and having known the problems that there were with InCAS and the trials, the problem was not addressed. That is the frustration. What worries me, Carl, is that if this is a three-year contract, with an option to go for four and five, teachers will be worrying now and saying, "Surely, they are not going to go for another procurement process." Somebody from Azerbaijan could win the contract the next time, and I do not mean any disrespect to anybody in that country. We have thrown all of this money at it, and here we go again; another cycle, another few hundred thousand pounds thrown at it, and teachers have to pick up the pieces.
We will want to come back to this after Halloween. This is an issue that the Committee wants to be absolutely sure of. We will endeavour to find schools that have been successful in this as well. That might be useful.
Mrs Godfrey: We would be happy to assist in that.
The Chairperson: I think you get a sense from the Committee this morning, Katrina, that we want to be helpful, but we want to see progress.
We have noted a number of things. Obviously, we want to return to this after the Halloween recess. We would like to see the risk analyses. Reference was also made to the gateway assessment that was carried out. This may be something that we will take up with the inspectorate: how the inspectorate, along with you, is going to analyse this or give us an analysis of it. We would appreciate an update following the discussions with the National Deaf Children's Society, because it expressed very serious concerns. It had written to the Department, but there had not been any response. Alan Sheeran said that he had raised the issue with CCEA some weeks ago and that he was promised a meeting to discuss the matter, but no date had been set. I understand that assessments are to continue. However, I gather from what you have said this morning that a date has now been set.
Ms R Kennedy: We have invited people; we are still trying to set the date, according to availability.
Mrs Godfrey: The issue there is that there are a number of organisations, and it is appropriate for CCEA to speak to all of them — and more sensible to speak to them together. So that will be a factor.
The Chairperson: Okay, Katrina, Carl, David, Ruth. David, you got off very lightly. We will have to bring you back again.
Mr Hughes: I am looking forward to it.
The Chairperson: Thank you.