Official Report (Hansard)
Date: 21 January 2009
COMMITTEE FOR FINANCE AND PERSONNEL
Monitoring of Post-Project Evaluations
Members present for all or part of the proceedings:
Mr Mitchel McLaughlin (Chairperson)
Mr Simon Hamilton (Deputy Chairperson)
Dr Stephen Farry
Ms Jennifer McCann
Mr Adrian McQuillan
Mr Declan O’Loan
Mr Ian Paisley Jnr
Ms Dawn Purvis
Mr Peter Weir
Witness: Mr David Thomson (Department of Finance and Personnel)
The Chairperson (Mr McLaughlin):
The Committee now moves to the monitoring of post-project evaluations. I welcome David Thomson, who is the Treasury Officer of Accounts in the Department of Finance and Personnel’s central finance group. I invite David to make some opening remarks. I will then open up the session for discussion. I remind everyone that this session is being recorded by Hansard. All mobile phones must be entirely switched off.
Mr David Thomson (Department of Finance and Personnel):
Thank you, Chairperson. I am very glad to be here. My normal involvement with the Assembly is through the Public Accounts Committee, which I attend on a regular basis. This is my first visit to this Committee.
I am not responsible for DFP Supply, and it was in the context of a discussion about the roles of DFP Supply that the post-project evaluation (PPE) issue arose. However, wearing my Treasury Officer of Accounts hat, I am responsible for guidance, advice and financial management. The issue of post-project evaluation is something that I have considered. It is something with which the Public Accounts Committee has been heavily involved since its first report, which was on the subject of the Belfast to Bangor railway line, through more recent reports on issues such as the rating IT system and Civil Service reform projects.
PPEs are important and, as with appraisals, they should be carried out. I do not want to sound flippant; PPE is what it says on the tin — a post-project evaluation. It is about projects, and we have been sucked into doing PPEs on things that are not projects.
A Northern Ireland Audit Office (NIAO) report presented during direct rule recommended that DFP should oversee large capital projects. The Department agreed to monitor such projects; that is how the DFP monitoring of PPEs evolved. My colleagues in DFP Supply helped by using the division's database of business cases brought to the Department as a tracking mechanism for PPEs. More than just projects come to the Department, and I think that that is how some of those scary numbers have arisen. I am concerned that DFP has lost some of its focus.
I will provide a flavour of some recent developments. Our project-management arrangements are more sophisticated than they were a decade ago. There are now senior responsible owners (SROs), who are obliged to complete lessons-learned reports. More specifically, the gateway review process, derived from the Office of Government Commerce, has been introduced. Before projects start, they must all be assessed against a risk matrix. If a project scores more than a certain number on that risk matrix, it must undergo the gateway review process, which is itself the subject of a current Northern Ireland Audit Office report.
In recent years, we have also revised the core guidance and issued a document entitled ‘Managing Public Money’. That document contains the accounting officer memorandum — the appointment letter for accounting officers — and places a requirement on accounting officers to complete lessons-learned reports. Those reports are now being done, but my colleagues in DFP Supply are not ticking off those projects to indicate that a PPE has been completed.
I would like to make a personal point. I am the SRO for Account NI, which is a major reform project that established a new accounting system and a new financial shared-service centre for the Northern Ireland Civil Service. To date, I have completed three lessons-learned reports. However, the DFP database does not contain any document with “PPE” on it; instead there is a cross at the relevant box. Therefore, Account NI is being flagged up as not having conducted a post-project evaluation when, in fact, I have done three lessons-learned reports. That is an internal issue that must be sorted out.
The future role of DFP Supply must be to ensure that major projects are subject to risk assessment and the gateway review process. We must concentrate on large projects and on ensuring that lessons-learned reports are produced and disseminated. Thereafter, we should conduct test drillings. Departments engaged in smaller projects and smaller business cases should be asked to provide a sample rather than get sucked into the big issue of PPEs. I am happy to take questions.
That presentation was useful. You said that the Department should concentrate on test cases. Given the volume of cases that is highlighted in the appendix, has the approach been too scattergun? Many cases seem to have passed their target date, perhaps for understandable reasons. For practical purposes, is the volume of cases reaching overload?
That is my view. We record every business case, and a huge number of them come to DFP for approval. However, in order to get value for money out of an evaluation process, it is not possible to carry out a significant PPE on every project.
It is not an effective use of time to try to do it for every one.
I note the volume of those that you have evaluated. You appear to prioritise the important projects. Many of those on the list have run a long time after the target date. Perhaps that is because they are smaller or less significant projects. Does pressure of work dictate that you concentrate on the bigger projects?
I am fairly satisfied — though not confident — that most of the big projects are now going through the recognised gateway processes and all the stages of project management. That is not to say that things will not go wrong with them: that is why I hesitate slightly. Many of the other projects for which PPEs have not been done are small projects. A project that involves only small expenditure, and which has, for some reason, come to DFP for approval, need not require a major evaluation.
I take the point about not being sucked in on every case. However, what about the recommendation that the Departments should conduct ad hoc test drilling of economic appraisals and PPEs? According to our briefing, Departments should report on those findings and submit reports to DFP Supply on an annual basis.
How many Departments conduct test drilling of economic appraisals and PPEs? Are targets set for that and, if so, are Departments meeting those targets? I know that you are reviewing the guidance on economic appraisals. If Departments are not doing any of this, will you be recommending in your guidance that they do?
It is something that we should consider. There are no specific targets set, and Departments are not monitored on it, so I cannot answer your question as to whether they are conducting ad hoc test drilling. It is done for big projects, where we know that the gateway reviews are being carried out. As for the rest, I do not know. There would be merit in placing some requirement on Departments to monitor economic appraisals.
Ms J McCann:
I am sorry that I missed your presentation; however, I have been looking over the papers we have on this.
We talked about project management at the last briefing. It seems to me that project evaluation should be built into every project: that is how one can tell whether a project has been effective. For many of the projects listed, evaluations have not been completed. Where does that fit in with ensuring that project management is being applied as it should? Projects should be evaluated in order to determine whether continuing them is in the public interest. Evaluation is an essential component in any project. I wonder why there are so many evaluations outstanding.
I totally agree. Perhaps you missed what I said at the start. Project evaluation is important. However, the question is: what is a project? Many of those on the list are not really projects; they are expenditure proposals that had to come to DFP for approval. The Department recorded them on the database, and that information has been used to follow up on post-project evaluations. What I said at the start is absolutely right: projects should be evaluated. That is in all the guidance that we have. We are tracking the big projects using the gateway process, and we know the exact progress of every such project.
There is a wider issue. I am not sure that the lessons learned are being disseminated correctly. Certainly, project leaders and directors will know what the lessons are. However, whether individuals are sharing lessons learned properly with their colleagues is an issue to be addressed.
Ms J McCann:
Is that issue being addressed?
Yes; before you arrived, I explained that my prime responsibility is with the Public Accounts Committee, and it is to ensure that all the lessons emanating from that Committee process are cascaded down to staff. I have several processes for doing that, and I am reviewing how effective they are, and whether we need to do something else to pick up on other lessons learned. Lessons are being learned beyond the Public Accounts Committee remit, and we need to make sure that those are cascaded down too.
In that explanation, David, there is quite clearly a serious difficulty in evaluating management performance. The Committee has recently held discussions on the payment of bonuses. You are telling us that, in some instances, it may be questionable whether a project should be listed as being in default of the post-project evaluation process because, for example, the scale of the project does not justify a PPE. However, that project still appears on the list as an outstanding issue.
It would be in everyone’s interest, including ours when dealing with issues such as the performance of senior staff and those in receipt of bonuses, to see whether this is a responsibility in their job description that is not being carried out.
For major projects, anybody who is appointed as a senior responsible owner — which I am for Account NI — has a specific responsibility placed on them to make sure that they complete lessons-learned reports, and, eventually, a post-project evaluation.
A lot of the items on the list are not projects; rather, they are concerned with the use of consultants. Following on from the Public Accounts Committee’s review of the use of consultants, we have carried out a major piece of work on that, and new guidance is about to be issued; I have a draft of that guidance on my desk.
Consultancy assignments are one example of where I do not feel that a full PPE is necessary in every case. If, by putting a lot together, we have learned the key lessons, we can disseminate those key lessons without looking at every single case in detail. On the other hand, an assessment must be made as to whether the consultant did what we asked him to do. It is about getting a balance between what is worthwhile to look at and what is not.
Do you get a lessons-learned document irrespective of whether the project was big enough to justify a full PPE?
Yes; the accounting officer memorandum includes a requirement to learn lessons; therefore, an accounting officer has a duty placed on him to do so.
It has been an interesting and valuable discussion. I would have approached the issue of post-project evaluations believing that it is something that must happen in every case, and that when it does not happen, some sort of failure has occurred. The points that you have made, David, have shed new light on the matter for me and have changed my perspective on it.
I notice from the briefing paper that you are reviewing the process, and you elaborated a little on what that might involve. It will be interesting to see the conclusions. Given the current situation, in which post-project evaluations are required in all cases, I want to ask you about the possible consequences of those evaluations not being carried out.
I understand that DFP will not approve future projects unless a PPE has been carried out on similar projects before. Are you concerned that that may hold up projects? Obviously, everybody is quite keen to get projects moving forward as quickly as possible, especially the big capital projects. Is there any evidence that those projects are being slowed up in any way?
I am not aware of any project that was held up by DFP Supply colleagues because PPEs have not been conducted in the past. However, I am aware that colleagues have given conditional approvals — provided that something was checked or done.
As you said earlier, a post-project evaluation is exactly that, and lessons are meant to be learned from them. Is there any evidence that failures to carry out PPEs have had negative impacts on further projects? If IT systems are taken as an example, there barely seems to be a single example of a new IT system being rolled out — in any Department or agency — that does not contain some significant failing. Is there any evidence of lessons not being learned or issues not being picked up because of the absence of a PPE?
The problem is that a lot of lessons-learned reports and PPEs highlight items that should have been done but were not done. Usually, whatever was not done is just good common sense and practice, and it is likely that there is guidance that already states that the procedure in question should have been carried out. For instance, the recent rates system experienced difficulties because the system went live before testing was finished.
We will tell people to stick to guidance because it instructs them to complete testing before programmes go live. However — as sure as eggs are eggs — another IT system will go live without testing being carried out, so we have to continually reinforce the key messages to people. The trouble is that when one drives home key messages something else can get overlooked. This is the circle in which we find ourselves sometimes.
I am interested in the idea of a two-track approach. It is something to consider. However, if a more informal approach is taken to some projects, it is important that it does not entrench a failure to learn lessons in those cases. Formalising the lessons-learned element in those types of case could be a useful way to move forward.
I am not a member of the Public Accounts Committee, so this is somewhat new to me. Would you clarify your role? Is your role purely at the point of project completion, looking back at what was done, or do you receive information about what will be coming through the system so that you can join up your own programme of work on post-project evaluations? Perhaps risk assessments could be built into projects as they are formulated, which could highlight to you the projects that need to be red-lined for particular focus and prioritisation afterwards.
I do not get involved in specific projects. Apart from facilitating the public accounts and audit processes, being the main interface with the Audit Office, and responding to reports from the PAC when those are published, a key part of my role is to provide guidance. I write Dear Accounting Officer (DAO) letters, which effectively become the law for accounting officers.
It is part of my job to make sure that I give guidance to Departments and to accounting officers. I give advice and guidance whether it comes directly from the Public Accounts Committee and the Audit Office, or whether it is just something that popped up that we have identified. When guidance is issued — or if guidance on post-project evaluations is revised — I am the person who will issue that guidance. I do not get involved in specific projects.
Having said that, colleagues in DFP Supply — because we work in the same building — often ask me what they should do about a particular matter. I am able to point them in a direction, or tell them that a project board should be in place, etc.
I do not know what projects are coming along unless someone tells me. I do not know what projects my DFP Supply colleagues are considering at this point in time.
Is it the case, then, that you do not have the ability to rank the projects that are coming through the system as regards risk, and prioritise those with the highest risk for post-project evaluation?
All I can do is keep re-emphasising to my colleagues to ensure that they have carried out a risk assessment. If the target for risk, which I believe is 30, is exceeded, then the project should be subject to the gateway process. In giving DFP approval for such a project, we should be saying that it can proceed provided that it goes through the gateway process.
Regarding post-project evaluations, I presume that some issues affecting delivery will be internal, that is, where the people delivering the projects will have full control over the outcomes. By the same token, there may be external imponderables that could easily knock a project off course. Projects that are premised on things such as land sales, which we discussed previously, are an example of that. How do you separate those factors when you are assessing projects?
We have a rigorous project management process to deal with that. My position is that the guidance exists and should be followed. For Account NI, we have a fortnightly meeting — there was one yesterday — to assess all the risks, including external and internal risks. As long as people are following the guidance they should be able to cope. Risks could involve market changes or whatever.
The information that you have provided for the Committee indicates that there is a compliance issue in relation to the current requirement for post-project evaluation. You have indicated that that is being reviewed, and I readily recognise that there might be value, in terms of cost and time, in considering a new approach. Is there a danger that the compliance issue will remain unresolved, and even deepen, if a less rigorous sampling process is used?
We must not throw the baby out with the bathwater. I said at the outset that evaluation is an important part of the appraisal process. We must make sure that that evaluation continues, but it must be proportionate. We will concentrate on the big issues and have a more manageable and routine process for the smaller ones.
How do we distinguish between major projects, big projects and small projects? What do those terms mean?
If a project or proposal is above a delegated limit, it will come to DFP Supply for approval. DFP Supply may spend a day considering it, give its approval, and the process would move forward.
However, for definition purposes, it might be useful if the table of projects that members have could differentiate between projects and smaller spending proposals on routine matters.
I agree that that would help, and it would save having to bulk out a summarised report. As regards compliance, I would like to know what the implications might be.
I do not know whether it would be possible to do so in the database, but maybe the data could be put into the categories of projects and use of consultants. I suspect that a large chunk of the projects listed are consultancy assignments and other spending proposals. It would give me a huge amount of confidence if there was a tick beside the projects that are subject to the gateway process.
Even if a PPE has not been completed, gateway five requires that it should be done. Therefore, I would be relaxed as long as projects are going through a process. We may need to look at how we record items on the database.
Thank you very much.