
PUBLIC ACCOUNTS COMMITTEE

OFFICIAL REPORT
(Hansard)

Public Service Agreements — Measuring Performance

8 October 2009

Members present for all or part of the proceedings:
Mr Paul Maskey (Chairperson)
Mr Roy Beggs (Deputy Chairperson)
Mr Jonathan Craig
Mr John Dallat
Mr Jeffrey Donaldson
Mr Trevor Lunn
Mr Patsy McGlone
Mr Mitchel McLaughlin
Ms Dawn Purvis
Mr Jim Shannon
Witnesses:
Mr Gerry Lavery ) Department of Agriculture and Rural Development
Mr John McMillen ) Office of the First Minister and deputy First Minister
Mr Damian Price ) Office of the First Minister and deputy First Minister
Mr David Sterling ) Department of Enterprise, Trade and Investment
Mr Richard Pengelly ) Department of Finance and Personnel
Also in Attendance:
Mr Kieran Donnelly ) Comptroller and Auditor General
Ms Fiona Hamill ) Deputy Treasury Officer of Accounts
 
The Chairperson (Mr P Maskey):

Today, the Committee will address the matters raised in the Audit Office report ‘Public Service Agreements — Measuring Performance’. Mr John McMillen, accounting officer for the Office of the First Minister and deputy First Minister (OFMDFM), is here. You are very welcome.

Mr John McMillen (Office of the First Minister and deputy First Minister):

I am joined by Damian Price, head of the economic policy unit, who oversees the operation of Programme for Government monitoring, and Richard Pengelly, public spending director and head of the performance and efficiency delivery unit (PEDU) in the Department of Finance and Personnel (DFP). His team helps OFMDFM in looking at departmental performance. I am also joined by Gerry Lavery, senior finance director of the Department of Agriculture and Rural Development, and by David Sterling, permanent secretary at the Department of Enterprise, Trade and Investment.

The Chairperson:

I believe that this is your first time before the Public Accounts Committee. You have done well to escape us so far.

Mr McMillen:

Please be gentle with us.

The Chairperson:

We certainly will.

Paragraph 1.1 of the report outlines the important role of public service agreements (PSAs) in reporting performance, improving service delivery and promoting accountability. In retrospect, is your Department’s oversight role in that process sufficient?

Mr McMillen:

Before I start to answer, I wish to put it on record that the Department welcomes the Northern Ireland Audit Office report. It is, and will remain, an excellent source of advice, best practice and guidance on this subject.

The report highlighted a number of weaknesses evident in the data and the procedures used to monitor the reporting of the 13 PSAs in 2006-07. As the Committee is aware, however, we have put in place revised procedures, protocols and delivery mechanisms to monitor and report the current Programme for Government. Although the design of those new arrangements preceded the publication of the Audit Office report, we were able to draw on the emerging findings of the audit team and the work carried out by the National Audit Office, and also on Treasury guidance. We have since issued guidance to Departments requiring them to produce delivery agreements. Those agreements aim to strengthen accountability and confidence in delivery and, where PSA outcomes cut across departmental boundaries, clearly explain the contributions of each Department.

Importantly, the delivery agreements also require the production of a measurement annexe for each target, and it is that annexe which deals with any data issues. I am not saying that the procedure is now perfect. Indeed, the delivery report presented to the Executive and the Assembly in June 2009 still draws attention to some failings in the data supporting delivery of PSA targets. However, the fact that we are now aware of those shortcomings and reporting them — and, indeed, actively challenging them — demonstrates that we have taken on board the issues that are highlighted in the Comptroller and Auditor General’s report.

Returning to your question, Chairman, the findings of the Audit Office report were certainly disappointing. However, it was looking at the previous Programme for Government, and at particularly complex data-set issues, which, in the main, relate to a minority of the PSA targets overall. Indeed, complex data sets relate to only about 25% of the PSA delivery agreements for OFMDFM.

Therefore, although the report is disappointing, I do not think the conclusion could be drawn that it reflected poor performance across all Departments. Indeed, the Audit Office said that the evidence does not support such a conclusion.

The Chairperson:

Do you regard the Department’s oversight role as having been sufficient?

Mr McMillen:

The previous technical notes were not as robust as what is now required in the delivery agreements and the guidance that followed. In the early part of the decade, we were still learning how to operate the system. It was evolving, as it did in Whitehall and other jurisdictions. We were learning, and we applied what appeared to be best practice at the time. Perhaps we did not apply it as rigorously as we should have. We have learned from that, and we now apply it rigorously.

The Chairperson:

Have lessons been learned?

Mr McMillen:

Yes.

The Chairperson:

Paragraph 1.12 of the report states that a “radically different approach” to PSAs has been adopted and that this provides an assurance that the new data systems are “fit for purpose”. Can you outline how that has been achieved?

Mr McMillen:

We used a system that was based on a different approach to the PSA targets. Previously, targets tended to be departmental. We have now tried to embed the PSA targets in the strategic priorities for the Executive. They are much more cross-cutting.

The guidance requires public service delivery agreements for each of the PSAs, and for there to be a responsible Minister and senior responsible owner within the Departments. The agreements require the establishment of PSA delivery boards for each PSA, which are made up of senior officials from across the various Departments that contribute to that delivery. There is also guidance for the Department on how performance is to be measured, including a measurement annexe that addresses data-set issues of the kind raised in the Audit Office report. There is now a much more robust system of guidance for Departments to follow.

The key point is the challenge function that is applied at the centre by a team comprising officials from the economic policy unit in OFMDFM, PEDU and the Supply division in DFP. That team monitors performance on a quarterly basis and challenges Departments on how they are performing. That is ratcheting up the delivery mechanisms and making Departments more accountable.

The Chairperson:

What is the time frame? Paragraph 1.12 refers to targets that were set in 2006-07. There was devolution in May 2007.

Mr McMillen:

The main guidance which improved the situation was issued by DFP early in 2007 ahead of the Programme for Government development. The research was being done in 2006-07. That guidance has followed through into the formation of the Programme for Government. The monitoring system that was set up was also put forward at that stage, although it was not approved by the Executive until March 2009. However, Ministers, through DFP and the monitoring rounds, have been monitoring performance against the new guidance and the PSAs on a quarterly basis as part of the monitoring of the Budget.

The Chairperson:

Paragraph 1.11 of the report refers to a “good practice checklist” developed by the Audit Office. If the Audit Office revisits that area and tests a sample of the new PSA data systems using the checklist, will the resulting report tell us a story of significant improvement?

Mr McMillen:

All Departments welcome the checklist. It is very useful to be able to assess how we are doing against a checklist. If the Audit Office looked at how the new system matches up against that checklist, we would fare much better. However, I think that the real value in the checklist will be in the next Programme for Government, because it gives very good advice on defining and setting targets and on how those targets are going to be measured. That will form part of the evolution of how we are improving performance management across the Government.

The Chairperson:

Are you saying that if the Audit Office revisits that, no improvements will be seen until the next Programme for Government?

Mr McMillen:

No, I think that the current Programme for Government monitoring processes are showing an improvement in how we are reporting. Evidence of that is contained in the view of the Confederation of British Industry (CBI), which, in response to the delivery report, observed that there was a big improvement on what had gone before. The CBI saw it as being transparent and open, and it welcomed that. The Assembly’s Research and Library Services has also reviewed the report; it, too, welcomed it and said that it showed a big improvement. Therefore, the current performance system is better than what was in place previously. There are still problems with it, but we have reported those to the Executive, and Departments are dealing with them in each monitoring round.

The Chairperson:

The scale of some of the weaknesses in the data system led to some real concerns about the reliability and accuracy of the claims that were made about performance. The Committee would like your assurances that such data systems were not relied on when senior managers’ performance bonuses were being determined.

Mr McMillen:

Senior managers’ bonuses are a matter for the permanent secretaries. I am sure that many factors play into those; however, I cannot assure the Committee one way or the other as to how performance on the delivery of Programme for Government targets feeds into such determinations. Data sets and their management would be a small part of the process, but I have no information on how that feeds into the determination of the bonuses.

The Chairperson:

Would it be possible for you to go back to the Department and ask whether the Committee could get a look at some of that information?

Mr McMillen:

I shall certainly see what I can do for you; however, there may be some difficulty in how it relates.

The Audit Office report noted that, given that the data sets themselves are not being managed, it is difficult to conclude that performance is being achieved. However, the report did not necessarily say that it is not being achieved. Therefore, that is a difficult issue. However, I shall take your point on board and see what I can find out for you.

The Chairperson:

The Committee may end up drafting some written questions to the Department on that matter.

Mr Shannon:

You and I meet regularly in Committees, so we are not strangers. My membership of the Committee for the Office of the First Minister and deputy First Minister means that I am aware of your role in OFMDFM.

It is clear that OFMDFM has a central co-ordination role, and I know that you and Assembly Members understand that function. However, it seems that you have not provided guidance on designing or operating PSA systems. In the absence of such guidance, how do you intend to ensure the quality of the data that is produced? That is what the issue is about; you must be able to report to the Assembly on those data so that it can see the targets and the performance.

Mr McMillen:

DFP issued the guidance in March 2007, I believe. It was put together by a joint team from OFMDFM and DFP. It was issued by DFP ahead of the agreement on monitoring frameworks. The joint guidance is out, and it tells Departments how they should set up delivery agreements, how they should put their measurement annexes together, and how they should demonstrate that.

Mr Shannon:

It seems that the data is not there. I understand that the purpose of the system is to make the Assembly, the media and other Departments aware of the data that is produced. If the data is not there and has not been there, is it not difficult to ascertain performance?

Mr McMillen:

We need to make a distinction between the position in 2006-07, with which the Audit Office report deals, and what is extant today. The guidance was issued subsequent to the dates that are covered in the report, and the Programme for Government reporting system now monitors against that new guidance. The central team is made up of staff from the economic policy unit, PEDU and DFP supply. That team issues challenges against that guidance and takes data issues into account. I am on PSA delivery boards and, from my experience, I know that they work on data sets and data issues. Therefore, a mechanism is now in place to challenge the data issues.

Indeed, some targets were given red or amber status in the delivery report because of data issues, where the central team said that standards for data were not being reached, so it pushed the targets back to Departments for them to address for future monitoring rounds.

Mr Shannon:

Are you telling us that the new system will overcome past difficulties?

Mr McMillen:

I believe that those difficulties are being overcome and that that will continue; the situation will improve incrementally.

Mr Shannon:

Will the same monitoring system be in place for reporting to the Assembly and all the Committees so that they will be aware of what is happening through the new system?

Mr McMillen:

The First Minister and deputy First Minister will take the report to the Executive and the Assembly. I believe that a take-note debate on the first report, which covers 2008-09, is scheduled for some time in this session.

The Committees have also been receiving departmental responses. As you are aware, I was up in front of the OFMDFM Committee to talk about the performance of my Department.

Mr Shannon:

Paragraph 2.9 of the report sets out the key components of technical notes. It states that they should:

“set baselines; provide definitions of key terms; set out clearly how success will be assessed; describe the data sources that will be used; and outline any known and unavoidable significant weaknesses or limitations in the data system.”

Those are the criteria. However, paragraph 2.10 refers to several deficiencies that were found when the Audit Office examined the technical notes. Deficiencies were found in technical notes provided by Departments in which at least one of those criteria was not met. It seems very unusual, if not wrong, to have guidelines and then ignore them.

Mr McMillen:

I accept the finding in the Audit Office’s report that the criteria were not being applied in 2006-07. The previous system lacked a challenge function at its centre to push the Departments back and examine what was coming through. We now have the central team, which carries out that function. We have published measurement annexes for delivery agreements, which set out the criteria. The guidance requires Departments to include in their measurement annexes criteria very similar to those set out in paragraph 2.9. That is now being assessed by the central team. If something is not satisfactory, it will be reported back to Departments. In some instances, red or amber status is given to a target because it cannot be evidenced.

Mr Shannon:

Paragraph 2.10 also states that:

“a number of the Technical Notes were factually inaccurate.”

Will you assure the Committee that that will not happen again, and give us an explanation as to why it happened?

Mr McMillen:

I do not have an explanation for individual Departments, but I assure the Committee that we now look at the accuracy of technical notes as part of our central monitoring. We also question baselines; if there are errors, we immediately go back to the Department to ask why.

Mr Shannon:

The point that we are making is that it cannot be that difficult to have technical notes in place. We look forward to a vast improvement.

Paragraph 3.22 of the report refers to a delay — and again, no delivery report. The Public Accounts Committee has a role as the policeman of the Assembly and its Committees. We are concerned about the delay in publishing the 2006-07 report; performance for a large number of targets went unreported as a result. From the Programme for Government and Budget website, I see that the last old-style performance report was for 2005-06. Therefore, there is a gap between the old-style and the new-style reports for the year 2007-08. How can the Assembly assess the performances of Departments in 2007-08 if we do not have a report?

Mr McMillen:

I accept that there were delays in getting the 2006-07 report published, which, as reported, were due to resource constraints. The Department is committed to producing a report on the Programme for Government as per the monitoring framework. As Members will be aware, we produced the first report, for 2008-09, in June. We are gathering and recording information for the next report, for general information within Departments, and we will produce further reports every six months. We are committed to doing that.

Mr Damian Price (Office of the First Minister and deputy First Minister):

I can reassure members that, since the Executive approved the framework for monitoring the Programme for Government on 5 March, we have produced three reports on the performance of each Department. We produced an indicative report at the end of December, which was a dry run to ensure that we had the correct systems in place to properly monitor what was happening. We also produced a delivery report as at the end of March, which members will have seen, and which has been placed in the Assembly. We have also commissioned a report as at the end of June 2009, and that is a good way along.

Members can be assured that there is now a process in place. Even the delivery report at the end of March is not a full stop; it is a punctuation mark along the whole journey of reporting back on the Programme for Government. There will be regular update reports and regular challenges. The benefit of the new system is that it has changed the culture and the approach. The delivery of the Programme for Government is being driven, monitored and interrogated by the centre, rather than just by individual Departments.

Mr Shannon:

I understand the explanation that you have given, but are you saying that there was no formal reporting of PSA performance for either 2006-07 or 2007-08? Has that been lost, or acted on, or are you saying that it is just a mistake from the past, but that you have now moved on and will get it right in the future? Is that what you are saying?

Mr Price:

My understanding is that, for the end of each of those years, performance was reported in the accounts of individual Departments and through their departmental boards. There was no composite report, as had been produced earlier. At that stage, in 2006-07, we were on the cusp of a new direction with a new Administration and the design of new systems. Most energy and effort went into getting the new system right and getting something nailed down to monitor the current Programme for Government. That is where the main energy and attention now centres.

Mr Shannon:

Are you simply putting your hands up and saying that you got it wrong for those two years, but that you will get it right now? Is that it?

Mr McMillen:

I would point out that that was prior to devolution. We have now produced a report for the first year of the new Programme for Government for 2008-2011, and intend to continue to produce reports for Committees every six months or eight months.

Mr Shannon:

In relation to paragraph 3.21 — I think Dusty Bin was on ‘3-2-1’ a long time ago. That is probably showing my age. Paragraph 3.21 of the report, which deals with the introduction of PSAs, notes that, in order to comply with good practice, performance reports should:

“Include latest outturn figures, compare performance against baselines and provide historical trend data”.

I do not see any of those three things in OFMDFM’s new-style delivery report, issued in June 2009, which the Committee has seen. Why does the new approach not follow basic best practice?

The Chairperson:

I ask members and witnesses to speak up a bit, as the sound system is of very poor quality.

Mr McMillen:

The delivery report to the Executive, and thus the Assembly, is not intended to be a detailed analysis of each individual target, but is rather designed to inform the Executive and the Assembly of the direction of travel, showing where we are doing well and highlighting areas that need further examination. It does not go into detail.

Mr Price:

The OFMDFM Committee also requested the detail that you are looking for. One member said that people do not feel the greens that are being highlighted in the reports. The delivery report is essentially a document to give the Executive a high-level overview of what is going right and what is going wrong, but it is built on a lower level of detail, which is contained in the delivery agreements. Those cover things like what the baseline is for each individual target, what progress has been made, what percentage of completion has been achieved, what milestones have been reached and what has been missed. The central delivery team reviews those reports and challenges Departments.

The red/amber/green (RAG) status is determined on the basis of those reports. When a report makes a claim that something has green status, we interrogate it further to look for the evidence that it should have that status. At that point we get down to the lower level of detail. If Departments are not at the point that they said they would be at, we will mark the rating down. That information is available not in the delivery report, but at the next level down, in the delivery agreements and the information that Departments submit to substantiate their RAG status.

Mr Shannon:

The key issue for Committee members is how we can access figures if the three best practice outlines that we have in front of us have not been followed. If the figures are not made accessible through the best practice system, the Assembly is almost being misled about how that is done. I am sure that that is not your intention; perhaps you need to reassure us on that point.

Mr McMillen:

I can reassure you that there is no desire to mislead anyone. It was a balance between overloading the reader with information and having a composite report that is accessible, easily read and quickly digested. It may be that we have overdone that in some instances. We would welcome any feedback from the Committee on whether the level of information should be expanded. The report was welcomed by the CBI and others, who considered it to be a big improvement and regarded it as transparent, open and quickly accessible. I appreciate the desire for some of the detail, and we can look at the possibility of its being made accessible through the departmental delivery agreements on departmental websites.

Mr Dallat:

Glossy reports such as this are about real people; people who are badly disadvantaged, children who are living in poverty, and so on. The Assembly depends on the data collected to influence the political change that is needed. Therefore, what we are discussing this afternoon is something that is really serious. Do you agree?

Mr McMillen:

I do.

Mr Dallat:

Public service agreement 13 is the only target selected, and it did not meet any of its criteria. Is that not an awful indictment? PSA 13 is about child poverty; the failure to achieve the target means that children who are suffering from deprivation and poverty will not have their needs addressed because the data is toxic or is being googled from the Internet and is not relevant to what is happening on the ground.

Mr McMillen:

I certainly accept the point that real data is important in making policy decisions and driving forward. I do not necessarily make the connection that the row of Xs against PSA 13 leads to the conclusion that we are not making an improvement on that. In mitigation, PSA 13 is a good example of a target that was badly drafted at that time. It includes targets such as improving prospects and opportunities, which are very difficult things to measure and to get the data sets in place for. Many of the Xs are there because we could not evidence some of those issues.

The report refers to inaccuracies in the hard data on child poverty. A lot of that arose because of the system for measuring child poverty. We wanted to include data going back to 1998-99, but it was not collected until 2002-03. Therefore, we had to use other factors to work the data baseline back. As we were not able to make that data statistically accurate and to put intervals on it, from a statistician’s point of view, it was deemed to be a plausible estimate but not accepted as being statistically accurate. However, a Treasury assessment done in a different way came within 2% of our estimate. So we were working with figures that were plausible and could be taken forward to demonstrate improvement, but they were not statistically accurate.

Mr Dallat:

Surely you should have been at the forefront of this initiative. What methods were you using that did not hit at all on the reality of child poverty?

Mr McMillen:

We ended up with statistical evidence that set the baseline for child poverty, which was within 2% of an assessment carried out by the Treasury but was not statistically rigorous in terms of professional statisticians’ standards. However, our statisticians were happy that that was a plausible and good target for us to work off, and that was the baseline that we worked from.

Mr Dallat:

Do you agree that child poverty is one of our single greatest injustices? The Good Friday/Belfast Agreement was signed 11 years ago, and a promise was made in section 75 of the Northern Ireland Act 1998 that there would be equality. You have had all the available resources at hand to produce statistics that would allow political decisions to be taken to address the difficulties faced by the most vulnerable people in society. Do you agree that you have failed in that regard?

Mr McMillen:

I believe that we have produced baseline data that has allowed us to measure improvements in what we are doing to address child poverty. I accept that it has not met the required criteria, and we have put systems in place to try to improve on that. Child poverty is still a key target for the Department in the current PSAs.

Mr Dallat:

Following on from that, I note from the 2008-09 delivery report that the delay in achievement against this target:

“reflects outstanding decisions on a measurement for severe child poverty”.

Is that not a classic example of putting the cart before the horse?

Mr McMillen:

When Ministers were developing the Programme for Government there was a desire to do something about severe child poverty, but there was no accepted definition of what severe child poverty was, nor was there a way to measure it. Nevertheless, Ministers wished to do something about it, and it is perfectly right that politicians should be able to say what is important and to express aspirations. Statisticians in the Department have been examining how we can measure severe child poverty; they have presented a number of options to Ministers and are awaiting a decision.

Mr Dallat:

Do you agree that a definition should have been established before you started measuring it? Otherwise, you do not have a clue what you are measuring.

Mr McMillen:

That is an important factor, but it should not necessarily limit the wishes of politicians and Ministers in setting targets for issues. Targets can, sometimes, be aspirational, after which we can try to find systems for measuring them. Certainly, where definitions can be arrived at beforehand, that makes it much easier to measure performance.

Mr Dallat:

Do you agree that you were shooting at a target that you could not see, because you did not know what the target was?

Mr McMillen:

At that stage, yes.

Mr Dallat:

That is an awful indictment, given the levels of inequality in Northern Ireland that affect children, and, particularly, children who live in child poverty as we speak. Let us hope, Chairperson, that something positive comes out of this sooner rather than later. Otherwise, this Committee is wasting its time.

Mr McMillen:

There are two sets of targets. There are targets for child poverty, which have statistics and measures agreed with them. We are trying to reach agreement on a definition of, and measurements for, severe child poverty, for which Ministers set targets.

Mr Dallat:

Do you agree that those were the targets that you should have been focused on — the ones for severe child poverty?

Mr McMillen:

When we can put a definition on it. Everyone should be focused on the targets for severe child poverty.

Mr Beggs:

The report sets out specific limitations in the old PSA targets for the Department of Agriculture and Rural Development (DARD), including a failure:

“to provide baseline figures or describe how a net increase in jobs would be calculated”.

Incorrect baseline figures were provided in another instance, and, in a third case, it was found that:

“the data system used to determine baseline information was not subsequently used”.

I will turn to the new system, and, in particular, PSA 4, ‘Supporting Rural Businesses’. One of your new targets is to:

“reduce by 25% the administration burden on farmers and agri-food businesses by 2013.”

Can you outline the baseline position for that target and define each of the key terms in it?

Mr Lavery:

Under PSA 4 we have committed to reduce the administrative burden created by DARD on the agri-food sector by 25%. To take that forward, we had an independent review of the current burden on the agri-food sector. That review reported on 30 June and has completed its public consultation period. On the basis of the responses to that consultation and our own review of it, we will produce a Government response.

That review used an international standard methodology to measure the impact of red tape on business, and the Department expects that that methodology will be followed through to measure its performance.

Mr Beggs:

Can you define what that target is? A target must be SMART (specific, measurable, attainable, realistic and timely) if it is going to work; it must be clearly measurable. What is the Department actually measuring?

Mr Lavery:

The Department will measure the burden on a farmer of, for instance, having to read, absorb and comply with the guidance to complete the Integrated Administration and Control System form or the single farm payment scheme form; being required to accompany a veterinary officer in the course of a brucellosis or bovine TB test; and completing his claim form. All of those operations take time, and that time will be calculated in accordance with the standardised methodology and converted into a financial impact. Currently, the sum total of the financial impact has been calculated at £50 million per year, and the Department is striving to reduce that figure.
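[Illustrative note: the “international standard methodology” described above appears consistent with the Standard Cost Model widely used to measure administrative burdens. The formula below is a generic sketch of that model, offered for illustration only, not as the Department’s published calculation.]

\[
\text{Total burden} = \sum_{i \in \text{obligations}} \underbrace{(\text{tariff}_i \times \text{time}_i)}_{\text{cost per completion}} \times \underbrace{(\text{businesses}_i \times \text{frequency}_i)}_{\text{completions per year}}
\]

[On the figures quoted, a 25% reduction in a £50 million annual burden would bring the total to roughly £37.5 million by 2013.]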

Mr Beggs:

Are you confident that an objective rather than subjective measure — which could cause great variation in the figures produced — will be used?

Mr Lavery:

Absolutely. The Department will have an independent assessment of its performance; we will not simply be assessing it internally and marking our own homework. The Department will submit its findings to an external reviewer.

Mr Beggs:

OK. Moving on, the fundamental principle of PSAs is to measure performance against the key business objectives of a Department. Paragraph 2.19 of the report refers to a complicated target on the annual supply of timber using agreed sale lots. It is not clear whether that target gets to the heart of the efficiency and effectiveness of the Forest Service, which is one of the Department’s key business areas.

Do you accept that measuring wood output does not measure the efficiency or the effectiveness of the Forest Service, as the Department could sell more wood by dropping its prices, or move too much wood and become unsustainable? Do you agree that that is not a SMART target?

Mr Lavery:

First of all, it is important to have a range of targets when undertaking performance measurement. The more targets a Department has, the better it is for the public and for managers who are trying to drive performance. In this —

Mr Beggs:

What does that target tell the Department?

Mr Lavery:

In that case, the output target for the Forest Service was purposely set at 400,000 cubic metres of timber per year. The performance was driven to achieve that level of output, and the Forest Service was successful in achieving that target throughout the PSA period.

That, in itself, is very important and significant, but the key message behind that target was to reassure the timber processing industry that, during that three-year period, the Department did not intend to have an increase or a diminution in supply. That was an important reassurance to an industry that is a substantial employer in rural areas.

Mr Beggs:

What does that target tell the Department about the efficiency or the effectiveness of its business? Are there any commercial-style targets in that area? The setting of an output target does not provide that commercial information on its own.

Mr Lavery:

In addition to that target there is a suite of targets contained in the Forest Service business plan each year, on which the Department reports. It is that accumulated position that allows the Department to measure the efficiency and effectiveness of the Forest Service. In the area of timber supply, the Department benchmarks its prices and performance against the rest of the UK and the Republic of Ireland, and it examines the out-turn prices that it achieves. Therefore, there is a suite of commercially driven targets in addition to the output target.

Mr Beggs:

Would it not have been better to include those targets in the Department’s PSA?

Mr Lavery:

At the time, we were under a regime that invited Departments to specify 10 targets that encapsulated their business. The Department of Agriculture and Rural Development is complex and wide-ranging, and we made the best selection that we could at the time, in the knowledge that the Forest Service was publishing its annual performance publicly.

Mr Beggs:

I will move to another target. Appendix 1 of the report states that PSA target 1 seeks to:

“Reduce the gap in agricultural Gross Value Added (GVA) per full time worker equivalent (measured as Annual Work Units) between NI and the UK as a whole by 0.6 of a percentage point per annum between 2003 and 2008, i.e. from 34% in 2003 to 31% in 2008.”

Will you explain the meaning of “annual work unit”? Most people have an idea about GVA. However, that sentence is a very complicated construction, never mind how difficult it is to understand its meaning. What does it mean to the average man on the street who is meant to be reading these reports and assessing government actions?

Mr Lavery:

That target was popular with the economists, if perhaps less popular with the man on the Clapham omnibus. However, the idea was to recognise the gap in productivity between Northern Ireland industry and the average productivity in the United Kingdom, and then to strive to reduce it. To do that, the Department has a suite of programmes that range from education and training to capital grants. We were trying to encapsulate the overall impact of those programmes, which, after all, must be intended to improve the competitiveness of our agriculture industry versus that in the rest of the United Kingdom and worldwide.

We had to try to get an overall measure. In that context, we had to produce a unit, which became the annual work unit. It is simply a way of finding a statistical unit that can be used as a unit cost or, in this case, a unit of performance. The industry has a wide range of working practices; there are full-time farmers, part-time farmers, farmer spouses and labourers. We reduced all those roles to a particular annual work unit, which is a statistic that can be used as an agriculture performance measurement that is acknowledged across the United Kingdom and that enables us to benchmark against the UK average.
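[Illustrative note: the following worked example, using hypothetical figures, shows how the annual work unit (AWU) normalises mixed working patterns so that GVA can be compared per full-time-equivalent worker.]

\[
\text{GVA per AWU} = \frac{\text{agricultural GVA}}{\sum_j \text{full-time-equivalent labour}_j}
\]

[For example, a holding generating £60,000 of GVA, worked by one full-time farmer (1.0 AWU) and a spouse working half time (0.5 AWU), would record £60,000 / 1.5 = £40,000 of GVA per AWU.]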

Mr Beggs:

Why was the targeted improvement so modest at 0.6% a year? Moreover, why did a 31% gap remain at the end of the period? If the target is so modest and the outcome so minimal, is all the effort to record the figures worthwhile?

Mr Lavery:

A 3% improvement in productivity compared with the United Kingdom average would have equated to around a 10% improvement in our relative position. That would have been significant. However, the difficulty with this target, as the report makes clear, is that gross value added is a good way to measure the health of a sector or the health of a region. It is not a good way to measure the impact of our performance. We have discontinued its use to measure performance, because gross value added encapsulates many factors, most of which are outwith our control, including the exchange rate and movement in prices, which, unfortunately for our industry, is happening in both directions at the moment.
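[Illustrative note: the arithmetic behind the stated target and Mr Lavery’s “around a 10%” figure is:]

\[
5 \text{ years} \times 0.6 \text{ percentage points} = 3 \text{ points}, \qquad \frac{34\% - 31\%}{34\%} \approx 8.8\% \approx 10\%
\]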

Mr Beggs:

Is this another target that was not a SMART target? It must be borne in mind that SMART targets are a fundamental thing taught to anyone in business. Do you accept that this is the third area where you have not had a SMART target?

Mr Lavery:

At that time, the focus was on outcome rather than output. SMART targets are a very good way of measuring the output of an area. I could specify that during the course of 2008-09 I would train 1,500 people and that they would arrive at a certain level of qualification. That is a SMART target; it measures the output. Our focus at that time was very much on outcome, and what that would mean to the economy. That target was designed to try to convey an overall economic out-turn.

Mr Donaldson:

I declare an interest as a former Minister in OFMDFM, though not during the period covered by the report. Paragraph 2.3 notes that senior managers in Departments are ultimately responsible for the quality of PSA data systems and so on, yet paragraph 2.4 shows that the departmental management boards appear not to have taken an interest in PSA quality control and data accuracy — and that is not exclusive to your Department by any means. What systems and controls does DARD have in place to enforce that quality assurance function?

Mr Lavery:

First, we have sought to comply with the guidance that OFMDFM has helpfully issued. Therefore, I chair a PSA delivery board, and that board meets quarterly. Each quarter we receive a progress report on our PSA targets, and we review those targets and look at any data quality issues that may present. The six-monthly reviews go to the Minister and the Committee for Agriculture and Rural Development, and we take care that they are also seen by our entire board, including the independent members. Each PSA now has a senior responsible officer (SRO) who is a member of the Senior Civil Service, and those officers are supported by data quality officers. Data quality officers are in absolutely no doubt that they are responsible for precisely that: the quality of the data that goes in to support the target, and so forth. That is the system.

Mr Donaldson:

Yes; that is the system, but the evidence presented in the report indicates that, although management boards took an interest in ensuring that targets were in place and monitoring progress towards the achievement of those targets, when it comes to the PSA data systems — the data accuracy and quality control — there was no evidence that Departments had put in place formal guidance or policies specifically in relation to the standards expected. Although you have lines of management, which is good, have you now put in place those systems? Have you now got policies and formal guidance specifically designed to establish whether the PSA data systems, quality control and data accuracy are working for you?

Mr Lavery:

I can offer you some reassurance. First, we have learned lessons from the Audit Office report. Secondly, we have ensured that there is a delivery board in the new PSA framework, which I chair. Because the delivery board is focusing on PSA performance, it has the time and the focus to drill down into the PSAs. Perhaps the departmental management board would not have the time to go that far in its meetings. Having a separate board to do that work will, obviously, improve the focus.

The third reassurance is that the Committee for Agriculture and Rural Development, I am delighted to say, takes a keen interest in the performance of the Department, and specifically our reporting of the performance on PSA targets. With that degree of invigilation, interrogation and the stimulating enjoyment of debate, I am satisfied that no stone will be left unturned.

Mr Donaldson:

I am afraid that we are not, Mr Lavery. The problem is that I am not hearing that you have put in place formal guidance and policies to deal with this. It is all very well being able to drill down, but if you do not know what you are looking for — the standards that are required for quality assurance, data accuracy and quality control — how are you able to assess whether the people who are doing the drilling are doing their job and getting to the core issues?

Mr Beggs referred to your targets, and you gave an explanation for them. Perhaps it might assist if you were able to more clearly establish whether the whole system is delivering in relation to data control. I am anxious to hear whether you have put in place formal guidance or policies.

Mr Lavery:

In that case, there are further reassurances. First, the good practice checklist, which is contained in the annexe to the Audit Office report, was sent to all of our SROs and our data quality officers. To that extent, people know the standard; they know what they are trying to do.

Secondly, we have commissioned our internal auditors to carry out an in-depth review, specifically to establish whether DARD PSA data systems and reporting procedures are compliant with the good practice checklist. That has been commissioned, and it will report at the end of October. Thirdly, the guidance that was provided by OFMDFM in 2007 has been closely followed by DARD.

Mr Donaldson:

It is good that we have guidance. Are you able to provide the Committee with a copy of the policies that you have to monitor the progress that is being made, particularly in relation to the standards that are expected from PSA data systems?

Mr Lavery:

I am happy to do so.

Mr Donaldson:

Although the report identified several weaknesses in the data systems that were used by your Department to support PSA targets in 2006-07, paragraph 1.12 outlines assurances from OFMDFM that things have improved significantly, and we heard the evidence from Mr McMillen about that. Can you explain to the Committee exactly how you have ensured that the data systems that support more recent PSA targets for your Department are robust?

Mr Lavery:

All of the PSA targets were developed in the context of guidance that was set out by OFMDFM and circulated in the Department. They have all been the subject of reporting on progress to date. They have also benefited from the draft and final reports of the Audit Office in this investigation being circulated to the business areas named in the report, and from the good practice checklist that was circulated to all of the SROs and the data quality officers. Following the completion of the process, the memorandum of reply and the Public Accounts Committee’s conclusions will be circulated in the Department and specific lessons will be learned.

Mr Donaldson:

Paragraphs 2.14-2.17 outline the need to involve professionals in the PSA process. Therefore, in reference to your new PSA, can you explain the extent to which your professional economists or statisticians were involved in designing your targets? What did your Department do as a result of that advice to ensure that each data system was fit for purpose? Going back to the point that was made by Mr Beggs, if you are falling well below a PSA target, do you involve your economists and statisticians to assess why that is the case and whether the target is robust?

Mr Lavery:

The issue of involving specialists in the formulation and design of PSA targets and, indeed, in their operation is an area in which, in 2004, we did not perform as well as we should have. The reason for that is that, with the exception of the gross value added target, we relied heavily on administrative systems and management systems that collect information to measure out-turn. We did not see that the involvement of an economist or a statistician would have added significantly to those systems. In hindsight, that was wrong. Had we involved a statistician, we could have avoided many of the deficiencies and weaknesses that the report points to. We have taken that on board. This time, an economist has been involved from the start of the PSA dialogue through the design of the targets. In addition, the data quality officers all have access to an economist and a statistician to support them.

Mr Donaldson:

Thank you for your honesty. Hopefully, in due course, the results of that will be seen in the PSA targets and the Department getting closer to meeting those targets.

Mr Lunn:

Mr Sterling, you have had the benefit of listening to Mr Lavery answer a question on behalf of his Department, and I ask you the same question. The report identified weaknesses in the data systems used by the Department of Enterprise, Trade and Investment (DETI) to support the PSA targets that were in place during 2006-07.

In paragraph 1.12, OFMDFM assures us that things have improved significantly. On behalf of the Department of Enterprise, Trade and Investment, can you explain what has been done to ensure that the data systems supporting your Department’s more recent PSA targets are robust?

Mr David Sterling (Department of Enterprise, Trade and Investment):

I will go back to 2007. The issue of the guidance was quite fundamental to the changes that we have put in place. In the autumn of 2007, that guidance was used to inform the development of PSAs within DETI, at the time that the PSAs and targets for the current Programme for Government were being developed. The intention was to comply with that guidance at that time.

In the Department, at that time, we left the responsibility for developing PSAs and targets with our economists. They took the lead in developing the draft PSAs, and were involved in negotiating those in the Department with our Minister and, ultimately, with OFMDFM and the Executive. Since then, further measures have been introduced. We have created a new division, the strategic planning, economics and statistics (SPES) division. We have brigaded our economists and statisticians, and placed with that division the responsibility for the development of PSAs and targets, the maintenance and validation of the data systems, and the monitoring of performance. We have given that responsibility to our specialists.

The oversight of performance occurs at a variety of levels. Working from the bottom up, I can tell you that, on a quarterly basis, our SPES division —

Mr Lunn:

What division?

Mr Sterling:

The strategic planning, economics and statistics division; it is a bit of a mouthful.

That division engages with our various business areas and with our satellite organisations, such as the Tourist Board, Invest Northern Ireland, the Consumer Council, and the Health and Safety Executive. The division challenges the various business units within the Department, and the non-departmental public bodies, on their performance. There is then a formal oversight and liaison meeting with the various business units, and the non-departmental public bodies, at which senior management in the Department — including me, as senior responsible officer — engage with our counterparts in the various areas. The results and reports from those oversight and liaison meetings are reported to the departmental board.

Delivery boards are also in place for PSA 1 and PSA 5. Ultimately, we report our performance to the Committee for Enterprise, Trade and Investment at the year-end. The Minister discussed the Department’s performance for 2008-09 at a Committee for Enterprise, Trade and Investment meeting in June. That is a broad overview of the framework that we have in place.

Mr Lunn:

That sounds very comprehensive. Is that in line with what the Agriculture Department has done?

Mr Lavery:

Yes. We are not quite twins, but we are closely related.

Mr Lunn:

At paragraphs 2.23 to 2.26 it seems that limitations with your PSAs were not clearly disclosed. Why did you not consider it necessary to fully disclose limitations in the quality of the data, the methodology or the underlying systems for those important areas?

Mr Sterling:

I distinguish between the previous Programme for Government and the arrangements that applied then, and the arrangements, processes and procedures that we have put in place for the current Programme for Government. I acknowledge all the deficiencies that have been identified.

We have sought in the technical notes for PSA 1 and PSA 5 to ensure that all limitations in our current data systems and procedures are identified. To the best of my knowledge, that is the case. However, to provide further satisfaction in that regard, our strategic planning, economics and statistics division will conduct a further review of the adequacy of the data systems in the run-up to the production of the next Programme for Government. We were satisfied, in a sense, with what was produced in 2007; however, we will validate that again before the process for developing the next Programme for Government starts next year.

Mr Lunn:

Do you think that the quality of the data may still be deficient in some areas?

Mr Sterling:

I do not think that we are perfect; however, we have addressed all the deficiencies that were identified in the report. Some issues will always be difficult to resolve entirely. One of the issues identified was the use of GVA as a measure of performance, because of the long time lag. There can be more than two years between performance and getting accurate GVA information from the Office for National Statistics (ONS).

Nonetheless, we still believe that GVA is a good measure of living standards and prosperity, because it is an internationally recognised standard. It is the right thing to be measuring. The fact that we do not have up-to-date information is a problem, but we are trying to address that. We have commissioned forecasts from an independent organisation. We will get the first set of forecasts in the next few weeks. Those forecasts will never be as accurate as the material from ONS. Nonetheless, they will be produced using a broadly similar methodology. We cannot address that limitation entirely; however, we believe that we are doing as much as we can do in the circumstances.

Mr Lunn:

Do you accept that the financial information that was presented to the Assembly in 2007 should have carried a health warning or at least a caveat that it might not be reliable?

Mr Sterling:

Yes.

Mr Lunn:

What would you say about 2009?

Mr Sterling:

Without going over what we did in 2007 again, we have sought to identify risks in the Department through our risk management processes. We have concluded at corporate level that our data systems do not pose a major risk. We have, therefore, concluded that our data systems are fit for purpose. We recognise that we will be challenged on that at some stage. We are carrying out an internal challenge, and risks have been identified in certain areas at lower levels.

For example, our statisticians have identified a risk with their role in dealing with ONS data. Through the Northern Ireland Statistics and Research Agency, they have engaged with ONS to ensure that we are satisfied that the quality of the data is adequate.

Mr Lunn:

The Department’s agency companies, particularly Invest NI, have received some unfortunate publicity in the past few weeks. I do not agree with all of it, but are the statistics upon which the criticism was based accurate?

Mr Sterling:

Without going through the press commentary, I can say that we have put as much attention into the development of the data systems, targets and performance management arrangements for Invest NI as we have for the rest of the Department. The report identified deficiencies with some of Invest NI’s measures, for example, the use of GVA and the measures in relation to R&D expenditure. We believe that we have addressed those. On reflection, the comments about Invest NI’s performance in the press do not really include a criticism of its data systems. I hope that that answers your question.

Mr Lunn:

That is a fair answer; thank you.

Ms Purvis:

I shall follow on from the point that Trevor made. Mr Sterling, you spoke about your previous and continuing reliance on data sets from ONS. Paragraph 2.19b of the Audit Office report expresses reservations about using those data sets, particularly for the monitoring of progress against year-on-year PSA targets. You said that you were satisfied with what you produced in 2007, but, given the reservations that the report raises, how can any reliance be placed on the Department’s claims of achievement against the targets?

Mr Sterling:

We acknowledge that the issue of the time lag on GVA-based measurements is a problem. We are trying to address that through the production of those forecasts, which will never be as accurate as the ONS statistics.

However, we use GVA as a measure of living standards and prosperity, and the overriding goal in the Programme for Government that we are working towards is the halving of the productivity gap between here and the rest of the United Kingdom by 2015, excluding the greater south-east of England. Northern Ireland’s productivity is currently 81% of that of the rest of the United Kingdom, excluding the greater south-east of England.

That target will not change significantly from year to year, and the independent review of economic policy acknowledged that significant improvements in our productivity will occur only over time. For that reason, using GVA is acceptable, because it looks at trends over a long period. I acknowledge that it would be good to have more accurate information more regularly, but it is difficult to do that.
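[Illustrative note: on the figures quoted, and assuming the gap is measured in simple percentage points, halving it would imply the following position:]

\[
100\% - 81\% = 19 \text{ points}; \qquad 81\% + \frac{19}{2} \approx 90.5\% \text{ of the UK level (excluding the greater south-east) by 2015}
\]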

Ms Purvis:

Paragraphs 3.14 and 3.15 of the report mention the two-year time lag. Appendix 1 shows Departments’ measurements against PSAs. At the time, it was difficult to report on those. Can you confirm whether all those PSAs have been reported on, and what the outcomes were?

Mr Sterling:

In each of those areas, we have either sought to address the limitations or, in some cases, agreed new targets. In part, that is to address the deficiencies that the report identifies.

Ms Purvis:

Can you provide the Committee with that information?

Mr Sterling:

I am happy to produce something that shows the way in which the targets have been improved or revised.

Ms Purvis:

I appreciate that.

Mr Sterling:

Our technical notes set out the current targets, but I understand that it would be helpful to be able to see how those have progressed. I am happy to provide that information to the Committee.

Mr McGlone:

The discussion has become very technical, and data is whizzing about all over the place. Will you explain all these guidance issues, the memoranda and so on? Who enforces the guidance to ensure that the work is done? I was formerly the Chairperson of the Committee for the Environment. At the end of each year, the programme of work that had been carried out was compared with what had been presented at the beginning of the year. If the oversight and global management is not being done, how do you seriously expect to comply with the other PSA targets? If about 60% of the work programme is delivered through the Department and the Committee, how can everything else be expected to fall into place? Who has the oversight role to ensure that the work is done? How does the PSA activity fit in with that?

Mr Sterling:

The guidance to which we have been working is DFP guidance that was produced in spring 2007. I have tried to explain how we have sought to ensure compliance with that guidance, and we have done that largely internally.

Mr McGlone:

What if it is not being done? Where does the buck stop? Who ensures that it is done?

Mr Sterling:

The buck stops with me.

Mr McGlone:

I am not talking about your Department; I am talking about Mr McMillen’s Department, for example.

Mr McMillen:

The key difference between the Programme for Government that this report looks at and where we are today is that the central challenge function has now been given to a team drawn from OFMDFM, DFP Supply and PEDU.

Mr McGlone:

Who is on that group, and when was it set up?

Mr McMillen:

The group was officially set up after the Executive agreed the new Programme for Government monitoring system in March 2009. However, it has been doing administrative work behind the scenes for a couple of years. The group looks at the quarterly returns from Departments and challenges them to decide whether there is evidence to support the —

Mr McGlone:

Challenging the returns and bringing home the bacon are two different things.

Mr McMillen:

The group challenges the returns and, if the evidence does not support a different assessment, it presents the report to the Executive. It is that report that informs the Executive of where Departments are having difficulty in delivering the Programme for Government targets. That information can be taken into accountability meetings between Ministers, who can check on where Departments are having problems or are not delivering the Programme for Government targets. Therefore, a system is in place that takes the matter right through to ministerial and Executive level.

Mr McGlone:

I hope that we will see some product, then.

PSA 6 relates to the establishment of new businesses. Paragraph 3.18 of the report indicates that the PSA target did not square with Invest NI’s corporate plan. I have looked at the Department of Enterprise, Trade and Investment’s justification of each of the targets. There are figures such as 8,500 new businesses to be created and 3,000 new businesses in new targeting social need (TSN) areas, which is 11,500 in total. However, for the 2005-08 period, that changed to 10,000 new businesses, of which 40% were to be in new TSN areas. Does that mean that the target for new businesses was reduced by 1,500?

Mr Sterling:

Yes; the target was changed.

Mr McGlone:

Can you understand how people such as me could be confused by that?

Mr Sterling:

Yes.

Mr McGlone:

In the Valence Technology report that the Committee produced last week, we recommended the development of a quantified target for Invest NI to ensure that the issues facing people from areas of economic and social disadvantage were addressed. One of the targets in PSA 6 relates to the establishment of 10,000 sustainable new businesses, of which 40% were to be in new TSN areas. I followed that target through to the 2008-11 Programme for Government and see that it has been revised to 6,500 new jobs from inward investment. Seventy per cent of the projects generating those jobs are to be within 10 miles of an area of economic disadvantage.

How does that square with your previous target? How different is it? Is there a disparity? If so, have you lowered or raised the target? Will you explain why the wording of the old target was changed?

Mr Sterling:

I will explain that now, but I will also try to explain it in the note that I told Ms Purvis I would send, which will set out the reasons for changes so that we can compare the current Programme for Government with previous ones.

We accept the criticism in the Audit Office report that the use of the word “sustainable” was not sufficiently clear. However, for the sake of clarity and transparency, I should point out that we no longer have a target for the creation of new businesses. We are taking a different approach, in which we seek to focus on the creation of wealth and prosperity, not just on starting businesses. Therefore, our targets are based on supporting people to export and to invest more in research and development. In other words, we are encouraging and supporting companies to take actions that generate more wealth. That is what underpins our current arrangements.

Mr McGlone:

The targets are all for job-creation projects. The creation of wealth comes as a result of job-creation projects.

Mr Sterling:

PSA 3 contains targets for job creation, and we work jointly with the Department for Employment and Learning on that. Therefore, we are still interested in job creation, but we are just as interested in taking steps that will generate wealth and prosperity. That is why there are targets for the type of earnings that we want to see in new jobs that are being promoted. We want to see companies being encouraged and supported to export and generate wealth in that way, and we want to encourage companies to market, sell abroad and invest in research and development. We have had careful discussions with the Enterprise, Trade and Investment Committee on those issues when we have been reporting on performance and targets.

Mr McGlone:

Thank you. Did you just explain the change in the wording?

Mr Sterling:

Yes, I did. However, as I said, it is only fair that I set it out in more detail in my letter.

Mr McGlone:

You mentioned that you have been working to DFP guidance — it would be useful if we had a copy of that.

Why have you adopted a less challenging target for job creation?

Mr Sterling:

I am not sure why there was a change in the target between the 2004-06 period and the later period. However, we do not now have a target for the creation of small businesses at all, and I have attempted to explain why that is. One of our main targets is to create 6,500 new jobs from inward investment, and there is a sub-target that 5,500 of those jobs should provide salaries above the Northern Ireland private sector median, which is about £17,000 per annum.

Mr McGlone:

Of the 8,500 new businesses and 3,000 new businesses in new targeting social need areas that were projected for the 2004-07 period, and the equivalent figures for the 2005-08 period, how many businesses were actually established? That information would be useful.

Mr Sterling:

I will do that; I will give you a report that sets out what was achieved against the earlier targets. The delivery report, which I think you have looked at, shows the performance against the first year of the current Programme for Government targets.

Mr McGlone:

Finally, will you describe the 10-mile issue?

Mr Sterling:

It is an as-the-crow-flies measure.

Mr McLaughlin:

Mr Pengelly, for the benefit of the Committee, will you outline your involvement in the process as head of the performance and efficiency delivery unit (PEDU) in DFP?

Mr Richard Pengelly (Department of Finance and Personnel):

The role of PEDU relates to the preparation of the Programme for Government delivery report. Mr McMillen has mentioned that a central team was established to assess departmental performance against PSA targets; that team comprises officials from the economic policy unit of OFMDFM, supply officials from DFP and colleagues from PEDU. The team receives the submissions and returns from Departments on their progress against PSA targets, carries out the initial assessments, plays those assessments back to the Departments and has a dialogue with them. Ultimately, it facilitates the preparation of advice and guidance, through Mr McMillen, to the First Minister and deputy First Minister and the Executive on overall performance.

Mr McLaughlin:

Therefore, your involvement in that process is as the head of PEDU?

Mr Pengelly:

Yes.

Mr McLaughlin:

OK. You are probably aware that earlier this year DFP submitted a performance report to the Committee for Finance and Personnel, which outlined a number of measurements that had been achieved or substantially achieved — whatever that means in the context of a mid-year report. Through the questioning of departmental officials, it became apparent that those self-assessments could not be sustained, with the result that the report was sent back for review and resubmission.

The Committee for Finance and Personnel also discovered that the Programme for Government delivery report — which is produced by OFMDFM, and which you have just referred to — contained assessments that differed from those in the in-year report, and that is the theme that I want to explore. We have had discussions about the data sets on which Departments rely, and I presume that it can be quite challenging to achieve a robust and uniform assessment across a number of Departments. Furthermore, I suspect that the vulnerability of those self-serving self-assessments is part of the problem.

We find that Departments are using one measurement tool or assessment scale, while the OFMDFM report relies on a completely different scale. In a relatively small region such as the North, why can uniformity not be achieved?

The whole purpose of having the PSAs is to provide reliable and robust performance assessments on which the general public can depend. Their purpose is not to blind people with statistics or assessments, particularly when they do not have the same opportunity as a scrutiny Committee to drill down into those assessments. What is your involvement, with respect to PEDU’s role, in the assessments? Have you outlined or established any tangible improvements in the system?

Mr Pengelly:

With respect to the specific problem that you have articulated, the short answer is no. One of the key differences between the central team and the Departments is that the central team carries out assessments, on behalf of the Executive, of cross-cutting performance by the Administration. That is one of the key changes —

Mr McLaughlin:

It is a high-level perspective.

Mr Pengelly:

Yes. It takes a very high-level view.

One of the fundamental changes between the PSAs that were the subject of the Audit Office report and the current crop of PSAs is that the focus has moved from a departmental to a cross-cutting basis. However, some components of the PSAs remain departmentally focused. Therefore, the central team takes a strategic view of the PSAs, whereas the departmental assessments and the DFP business plan that you spoke about come from a departmental perspective, covering the elements through which Departments contribute to PSAs.

Because the business plan is a departmental issue, it is used as an internal management tool by Departments, which establish their own criteria for assessing performance. DFP had a seven-point red/amber/green rating, whereas the central team uses a four-point rating. DFP has since decided to fall into line and adopt a four-point rating.

Mr McLaughlin:

You will move to the OFMDFM system?

Mr Pengelly:

Yes. That will be useful and will eliminate some of those differences in the future.

Mr McLaughlin:

It is important that DFP has decided to address that problem of the differential in assessments by adopting a common assessment scale. Will that be applied across all Departments, or will the problem continue to exist elsewhere?

Mr Pengelly:

Ultimately, that is an issue for individual Departments. The business plan is a departmental construct, and it is for Departments and their respective Ministers to decide on what basis they want to assess and report the performance of their own business.

The work that we do for the central team takes the Executive perspective, so we can be much more prescriptive about how Departments report for our purposes. However, Departments need that flexibility to report on their own business. Obviously, they will engage with their respective Committees on that matter.

Mr McLaughlin:

Is that a formal position — that it is a matter for Departments? We are not talking about the European Commission here. Can all Departments set their own monitoring systems?

Mr McMillen:

Departments have the independence to do that. However, you have raised an issue on which we need to reflect. It has caused confusion to members and, obviously, to the public that there are two monitoring systems. The first report, which was issued in June, was the first time that that had been exposed, and the Finance Committee has dealt with it. We need to reflect on that. On the other hand, it shows that there was a healthy challenge at the centre: people at the centre did not believe what Departments were saying.

Mr McLaughlin:

That is good.

Mr McMillen:

They said that they were willing to put it to the Executive, and to Ministers, that they did not believe what Departments were reporting. As we engage with Departments during subsequent reports, we will be able to narrow the gap and to come up with a more consistent way to take that forward.

Mr McLaughlin:

That is a helpful answer. Without giving you any more work to do, Richard, because I know that you cover quite a wide brief, is there not, in fact, a role for PEDU in considering reporting mechanisms? If the information is reliable — if all the Committees, MLAs and, in particular, the public can have the type of in-year, mid-term and end-of-year performance reports that they can actually rely on — that will, ultimately, help everyone.

Mr Pengelly:

It will. However — and I do not say this to be unhelpful — it goes back to the core rationale for the establishment of PEDU. The Finance Minister established it. I believe that PEDU has a role in that process. Fundamentally, however, PEDU’s role is to work with Departments to ensure delivery of individual priority targets. It is a small unit, and deliberately so — we do not want to create another overhead in the system.

Ministers might be reluctant to move it away from that sharp-edged focus on helping Departments to achieve successful delivery of targets towards a more back-office assessment of the reporting framework. That task sits best where the current policy responsibility lies, which is with our colleagues in OFMDFM. Certainly, because PEDU is in that delivery space, it can add value and can work with them. However, I am not sure —

Mr McLaughlin:

Obviously, there is a fundamental weakness in the performance assessment process. It alarms me that your Department produced a report that was not robust. The Committee dismantled it in about 10 minutes. Officials had to accept that the report did not provide accurate and robust conclusions on their own performance. They had to take it back to your Department, review it, rewrite it, and bring it back to the Committee. Even subsequent to that — at another stage of the process — we found that the OFMDFM report came to different conclusions. That is hardly joined-up government, to say the least.

Mr Pengelly:

I accept that. However, the debate that played out with the Finance Committee — unfortunately, I was not there for the first episode, although I had to be there for the second —

Mr McLaughlin:

We were looking for you. [Laughter.]

Mr Pengelly:

So I heard.

That was in parallel with the debate that played out when the central team was assessing DFP. I do not wish to justify differences of opinion, but some of the issues that the Audit Office has very helpfully raised about data systems and the specificity of targets show that we are not there yet. We have made great strides since the publication of the report and the helpful good practice guide, but we are not there yet. For as long as that is the case, we are inherently in the realm of subjective judgement, and every individual whom you ask to offer a subjective view will offer a different one. I hope that it is a diminishing issue. However, I acknowledge that it is an issue, and one that we need to focus on and eliminate.

Mr McLaughlin:

Appendix 4 fairly and accurately reflects improvements that have been made to the performance assessment, but section 3 of the report deals with difficulties in the compendium reports from the centre. Clearly, there is more to do.

Mr Pengelly:

There is more to do. We talked about aligning the seven-point scale with the four-point scale, and we can do more on timing. The first delivery report was a very difficult piece of work, particularly for Mr McMillen and his colleagues, who had to turn it round and have it approved by the Executive before the summer recess. It was very important to get it into the public domain before the recess.

There was slightly more of a time lag in the assessment of the departmental business plans. Many of the performance indicators are lagging indicators, and there is always the risk that better, more complete information becomes available after the central delivery report, which may change the perspective. Those are the sorts of issues that we need to focus on and align better.

Mr McLaughlin:

We will get a response to the report. You have a very significant and strategic role in helping to bring that together, and I would be very interested if the Committee could get your views on how the system can be improved.

Mr Pengelly:

Yes, we will set that up.

The Chairperson:

You will be glad to hear that you can take it easy now.

The PSA process was set up to improve accountability in the public sector, and it seems that some Departments have more to do to get their performance information right. The session has given us an interesting insight into the quality of the performance information that Departments have reported. We welcome the improvements introduced in 2007, but I urge OFMDFM and all the other Departments to comply fully with best practice in this area. The witnesses have said that they will supply us with further information, and that may lead us to pose more questions after we reflect on today’s meeting. Thank you for coming in today.