
Session 2009/2010

Fifth Report

Public Accounts Committee

Report on
Public Service Agreements – Measuring Performance

TOGETHER WITH THE MINUTES OF PROCEEDINGS OF THE COMMITTEE
RELATING TO THE REPORT AND THE MINUTES OF EVIDENCE

Ordered by The Public Accounts Committee to be printed 5 November 2009.

Report: NIA 22/09/10R Public Accounts Committee

This document is available in a range of alternative formats.
For more information please contact the
Northern Ireland Assembly, Printed Paper Office,
Parliament Buildings, Stormont, Belfast, BT4 3XX
Tel: 028 9052 1078

Membership and Powers

The Public Accounts Committee is a Standing Committee established in accordance with Standing Orders under Section 60(3) of the Northern Ireland Act 1998. It is the statutory function of the Public Accounts Committee to consider the accounts and reports of the Comptroller and Auditor General laid before the Assembly.

The Public Accounts Committee is appointed under Assembly Standing Order No. 56 of the Standing Orders for the Northern Ireland Assembly. It has the power to send for persons, papers and records and to report from time to time. Neither the Chairperson nor Deputy Chairperson of the Committee shall be a member of the same political party as the Minister of Finance and Personnel or of any junior minister appointed to the Department of Finance and Personnel.

The Committee has 11 members including a Chairperson and Deputy Chairperson and a quorum of 5.

The membership of the Committee since 9 May 2007 has been as follows:

Mr Paul Maskey*** (Chairperson)
Mr Roy Beggs (Deputy Chairperson)

Mr Patsy McGlone**
Ms Dawn Purvis
Mr Jonathan Craig
Mr David Hilditch*******
Mr John Dallat
Mr Jim Shannon*****
Mr Trevor Lunn
Mr Mitchel McLaughlin
Rt Hon Jeffrey Donaldson MP MLA********

Mr Mickey Brady replaced Mr Willie Clarke on 1 October 2007*
Mr Ian McCrea replaced Mr Mickey Brady on 21 January 2008*
Mr Jim Wells replaced Mr Ian McCrea on 26 May 2008*
Mr Thomas Burns replaced Mr Patsy McGlone on 4 March 2008**
Mr Paul Maskey replaced Mr John O’Dowd on 20 May 2008***
Mr George Robinson replaced Mr Simon Hamilton on 15 September 2008****
Mr Jim Shannon replaced Mr David Hilditch on 15 September 2008*****
Mr Patsy McGlone replaced Mr Thomas Burns on 29 June 2009******
Mr David Hilditch replaced Mr George Robinson on 18 September 2009*******
Rt Hon Jeffrey Donaldson replaced Mr Jim Wells on 18 September 2009********

Table of Contents

List of abbreviations used in the Report

Report

Executive Summary

Summary of Recommendations

Introduction

The Need for Robust Governance Arrangements

Vigorous Performance Reporting

Refining Procedures

Appendix 1:

Minutes of Proceedings

Appendix 2:

Minutes of Evidence

Appendix 3:

Correspondence

Appendix 4:

List of Witnesses

List of Abbreviations used in the Report

PSA Public Service Agreements

OFMDFM Office of the First Minister and deputy First Minister

DETI Department of Enterprise, Trade and Investment

PEDU Performance and Efficiency Delivery Unit

DARD Department of Agriculture and Rural Development

C&AG Comptroller and Auditor General

DFP Department of Finance and Personnel

GVA Gross Value Added

Executive Summary

1. Since 1998, Northern Ireland departments have been required to publish Public Service Agreements (PSAs) covering each three-year government spending cycle. These specify the targets to be used to measure performance against key departmental and cross-cutting objectives.

2. Adopting high standards in performance reporting can improve the accountability and transparency of public service delivery, help departments to allocate resources effectively and contribute to robust, evidence-based policy decisions. The Committee has real concerns, however, about the reliability and accuracy of the underlying PSA data systems. This makes it difficult to conclude that reported performance has actually been achieved.

3. OFMDFM has a central co-ordination and oversight role in relation to PSAs but its exercise of these functions was not sufficiently rigorous in the past. Despite subsequent improvements in oversight arrangements, it is clear that the new system has not sufficiently addressed the specific data system limitations identified by the C&AG’s report. Ten years after the launch of PSAs, much still remains to be done before any examination of targets and data systems will produce a clean bill of health.

4. OFMDFM must do more to hold individual departments to account for implementing improvements to weak data systems. At the moment, it is unlikely it has the capacity to do so. Given this gap, the Committee believes that there is a clear need for the C&AG to provide independent oversight of data systems in order to drive forward quality improvements in this area.

5. If the quality of measurement systems is poor, the reported data will also be poor. Senior management must therefore take ownership of this issue. In the past, departmental senior managers have not been sufficiently involved in overseeing data quality and have failed to use technical specialists, such as statisticians and economists, to best effect. Their involvement can help improve the quality of PSA data systems and should be encouraged.

6. There is evidence that reported performance has not always been substantiated by the underlying data. Although there is now a greater degree of challenge exerted by a central team from the Office of the First Minister and deputy First Minister (OFMDFM) and the Department of Finance and Personnel (DFP), the new Delivery Report still represents a subjective assessment of performance. The quality of reporting falls short of best practice and the DFP Departmental Committee has already identified that self-assessed claims of performance achievement were in fact not sustainable.

7. It is inherently difficult to set targets in the public sector as there is rarely a single measure which adequately captures overall departmental performance. However, the Committee was disappointed at the lack of clarity in some targets, their failure to get to the heart of business objectives and evidence of changes to targets during the three-year public spending cycle. There is also evidence of a lack of consistency across departments in terms of whether to select output or outcome measures. A more sophisticated framework of targets could usefully be developed for the next Programme for Government.

Summary of recommendations

1. The lack of focus on data quality and on the robustness of measurement systems remains a concern. In the Committee’s view, this is an area in which OFMDFM must continue to seek improvement. However, to provide independent assurance to the Assembly, the Committee recommends that OFMDFM and the C&AG jointly develop a work programme of PSA data system validation to assess the reliability and accuracy of the underlying systems and the reported data.

2. All departments have access to a range of specialist staff. The Committee recommends that all departments use in-house resources such as statisticians and economists to improve their oversight of the quality of PSA data systems. This should ensure separation of duties between those responsible for service delivery and those responsible for monitoring and reporting performance.

3. Accounting Officers must take ownership of PSA data systems. The Committee recommends that departments’ Statements of Internal Control include a specific assurance that risks to data quality have been assessed and that appropriate controls have been put in place to mitigate these identified risks.

4. It is of concern that the quality of reporting still falls short of best practice. The Committee recommends that, for each target, the Delivery Report should include the baseline position, interim outturn figures and the latest available data to ensure an objective and quantitative assessment of performance.

5. It is alarming that the performance assessments undertaken by individual departments contradict those of the central Delivery Report and that they use different assessment scales. The Committee recommends that all assessments are based on objective information and that departments align their self-assessment scales to the four-point system used in the central Delivery Report.

6. The summary nature of the Delivery Report limits transparency and scrutiny in the PSA process. The Committee recommends that progress on all targets contained in Delivery Agreements is reported every six months, made available on departmental websites and submitted to the relevant departmental Committees.

7. In the past, too many targets have been overly complicated and difficult to understand and monitor. Departments must ensure that each target is precisely stated and easily understood. Given past limitations in this respect, the Committee expects that all future PSA targets will be specific, measurable, attainable, realistic and timebound, and that, collectively, they cover the main aspects of departments’ key services.

8. Setting targets to measure public sector performance is not an exact science. However, there is a significant divergence in the approaches used by different departments. The Committee recommends that, for the next Programme for Government, OFMDFM takes the lead in developing a more sophisticated framework which introduces greater consistency in the use of output and outcome targets across all departments.

9. Changing targets during the life of the PSA risks undermining user confidence and accountability. The Committee recommends that such changes to targets are avoided but, if absolutely necessary, they must be explicitly disclosed and accompanied by a clear justification of the need for change.

Introduction

1. The Public Accounts Committee met on 8 October 2009 to consider the Comptroller and Auditor General’s report ‘Public Service Agreements – Measuring Performance’. The main witnesses were:

Mr John McMillen, Accounting Officer, Office of the First Minister and deputy First Minister (OFMDFM);
Mr Damian Prince, OFMDFM;
Mr Gerry Lavery, Department of Agriculture and Rural Development (DARD);
Mr David Sterling, Department of Enterprise, Trade and Investment (DETI); and
Mr Richard Pengelly, Performance and Efficiency Delivery Unit (PEDU), Department of Finance and Personnel.

2. Since 1998, Northern Ireland departments have been required to publish Public Service Agreements (PSAs) covering each three-year government spending cycle. These specify the targets to be used to measure performance against key departmental and cross-cutting objectives. Adopting high standards in performance reporting improves the accountability and transparency of public service delivery and can help departments to allocate resources effectively and make robust, evidence-based policy decisions.

3. In practice, reporting of performance relies on data generated from the underlying data systems. The C&AG’s report considered the adequacy of a selection of the systems used by departments to measure performance. This report noted that senior management were not sufficiently involved in PSA data systems, some of the measurement systems were not fit for purpose, and performance was not being reported in a sufficiently clear, timely and comprehensive manner.

4. In taking evidence on the C&AG’s report, the Committee focused on:

the need for robust governance arrangements;
the rigour and transparency of performance reporting; and
the refinement of procedures for selecting the next round of PSA targets.

The Need for Robust Governance Arrangements

The central oversight role of OFMDFM

5. OFMDFM has a central co-ordination role in relation to PSAs. This role involves ensuring that departments adopt appropriate PSA targets, co-ordinating annual progress reports and promoting best practice. It is clear that, at the time of the C&AG’s report, OFMDFM did not fulfil its central oversight role to an adequate standard. It did not provide guidance on the design and operation of PSA data systems or monitor compliance with best practice. It also failed to exert a strong challenge function on the selection of targets, the robustness of data systems or the reporting of achievements.

6. OFMDFM accepted that its oversight role was not sufficiently rigorous in the past. However, it told the Committee that significant improvements have been introduced into the new PSA process. Key developments have included the issue of new guidance in June 2007, the development of new governance structures, the selection of more cross-cutting and strategic targets, the requirement to produce detailed Delivery Agreements for each target and the establishment of a central team to challenge reported performance assessments.

7. The Committee welcomes the improvements in oversight made since the C&AG’s examination, as these should contribute to an improved PSA process. However, it is clear to the Committee that the new system has not sufficiently addressed the specific data system limitations identified by the C&AG. These have led to real concerns about the reliability and accuracy of the data and make it difficult to conclude that reported performance was actually achieved.

8. OFMDFM’s Accounting Officer offered assurances to the Committee that, were the C&AG to revisit this topic, the PSA targets and systems of departments would “fare much better”. He acknowledged that past results had been disappointing and said that the situation would improve incrementally. Ten years after the launch of PSAs, it appears therefore that much still remains to be done before any examination of targets and data systems will produce a clean bill of health.

9. The Committee expects OFMDFM, as part of its central oversight role, to hold individual departments to account for implementing improvements to weak data systems. At the moment, it is not clear that it has the capacity to do so. While the more recent involvement of the Performance and Efficiency Delivery Unit (PEDU) is welcome, it is focused on helping departments deliver targets rather than assessing the performance management framework. Given this gap, the Committee believes that there is a clear need for independent oversight of data systems to drive forward quality improvements in this area.

Recommendation 1

10. The lack of focus on data quality and on the robustness of measurement systems remains a concern. In the Committee’s view, this is an area in which OFMDFM must continue to seek improvement. However, to provide independent assurance to the Assembly, the Committee recommends that OFMDFM and the C&AG jointly develop a work programme of PSA data system validation to assess the reliability and accuracy of the underlying systems and the reported data.

The involvement of senior managers and technical specialists in the PSA process

11. A strong corporate control environment is key to the establishment and operation of robust PSA data systems. Senior managers within individual departments are ultimately responsible for the quality of PSA data systems. They have a role in ensuring that risks to data quality are formally assessed and that appropriate quality controls are established.

12. The C&AG’s report noted that senior managers were not sufficiently involved in overseeing data quality. This is disappointing. Both DARD and DETI accepted the criticism but outlined improvements which have been introduced under the 2008-2011 Programme for Government. These include the appointment of Senior Responsible Officers for each PSA, supported by Data Quality Officers. In addition, DARD stated that it has commissioned an Internal Audit review to ensure its PSA data systems and reporting procedures are compliant with good practice.

13. Technical specialists, such as statisticians and economists, can also play an important quality control role. Their involvement can help departments to select meaningful targets, identify appropriate data sources and accurately report performance. All government departments have access to such expertise. The C&AG’s report noted, however, that the use of such specialists was often limited and, even where they were involved, their concerns surrounding data systems were not fully addressed.

14. During his evidence, the DETI Accounting Officer highlighted the specific actions his Department has taken in this regard. He has created an internal Strategic Planning, Economics and Statistics Division which has central responsibility for the development of PSAs and targets, the maintenance of the data systems, the validation of the data systems and the monitoring of performance. The Committee welcomes these developments and believes they should be considered by other departments.

Recommendation 2

15. All departments have access to a range of specialist staff. The Committee recommends that all departments use in-house resources such as statisticians and economists to improve their oversight of the quality of PSA data systems. This should ensure separation of duties between those responsible for service delivery and those responsible for monitoring and reporting performance.

16. Quality assessment of underlying data systems needs to be fully embedded in the PSA process. It is not clear from the evidence presented to the Committee whether systems are being risk assessed and controls put in place. If the quality of measurement systems is poor, the reported data will also be poor; this can lead to erroneous judgements. For this reason, there must always be full disclosure of known risks and data weaknesses.

Recommendation 3

17. Accounting Officers must take ownership of PSA data systems. The Committee recommends that departments’ Statements of Internal Control include a specific assurance that risks to data quality have been assessed and that appropriate controls have been put in place to mitigate these identified risks.

Vigorous Performance Reporting

Performance reports must be based on comprehensive, accurate and reliable information

The need for performance reporting to be based on actual data

18. The introduction of PSAs formalised the process for reporting performance to the Assembly and the public on departmental achievement against key objectives. Departments should produce timely, transparent and comprehensive performance reports and make these publicly accessible. It is disappointing to note that the C&AG’s report identified a number of cases where reported performance was not substantiated by the underlying data. These included instances where performance was not compared against baselines, actual outturn and historical data were not provided and there was inadequate interpretation of results.

19. In response, departments cited the introduction of a new centralised reporting framework which they consider has improved the quality of PSA reporting. This takes the form of a Delivery Report which adopts a more detailed four-point assessment scale.

20. The Committee recognises that the new report is more comprehensive and has benefited from PEDU and OFMDFM challenge. There is now a more detailed degree of analysis of results and this is welcomed. Nevertheless, the Delivery Report still represents a subjective assessment of performance. It does not present, for all targets, the underlying data supporting the assessment. For this reason, the Committee is of the view that the report, while an improvement, is still not fit for purpose. More than a decade after the introduction of PSAs, this is an unacceptable position.

Recommendation 4

21. It is of concern that the quality of reporting still falls short of best practice. The Committee recommends that, for each target, the Delivery Report should include the baseline position, interim outturn figures and the latest available data to ensure an objective and quantitative assessment of performance.

The evidence of contradictory assessments of performance

22. A key purpose of PSAs is to give reliable and robust performance assessments in which the general public and elected representatives can have confidence. Earlier this year, DFP submitted a performance report to its departmental Committee outlining the extent to which measures had been achieved. During questioning of departmental officials, it became apparent that the self-assessed claims were in fact not sustainable.

23. Furthermore, the departmental Committee identified that DFP was using a seven-point assessment tool which was completely different from the central team’s four-point scale. This clearly risks causing confusion to the Assembly and the general public. Departments currently have the independence and flexibility to adopt their own monitoring systems. However, given that the two reports came to such differing conclusions, it is clear that this represents a fundamental weakness in the performance assessment process.

Recommendation 5

24. It is alarming that the performance assessments undertaken by individual departments contradict those of the central Delivery Report and that they use different assessment scales. The Committee recommends that all assessments are based on objective information and that departments align their self-assessment scales to the four-point system used in the central Delivery Report.

25. As there are concerns about reported performance figures, the Committee welcomes DFP’s subsequent written assurance that senior civil servants’ bonuses were “not based on the data systems underpinning the delivery of PSAs”. Until such data systems are robust, and produce reliable measurements of performance, achievement of PSA targets should not be used as a basis for awarding bonuses.

The high level nature of the Delivery Report

26. A further limitation of the new Delivery Report is its high level nature. OFMDFM indicated that the report is designed to inform the Executive and the Assembly of the general direction of travel, highlighting what is going right and what is going wrong. Although more detailed information is available in the individual Delivery Agreements, these are not currently published. This is not a tenable position. In the absence of the necessary detail, Assembly members can take no assurance on the validity and accuracy of high level self-assessments.

Recommendation 6

27. The summary nature of the Delivery Report limits transparency and scrutiny in the PSA process. The Committee recommends that progress on all targets contained in Delivery Agreements is reported every six months, made available on departmental websites and submitted to the relevant departmental Committees.

Gaps in annual performance reporting

28. The C&AG’s report highlighted that, during the previous Programme for Government, OFMDFM failed to publish a composite performance report for either 2006-07 or 2007-08. This is unacceptable as the absence of a composite report for this period prevented full and transparent assessment of performance by the Assembly and the public. Following the hearing, DETI provided written evidence of performance against its previous targets to the end of 2008. While this is welcomed, it highlights the lack of accountability which arose as a result of the non-publication of a composite report during this period. The Committee expects that, in future, achievement for all years and for all targets will be published centrally by OFMDFM.

Refining Procedures

It is important that departments develop more sophisticated procedures before selecting the next round of PSA targets

The need to adopt SMART targets

29. It is inherently difficult to set targets in the public sector. Unlike the private sector there is rarely a single measure, such as profitability, which adequately captures overall performance. Nevertheless, it is a well-established principle that, as far as possible, targets should be specific, measurable, attainable, realistic and timebound (SMART). Failure to adopt SMART targets can lead to difficulties in assessing overall performance and improving public service delivery.

30. The Committee was particularly disappointed that the major strategic targets relating to child poverty have been deficient in this respect. OFMDFM acknowledged that, at the time of the C&AG’s report, the targets were poorly drafted and the data required to measure improvement were not initially collected. When data became available, they were not statistically accurate, although OFMDFM claimed that they were sufficiently plausible to measure progress. Despite this assurance, there continue to be data system problems around this key policy area. There is no accepted definition of severe child poverty and currently there is no measurement system in place. In effect, OFMDFM is shooting at a target that it cannot see.

31. Problems were also evident with a number of other targets. For example, discussions with DARD highlighted that it was not always clear what was to be measured, that some targets failed to get to the heart of business objectives and that others tended to be overly complicated and, as a consequence, difficult to understand and interpret.

Recommendation 7

32. In the past, too many targets have been overly complicated and difficult to understand and monitor. Departments must ensure that each target is precisely stated and easily understood. Given past limitations in this respect, the Committee expects that all future PSA targets will be specific, measurable, attainable, realistic and timebound, and that, collectively, they cover the main aspects of departments’ key services.

The need for consistency of approach in selecting targets

33. The difficulty in selecting targets is further evidenced by the differing approaches adopted by DARD and DETI in relation to the use of Gross Value Added (GVA)[1]. DARD has decided to dispense with this target, arguing that, as an outcome measure, it is not within DARD’s ability to influence. Conversely, DETI has continued to use this measure as a target because it considers it a key indicator of long term economic development. While both these arguments may have their own merits, the difference in approach implies the need to refine the target selection process. There must be greater clarity on whether PSAs should only contain targets where departments can influence achievement.

Recommendation 8

34. Setting targets to measure public sector performance is not an exact science. However, there is a significant divergence in the approaches used by different departments. The Committee recommends that, for the next Programme for Government, OFMDFM takes the lead in developing a more sophisticated framework which introduces greater consistency in the use of output and outcome targets across all departments.

The need for targets not to be subject to unnecessary change

35. PSA targets are selected at the outset of each three-year spending cycle. They should be consistently stated and should not be subject to unnecessary change. Changing targets during the three-year life of a PSA undermines user confidence, as it can appear that departments are arbitrarily making changes to increase the likelihood of achievement. The Committee of Public Accounts at Westminster has been critical of frequent changes to targets as this weakens their ability to serve as useful and meaningful tools of accountability and retain credibility[2].

36. The C&AG’s report noted that one of DETI’s agencies, InvestNI, amended its target on the establishment and support of new businesses in disadvantaged areas. It was clear that these changes risked confusing readers and undermining the transparency of performance reporting. Following the session, DETI provided the Committee with a written submission containing details of the various changes to the target. In this, DETI stated that the revisions reflected changes in the reporting time period rather than in the target itself. This explanation does not offset the Committee’s concerns.

Recommendation 9

37. Changing targets during the life of the PSA risks undermining user confidence and accountability. The Committee recommends that such changes to targets are avoided but, if absolutely necessary, they must be explicitly disclosed and accompanied by a clear justification of the need for change.

[1] GVA measures the contribution to the economy of each individual producer, industry or sector in the United Kingdom.

[2] Committee of Public Accounts, Session 2006-07, Improving literacy and numeracy in schools (Northern Ireland) HC 108


Appendix 1

Minutes of Proceedings of the Committee Relating to the Report

Thursday, 1 October 2009
Room 144, Parliament Buildings

Present: Mr Roy Beggs (Deputy Chairperson)
Mr John Dallat
Rt Hon Jeffrey Donaldson MP MLA
Mr David Hilditch
Mr Trevor Lunn
Mr Patsy McGlone
Mr Mitchel McLaughlin
Mr Jim Shannon

In Attendance: Ms Aoibhinn Treanor (Assembly Clerk)
Mr Phil Pateman (Assistant Assembly Clerk)
Miss Danielle Best (Clerical Supervisor)
Mr Darren Weir (Clerical Officer)

Apologies: Mr Paul Maskey (Chairperson)
Ms Dawn Purvis
Mr Jonathan Craig

The meeting opened at 2.03 pm in public session.

4. Briefing on NIAO report ‘Public Service Agreements – Measuring Performance’.

Mr Kieran Donnelly, C&AG; Mr Eddie Bradley, Director; Claire Dornan, Audit Manager; and Mr Joe Campbell, Audit Manager, briefed the Committee on the report.

The witnesses answered a number of questions put by members.

[EXTRACT]

Thursday, 8 October 2009
The Senate Chamber, Parliament Buildings

Present: Mr Paul Maskey (Chairperson)
Mr Roy Beggs (Deputy Chairperson)
Mr Jonathan Craig
Mr John Dallat
Rt Hon Jeffrey Donaldson MP MLA
Mr Trevor Lunn
Mr Patsy McGlone
Mr Mitchel McLaughlin
Ms Dawn Purvis
Mr Jim Shannon

In Attendance: Ms Aoibhinn Treanor (Assembly Clerk)
Mr Phil Pateman (Assistant Assembly Clerk)
Miss Danielle Best (Clerical Supervisor)
Mr Darren Weir (Clerical Officer)

Apologies: Mr David Hilditch

The meeting opened at 2.01 pm in public session.

3. Evidence on NIAO report ‘Public Service Agreements – Measuring Performance’.

The Committee took oral evidence on the above report from:

Mr John McMillen, Accounting Officer, Office of the First Minister and deputy First Minister (OFMDFM); and supporting officials Mr Damian Prince, OFMDFM; Mr Gerry Lavery, DARD; Mr David Sterling, DETI; and Mr Richard Pengelly, PEDU.

The witnesses answered a number of questions put by the Committee.

3.12 pm Mr Lunn left the meeting.

3.38 pm Mr McLaughlin and Mr McGlone left the meeting.

Members requested that the witnesses should provide additional information to the Clerk on some issues raised as a result of the evidence session.

[EXTRACT]

Thursday, 15 October 2009
Room 144, Parliament Buildings

Present: Mr Paul Maskey (Chairperson)
Mr Roy Beggs (Deputy Chairperson)
Mr John Dallat
Rt Hon Jeffrey Donaldson MP MLA
Mr David Hilditch
Mr Trevor Lunn
Mr Patsy McGlone
Mr Mitchel McLaughlin
Mr Jim Shannon

In Attendance: Ms Aoibhinn Treanor (Assembly Clerk)
Mr Phil Pateman (Assistant Assembly Clerk)
Miss Danielle Best (Clerical Supervisor)
Mr Darren Weir (Clerical Officer)

Apologies: Mr Jonathan Craig
Ms Dawn Purvis

The meeting opened at 2.03 pm in public session.

5. Issues Paper on evidence session on NIAO report ‘Public Service Agreements – Measuring Performance’.

Members considered an issues paper on this evidence session.

[EXTRACT]

Thursday, 5 November 2009
Room 144, Parliament Buildings

Present: Mr Paul Maskey (Chairperson)
Mr Roy Beggs (Deputy Chairperson)
Rt Hon Jeffrey Donaldson MP MLA
Mr David Hilditch
Mr Patsy McGlone
Mr Mitchel McLaughlin
Ms Dawn Purvis
Mr Jim Shannon

In Attendance: Ms Aoibhinn Treanor (Assembly Clerk)
Mr Phil Pateman (Assistant Assembly Clerk)
Miss Danielle Best (Clerical Supervisor)
Mr Darren Weir (Clerical Officer)

Apologies: Mr John Dallat
Mr Trevor Lunn
Mr Patsy McGlone

The meeting opened at 2.04 pm in public session.

8. Consideration of Draft Committee Report on Public Service Agreements – Measuring Performance

Agreed: Members ordered the report to be printed.

Agreed: Members agreed that the report would be embargoed until 00.01 am on Thursday, 26 November 2009.

Agreed: Members agreed to launch the report with a press release to be agreed at a later meeting and to a Committee launch at a relevant venue.

[EXTRACT]

Appendix 2

Minutes of Evidence

8 October 2009

Members present for all or part of the proceedings:
Mr Paul Maskey (Chairperson)
Mr Roy Beggs (Deputy Chairperson)
Mr Jonathan Craig
Mr John Dallat
Mr Jeffrey Donaldson
Mr Trevor Lunn
Mr Patsy McGlone
Mr Mitchel McLaughlin
Ms Dawn Purvis
Mr Jim Shannon

Witnesses:

Mr Gerry Lavery, Department of Agriculture and Rural Development
Mr John McMillen, Office of the First Minister and deputy First Minister
Mr Damian Prince, Office of the First Minister and deputy First Minister
Mr David Sterling, Department of Enterprise, Trade and Investment
Mr Richard Pengelly, Department of Finance and Personnel

Also in Attendance:

Mr Kieran Donnelly, Comptroller and Auditor General
Ms Fiona Hamill, Deputy Treasury Officer of Accounts

1. The Chairperson (Mr P Maskey): Today, the Committee will address the matters raised in the Audit Office report ‘Public Service Agreements — Measuring Performance’. Mr John McMillen, accounting officer for the Office of the First Minister and deputy First Minister (OFMDFM), is here. You are very welcome.

2. Mr John McMillen (Office of the First Minister and deputy First Minister): I am joined by Damian Prince, head of the economic policy unit, who oversees the operation of the Programme for Government monitoring, and Richard Pengelly, public spending director and head of the performance and efficiency delivery unit (PEDU) in the Department of Finance and Personnel (DFP). His team helps OFMDFM in looking at departmental performance. I am also joined by Gerry Lavery, senior finance director of the Department of Agriculture and Rural Development, and by David Sterling, permanent secretary at the Department of Enterprise, Trade and Investment.

3. The Chairperson: I believe that this is your first time before the Public Accounts Committee. You have done well to escape us so far.

4. Mr McMillen: Please be gentle with us.

5. The Chairperson: We certainly will.

6. Paragraph 1.1 of the report outlines the important role of public service agreements (PSAs) in reporting performance, improving service delivery and promoting accountability. In retrospect, was your Department’s oversight role in that process sufficient?

7. Mr McMillen: Before I start to answer, I wish to put it on record that the Department welcomes the Northern Ireland Audit Office report. It is, and will remain, an excellent source of advice, best practice and guidance on this subject.

8. The report highlighted a number of weaknesses evident in the data and the procedures used to monitor the reporting of the 13 PSAs in 2006-07. As the Committee is aware, however, we have put in place revised procedures, protocols and delivery mechanisms to monitor and report the current Programme for Government. Although the design of those new arrangements preceded the publication of the Audit Office report, we were able to draw on the emerging findings of the audit team and the work carried out by the National Audit Office, and also on Treasury guidance. We have since issued guidance to Departments requiring them to produce delivery agreements. Those agreements aim to strengthen accountability and confidence in delivery and, where PSA outcomes cut across departmental boundaries, clearly explain the contributions of each Department.

9. Importantly, the delivery agreements also require the production of a measurement annexe for each target, and it is that which deals with any data issues. I am not saying that we now have the procedure perfect. Indeed, the delivery report that was delivered to the Executive and Assembly in June 2009 still draws attention to some failings in the data supporting delivery of PSA targets. However, the fact that we are now aware of those shortcomings and reporting them — and, indeed, actively challenging them — demonstrates that we have taken on board the issues that are highlighted in the Comptroller and Auditor General’s report.

10. Returning to your question, Chairman, the findings of the Audit Office report were certainly disappointing. However, it was looking at the previous Programme for Government, and at particular complex data-set issues, which, in the main, relate to a minority of the PSA targets overall. Indeed, of the PSA delivery agreements for OFMDFM, complex data sets relate to about 25%.

11. Therefore, although the report is disappointing, I do not think the conclusion could be drawn that it reflected poor performance across all Departments. Indeed, the Audit Office said that that is not evidenced on those issues.

12. The Chairperson: Do you regard the Department’s oversight role as having been sufficient?

13. Mr McMillen: The previous technical notes were not as robust as what is now required in the delivery agreements and the guidance that followed. In the early part of the decade, we were still learning how to operate the system. It was evolving, as it did in Whitehall and other jurisdictions. We were learning, and we applied what appeared to be best practice at the time. Perhaps we did not apply it as rigorously as we should have. We have learned from that, and we now apply it rigorously.

14. The Chairperson: Have lessons been learned?

15. Mr McMillen: Yes.

16. The Chairperson: Paragraph 1.12 of the report states that a “radically different approach” to PSAs has been adopted and that this provides an assurance that the new data systems are “fit for purpose”. Can you outline how that has been achieved?

17. Mr McMillen: We used a system that was based on a different approach to the PSA targets. Previously, targets tended to be departmental. We have now tried to embed the PSA targets in the strategic priorities for the Executive. They are much more cross-cutting.

18. The guidance requires public service delivery agreements for each of the PSAs, and for there to be a responsible Minister and senior responsible owner within the Departments. The agreements require the establishment of PSA delivery boards for each PSA, which are made up of senior officials from across the various Departments that contribute to that delivery. There is also guidance for the Department on how performance is to be measured, including a measurement annexe that includes issues around data sets and issues raised in the Audit Office report. There is a much more robust system of guidance for Departments to follow.

19. The key point is the challenge function that is applied at the centre by a team comprising officials from the economic policy unit in OFMDFM, PEDU and the Supply division in DFP. That team monitors performance on a quarterly basis and challenges Departments on how they are performing. That is ratcheting up the delivery mechanisms and making Departments more accountable.

20. The Chairperson: What is the time frame? Paragraph 1.12 refers to targets that were set in 2006-07. There was devolution in May 2007.

21. Mr McMillen: The main guidance which improved the situation was issued by DFP early in 2007 ahead of the Programme for Government development. The research was being done in 2006-07. That guidance has followed through into the formation of the Programme for Government. The monitoring system that was set up was also put forward at that stage, although it was not approved by the Executive until March 2009. However, Ministers, through DFP and the monitoring rounds, have been monitoring performance against the new guidance and the PSAs on a quarterly basis as part of the monitoring of the Budget.

22. The Chairperson: Paragraph 1.11 of the report refers to a “good practice checklist” developed by the Audit Office. If the Audit Office revisits that area and tests a sample of the new PSA data systems using the checklist, will the resulting report tell us a story of significant improvement?

23. Mr McMillen: All Departments welcome the checklist. It is very useful to be able to assess how we are doing against a checklist. If the Audit Office looked at how the new system matches up against that checklist, we would fare much better. However, I think that the real value in the checklist will be in the next Programme for Government, because it gives very good advice on defining and setting targets and on how those targets are going to be measured. That will form part of the evolution of how we are improving performance management across the Government.

24. The Chairperson: Are you saying that if the Audit Office revisits that, no improvements will be seen until the next Programme for Government?

25. Mr McMillen: No, I think that the current Programme for Government monitoring processes are showing an improvement in how we are reporting. Evidence of that is contained in the view of the Confederation of British Industry (CBI), which, in response to the delivery report, observed that there was a big improvement on what has gone before. The CBI saw it as being transparent and open, and it welcomed that. The Assembly’s Research and Library Services has also seen the critique; it too welcomed it and said that it showed a big improvement. Therefore, the current performance system is better than what was in place previously. There are still problems with it, but we have reported those to the Executive, and Departments are dealing with them in each monitoring round.

26. The Chairperson: The scale of some of the weaknesses in the data system led to some real concerns about the reliability and accuracy of the claims that were made about performance. The Committee would like your assurances that such data systems were not relied on when senior managers’ performance bonuses were being determined.

27. Mr McMillen: Senior managers’ bonuses are a matter for the permanent secretaries. I am sure that many factors play into those; however, I cannot assure the Committee one way or the other as to how performance on the delivery of Programme for Government targets feeds into such determinations. Data sets and their management would be a small part of the process, but I have no information on how that feeds into the determination of the bonuses.

28. The Chairperson: Would it be possible for you to go back to the Department and ask whether the Committee could get a look at some of that information?

29. Mr McMillen: I shall certainly see what I can do for you; however, there may be some difficulty in how it relates.

30. The Audit Office report noted that, given that the data sets themselves are not being managed, it is difficult to conclude that performance is being achieved. However, the report did not necessarily say that it is not being achieved. Therefore, that is a difficult issue. However, I shall take your point on board and see what I can find out for you.

31. The Chairperson: The Committee may end up drafting some written questions to the Department on that matter.

32. Mr Shannon: You and I meet regularly in Committees, so we are not strangers. My membership of the Committee for the Office of the First Minister and deputy First Minister means that I am aware of your role in OFMDFM.

33. It is clear that OFMDFM has a central co-ordination role, and I know that you and Assembly Members understand that function. However, it seems that you have not provided guidance on designing or operating PSA systems. In the absence of such guidance, how do you intend to ensure the quality of the data that is produced? That is what the issue is about; you must be able to report to the Assembly on those data so that it can see the targets and the performance.

34. Mr McMillen: DFP issued the guidance in March 2007, I believe. It was put together by a joint team from OFMDFM and DFP. It was issued by DFP ahead of the agreement on monitoring frameworks. The joint guidance is out, and it tells Departments how they should set up delivery agreements, how they should put their measurement annexes together, and how they should demonstrate that.

35. Mr Shannon: It seems that the data is not there. I understand that the purpose of the system is to make the Assembly, the media and other Departments aware of the data that is produced. If the data is not there and has not been there, is it not difficult to ascertain performance?

36. Mr McMillen: We need to make a distinction between the position in 2006-07, with which the Audit Office report deals, and what is extant today. The guidance was issued subsequent to the dates that are covered in the report, and the Programme for Government reporting system now monitors against that new guidance. The central team is made up of staff from the economic policy unit, PEDU and DFP supply. That team issues challenges against that guidance and takes data issues into account. I am on PSA delivery boards and, from my experience, I know that they work on data sets and data issues. Therefore, a mechanism is now in place to challenge the data issues.

37. Indeed, some targets were given red or amber status in the delivery report because of data issues, where the central team said that standards for data were not being reached, so it pushed the targets back to Departments for them to address for future monitoring rounds.

38. Mr Shannon: Are you telling us that the new system will overcome past difficulties?

39. Mr McMillen: I believe that those difficulties are being overcome and that that will continue; the situation will improve incrementally.

40. Mr Shannon: Will the same monitoring system be in place for reporting to the Assembly and all the Committees so that they will be aware of what is happening through the new system?

41. Mr McMillen: The First Minister and deputy First Minister will take the report to the Executive and the Assembly. I believe that a take-note debate on the first report, which covers 2008-09, is scheduled for some time in this session.

42. The Committees have also been receiving departmental responses. As you are aware, I was up in front of the OFMDFM Committee to talk about the performance of my Department.

43. Mr Shannon: Paragraph 2.9 of the report sets out the key components of technical notes. It states that they should: “set baselines; provide definitions of key terms; set out clearly how success will be assessed; describe the data sources that will be used; and outline any known and unavoidable significant weaknesses or limitations in the data system.”

Those are the criteria. However, paragraph 2.10 refers to several deficiencies that were found when the Audit Office examined the technical notes. It seems that there were deficiencies in the technical notes provided by Departments in which at least one of those guidelines was not used. It seems very unusual, if not wrong, to have guidelines and then ignore them.

44. Mr McMillen: I accept the finding in the Audit Office’s report that the criteria were not being applied in 2006-07. The previous system lacked a challenge function at its centre to push the Departments back and examine what was coming through. We now have the central team who carry out that function. We have published measurement annexes for delivery agreements, which set out the criteria. The guidance requires very similar criteria to what was set out in paragraph 2.9 in the guidance for Departments to include in their measurement annexes. That is now being assessed by the central team. If something is not satisfactory, it will be reported back to Departments. In some instances, red or amber status is given to a target because it cannot be evidenced.

45. Mr Shannon: Paragraph 2.10 also states that:

“a number of the Technical Notes were factually inaccurate.”

Will you assure the Committee that that will not happen again, and give us an explanation as to why it happened?

46. Mr McMillen: I do not have an explanation for individual Departments, but I assure the Committee that we now look at the accuracy of technical notes as part of our central monitoring. We also question baselines; if there are errors, we immediately go back to the Department to ask why.

47. Mr Shannon: The point that we are making is that it cannot be that difficult to have technical notes in place. We look forward to a vast improvement.

48. Paragraph 3.22 of the report refers to a delay — and again, no delivery report. The Public Accounts Committee has a role as the policeman of the Assembly and its Committees. We are concerned about the delay in publishing the 2006-07 report; performance for a large number of targets went unreported as a result. From the Programme for Government and Budget website, I see that the last old-style performance report was for 2005-06. Therefore, there is a gap between the old-style and the new-style reports for the year 2007-08. How can the Assembly assess the performances of Departments in 2007-08 if we do not have a report?

49. Mr McMillen: I accept that there were delays in getting the 2006-07 report published, which, as reported, were due to resource constraints. The Department is committed to producing a report on the Programme for Government as per the monitoring framework. As Members will be aware, we produced the first report for 2008-09 in June. We are gathering and recording information for the next report for general information within Departments, and we will be producing other reports every six months. We are committed to doing that.

50. Mr Damian Prince (Office of the First Minister and deputy First Minister): I can reassure members that, since the Executive approved the framework for monitoring the Programme for Government on 5 March, we have produced three reports on the performance of each Department. We produced an indicative report at the end of December, which was a dry run to ensure that we had the correct systems in place to properly monitor what was happening. We also produced a delivery report as at the end of March, which members will have seen, and which has been placed in the Assembly. We have also commissioned a report as at the end of June 2009, and that is a good way along.

51. Members can be assured that there is now a process in place. Even the delivery report at the end of March is not a full stop; it is a punctuation mark along the whole journey of reporting back on the Programme for Government. There will be regular update reports and regular challenges. The benefit of the new system is that it has changed the culture and the approach. The delivery of the Programme for Government is being driven, monitored and interrogated by the centre, rather than just by individual Departments.

52. Mr Shannon: I understand the explanation that you have given, but are you saying that there was no formal reporting of PSA performance for either 2006 or 2007? Has that been lost, or acted on, or are you saying that it is just a mistake from the past, but that you have now moved on and will get it right in the future? Is that what you are saying?

53. Mr Prince: My understanding is that, for the end of each of those years, performance was reported in the accounts of individual Departments and through their departmental boards. There was no composite report, as had been produced earlier. At that stage, in 2006-07, we were on the cusp of a new direction with a new Administration and the design of new systems. Most energy and effort went into getting the new system right and getting something nailed down to monitor the current Programme for Government. That is where the main energy and attention now centres.

54. Mr Shannon: Are you simply putting your hands up and saying that you got it wrong for those two years, but that you will get it right now? Is that it?

55. Mr McMillen: I would point out that that was prior to devolution. We have now produced a report for the first year of the new Programme for Government for 2008-2011, and intend to continue to produce reports for Committees every six months or eight months.

56. Mr Shannon: In relation to paragraph 3.21 — I think Dusty Bin was on ‘3-2-1’ a long time ago. That is probably showing my age. Paragraph 3.21 of the report, which deals with the introduction of PSAs, notes that, in order to comply with good practice, performance reports should:

“Include latest outturn figures, compare performance against baselines and provide historical trend data”.

I do not see any of those three things in OFMDFM’s new-style delivery report, issued in June 2009, which the Committee has seen. Why does the new approach not follow basic best practice?

57. The Chairperson: I ask members and witnesses to speak up a bit, as the sound system is of very poor quality.

58. Mr McMillen: The delivery report to the Executive, and thus the Assembly, is not intended to be a detailed analysis of each individual target, but is rather designed to inform the Executive and the Assembly of the direction of travel, showing where we are doing well and highlighting areas that need further examination. It does not go into detail.

59. Mr Prince: The OFMDFM Committee also requested the detail that you are looking for. One member said that people do not feel the greens that are being highlighted in the reports. The delivery report is essentially a document to give the Executive a high-level overview of what is going right and what is going wrong, but it is built on a lower level of detail, which is contained in the delivery agreements. Those cover things like what the baseline is for each individual target, what progress has been made, what percentage of completion has been achieved, what milestones have been reached and what has been missed. The central delivery team reviews those reports and challenges Departments.

60. The red/amber/green (RAG) status is determined on the basis of those reports. When a report makes a claim that something has green status, we interrogate it further to look for the evidence that it should have that status. At that point we get down to the lower level of detail. If Departments are not at the point that they said they would be at, we will mark the rating down. That information is available not in the delivery report, but at the next level down, in the delivery agreements and the information that Departments submit to substantiate their RAG status.

61. Mr Shannon: The key issue for Committee members is how we can access figures if the three best practice outlines that we have in front of us have not been followed. If the figures are not made accessible through the best practice system, the Assembly is almost being misled about how that is done. I am sure that that is not your intention; perhaps you need to reassure us on that point.

62. Mr McMillen: I can reassure you that there is no desire to mislead anyone. It was a balance between overloading information and having a composite report that is acceptable, easily read and quickly digested. It may be that we have overdone that in some instances. We would welcome any feedback from the Committee on whether the level of information should be expanded. The report was welcomed by the CBI and others who considered it to be a big improvement and regarded it as being transparent, open and quickly accessible. I can appreciate some of the detail, and we can look at the possibility of that being accessed through the departmental delivery agreements on departmental websites.

63. Mr Dallat: Glossy reports such as this are about real people; people who are badly disadvantaged, children who are living in poverty, and so on. The Assembly depends on the data collected to influence the political change that is needed. Therefore, what we are discussing this afternoon is something that is really serious. Do you agree?

64. Mr McMillen: I do.

65. Mr Dallat: Public service agreement 13 is the only target selected, and it did not meet any of its criteria. Is that not an awful indictment? PSA 13 is about child poverty; the failure to achieve the target means that children who are suffering from deprivation and poverty will not have their needs addressed because the data is toxic or is being googled from the Internet and is not relevant to what is happening on the ground.

66. Mr McMillen: I certainly accept the point that real data is important in making policy decisions and driving forward. I do not necessarily make the connection that the row of Xs against PSA 13 leads to the conclusion that we are not making an improvement on that. In mitigation, PSA 13 is a good example of a target that was badly drafted at that time. It includes targets such as improving prospects and opportunities, which are very difficult things to measure and to get the data sets in place for. Many of the Xs are there because we could not evidence some of those issues.

67. The report refers to inaccuracies in the hard data on child poverty. A lot of that arose because of the system for measuring child poverty. We wanted to include data going back to 1998-99, but it was not collected until 2002-03. Therefore, we had to use other factors to work the data baseline back. As we were not able to make that data statistically accurate and to put intervals on it, from a statistician’s point of view, it was deemed to be a plausible estimate but not accepted as being statistically accurate. However, a Treasury assessment done in a different way came within 2% of our estimate. So we were working with figures that were plausible and could be taken forward to demonstrate improvement, but they were not statistically accurate.

68. Mr Dallat: Surely you should have been at the forefront of this initiative. What methods were you using that did not hit at all on the reality of child poverty?

69. Mr McMillen: We ended up with statistical evidence that set the baseline for child poverty, which was within 2% of an assessment carried out by the Treasury but was not statistically rigorous in terms of professional statisticians’ standards. However, our statisticians were happy that that was a plausible and good target for us to work off, and that was the baseline that we worked from.

70. Mr Dallat: Do you agree that child poverty is one of our single greatest injustices? The Good Friday/Belfast Agreement was signed 11 years ago, and a promise was made in section 75 of the Northern Ireland Act 1998 that there would be equality. You have had all the available resources at hand to produce statistics that would allow political decisions to be taken to address the difficulties faced by the most vulnerable people in society. Do you agree that you have failed in that regard?

71. Mr McMillen: I believe that we have produced baseline data that has allowed us to measure improvements in what we are doing to address child poverty. I accept that it has not met the required criteria, and we have put systems in place to try to improve on that. Child poverty is still a key target for the Department in the current PSAs.

72. Mr Dallat: Following on from that, I note from the 2008-09 delivery report that the delay in achievement against this target "reflects outstanding decisions on a measurement for severe child poverty".

Is that not a classic example of putting the cart before the horse?

73. Mr McMillen: When Ministers were developing the Programme for Government there was a desire to do something about severe child poverty, but there was no accepted definition of what severe child poverty was, nor was there a way to measure it. Nevertheless, Ministers wished to do something about it, and it is perfectly right that politicians should be able to say what is important and to express aspirations. Statisticians in the Department have been examining how we can measure severe child poverty; they have presented a number of options to Ministers and are awaiting a decision.

74. Mr Dallat: Do you agree that a definition should have been established before you started measuring it? Otherwise, you do not have a clue what you are measuring.

75. Mr McMillen: That is an important factor, but it should not necessarily limit the wishes of politicians and Ministers in setting targets for issues. Targets can, sometimes, be aspirational, after which we can try to find systems for measuring them. Certainly, where definitions can be arrived at beforehand, that makes it much easier to measure performance.

76. Mr Dallat: Do you agree that you were shooting at a target that you could not see, because you did not know what the target was?

77. Mr McMillen: At that stage, yes.

78. Mr Dallat: That is an awful indictment, given the levels of inequality in Northern Ireland that affect children, and, particularly, children who live in child poverty as we speak. Let us hope, Chairperson, that something positive comes out of this sooner rather than later. Otherwise, this Committee is wasting its time.

79. Mr McMillen: There are two sets of targets. There are targets for child poverty, which have statistics and measures agreed with them. We are trying to reach agreement on a definition of, and measurements for, severe child poverty, for which Ministers set targets.

80. Mr Dallat: Do you agree that those were the targets that you should have been focused on — the ones for severe child poverty?

81. Mr McMillen: When we can put a definition on it, everyone should be focused on the targets for severe child poverty.

82. Mr Beggs: The report sets out specific limitations in the old PSA targets for the Department of Agriculture and Rural Development (DARD), including a failure:

“to provide baseline figures or describe how a net increase in jobs would be calculated”.

Incorrect baseline figures were provided in another instance, and, in a third case, it was found that:

“the data system used to determine baseline information was not subsequently used”.

83. I will turn to the new system, and, in particular, PSA 4, ‘Supporting Rural Businesses’. One of your new targets is to:

“reduce by 25% the administration burden on farmers and agri-food businesses by 2013.”

Can you outline the baseline position for that target and define each of the key terms in it?

84. Mr Lavery: Under PSA 4 we have committed to reduce the administrative burden created by DARD on the agri-food sector by 25%. To take that forward, we had an independent review of the current burden on the agri-food sector. That review reported on 30 June and has completed its public consultation period. On the basis of the responses to that consultation and our own review of it, we will make a Government response.

85. That review used an international standard methodology to measure the impact of red tape on business, and the Department expects that that methodology will be followed through to measure its performance.

86. Mr Beggs: Can you define what that target is? A target must be SMART (specific, measurable, attainable, realistic and timely) if it is going to work; it must be clearly measurable. What is the Department actually measuring?

87. Mr Lavery: The Department will measure the burden on a farmer of, for instance, having to read, absorb and comply with the guidance to complete the Integrated Administration and Control System form or the single farm payment scheme form; being required to accompany a veterinary officer in the course of a brucellosis or bovine TB test; and completing his claim form. All of those operations take time, and that time will be calculated in accordance with the standardised methodology and converted into a financial impact. Currently, the sum total of the financial impact has been calculated at £50 million per year, and the Department is striving to impact on that figure.

88. Mr Beggs: Are you confident that an objective rather than subjective measure — which could cause great variation in the figures produced — will be used?

89. Mr Lavery: Absolutely. The Department will have an independent assessment of its performance; we will not simply be assessing it internally and marking our own homework. The Department will submit its findings to an external reviewer.

90. Mr Beggs: OK. Moving on, the fundamental principle of PSAs is to measure performance against the key business objectives of a Department. Paragraph 2.19 of the report refers to a complicated target on the annual supply of timber using agreed sale lots. It is not clear whether that target gets to the heart of the efficiency and effectiveness of the Forest Service, which is one of the Department’s key business areas.

91. Do you accept that measuring wood output does not measure the efficiency or the effectiveness of the Forest Service, as the Department could sell more wood by dropping its prices, or move too much wood and become unsustainable? Do you agree that that is not a SMART target?

92. Mr Lavery: First of all, it is important to have a range of targets when undertaking performance measurement. The more targets a Department has, the better it is for the public and for managers who are trying to drive performance. In this —

93. Mr Beggs: What does that target tell the Department?

94. Mr Lavery: In that case, the output target for the Forest Service was purposely set at 400,000 cubic metres of timber per year. The performance was driven to achieve that level of output, and the Forest Service was successful in achieving that target throughout the PSA period.

95. That, in itself, is very important and significant, but the key message behind that target was to reassure the timber processing industry that, during that three-year period, the Department did not intend to have an increase or a diminution in supply. That was an important reassurance to an industry that is a substantial employer in rural areas.

96. Mr Beggs: What does that target tell the Department about the efficiency or the effectiveness of its business? Are there any commercial-style targets in that area? The setting of an output target does not provide that commercial information on its own.

97. Mr Lavery: In addition to that target there is a suite of targets contained in the Forest Service business plan each year, on which the Department reports. It is that accumulated position that allows the Department to measure the efficiency and effectiveness of the Forest Service. In the area of timber supply, the Department benchmarks its prices and performance against the rest of the UK and the Republic of Ireland, and it examines the out-turn prices that it achieves. Therefore, there is a suite of commercially-driven targets in addition to the output target.

98. Mr Beggs: Would it not have been better to include those targets in the Department’s PSA?

99. Mr Lavery: At the time, we were under a regime that invited Departments to specify 10 targets that encapsulated them. The Department of Agriculture and Rural Development is complex and wide-ranging, and we made the best selection that we could at the time in the knowledge that the Forest Service was publishing its annual performance publicly.

100. Mr Beggs: I will move to another target. Appendix 1 of the report states that PSA target 1 seeks to:

“Reduce the gap in agricultural Gross Value Added (GVA) per full time worker equivalent (measured as Annual Work Units) between NI and the UK as a whole by 0.6 of a percentage point per annum between 2003 and 2008, i.e. from 34% in 2003 to 31% in 2008.”

Will you explain the meaning of “annual work unit”? Most people have an idea about GVA. However, that sentence is a very complicated construction, never mind how difficult it is to understand its meaning. What does it mean to the average man on the street who is meant to be reading these reports and assessing government actions?

101. Mr Lavery: That target was popular with the economists, if perhaps less popular with the man on the Clapham omnibus. However, the idea was to recognise the gap in productivity between Northern Ireland industry and the average productivity in the United Kingdom, and then to strive to reduce it. To do that, the Department has a suite of programmes that range from education and training to capital grants. We were trying to encapsulate the overall impact of those programmes, which, after all, must be intended to improve the competitiveness of our agriculture industry versus that in the rest of the United Kingdom and worldwide.

102. We had to try to get an overall measure. In that context, we had to produce a unit, which became the annual work unit. It is simply a way of finding a statistical unit that can be used as a unit cost or, in this case, a unit of performance. The industry has a wide range of working practices; there are full-time farmers, part-time farmers, farmer spouses and labourers. We reduced all those roles to a particular annual work unit, which is a statistic that can be used as an agriculture performance measurement that is acknowledged across the United Kingdom and that enables us to benchmark against the UK average.

103. Mr Beggs: Why was the targeted improvement so modest at 0.6% a year? Moreover, why did a 31% gap remain at the end of the period? If the target is so modest and the outcome so minimal, is all the effort to record the figures worthwhile?

104. Mr Lavery: A 3% improvement in productivity compared with the United Kingdom average would have equated to around a 10% improvement in our relative position. That would have been significant. However, the difficulty with this target, as the report makes clear, is that gross value added is a good way to measure the health of a sector or the health of a region. It is not a good way to measure the impact of our performance. We have discontinued its use to measure performance, because gross value added encapsulates many factors, most of which are outwith our control, including the exchange rate and movement in prices, which, unfortunately for our industry, is happening in both directions at the moment.

105. Mr Beggs: Have you set this as another target that has not been a SMART target? It must be borne in mind that SMART targets are a fundamental thing taught to anyone in business. Do you accept that this is the third area where you have not had a SMART target?

106. Mr Lavery: At that time, the focus was on outcome rather than output. SMART targets are a very good way of measuring the output of an area. I could specify that during the course of 2008-09 I would train 1,500 people and that they would arrive at a certain level of qualification. That is a SMART target; it measures the output. Our focus at that time was very much on outcome, and what that would mean to the economy. That target was designed to try to convey an overall economic out-turn.

107. Mr Donaldson: I declare an interest as a former Minister in OFMDFM, though not during the period covered by the report. Paragraph 2.3 notes that senior managers in Departments are ultimately responsible for the quality of PSA data systems and so on, yet paragraph 2.4 shows that the departmental management boards appear not to have taken an interest in PSA quality control and data accuracy — and that is not exclusive to your Department by any means. What systems and controls does DARD have in place to enforce that quality assurance function?

108. Mr Lavery: First, we have sought to comply with the guidance that OFMDFM has helpfully issued. Therefore, I chair a PSA delivery board, and that board meets quarterly. Each quarter we receive a progress report on our PSA targets, and we review those targets and look at any data quality issues that may present. The six-monthly reviews go to the Minister and the Committee for Agriculture and Rural Development, and we take care that they are also seen by our entire board, including the independent members. Each PSA now has a senior responsible officer (SRO) who is a member of the Senior Civil Service, and those officers are supported by data quality officers. Data quality officers are under absolutely no illusion that they are responsible for precisely that: the quality of the data that goes in to support the target, and so forth. That is the system.

109. Mr Donaldson: Yes; that is the system, but the evidence presented in the report indicates that, although management boards took an interest in ensuring that targets were in place and monitoring progress towards the achievement of those targets, when it comes to the PSA data systems — the data accuracy and quality control — there was no evidence that Departments had put in place formal guidance or policies specifically in relation to the standards expected. Although you have lines of management, which is good, have you now put in place those systems? Have you now got policies and formal guidance specifically designed to establish whether the PSA data systems, quality control and data accuracy are working for you?

110. Mr Lavery: I can offer you some reassurance. First, we have learned lessons from the Audit Office report. Secondly, we have ensured that there is a delivery board in the new PSA framework, which I chair. Because the delivery board is focusing on PSA performance, it has the time and the focus to drill down into the PSAs. Perhaps the departmental management board would not have the time to go that far in its meetings. Having a separate board to do that work will, obviously, improve the focus.

111. The third reassurance is that the Committee for Agriculture and Rural Development, I am delighted to say, takes a keen interest in the performance of the Department, and specifically our reporting of the performance on PSA targets. With that degree of invigilation, interrogation and the stimulating enjoyment of debate, I am satisfied that no stone will be left unturned.

112. Mr Donaldson: I am afraid that we are not, Mr Lavery. The problem is that I am not hearing that you have put in place formal guidance and policies to deal with this. It is all very well being able to drill down, but if you do not know what you are looking for — the standards that are required for quality assurance, data accuracy and quality control — how are you able to assess whether the people who are doing the drilling are doing their job and getting to the core issues?

113. Mr Beggs referred to your targets, and you gave an explanation for them. Perhaps it might assist if you were able to more clearly establish whether the whole system is delivering in relation to data control. I am anxious to hear whether you have put in place formal guidance or policies.

114. Mr Lavery: In that case, there are two further reassurances. First, the good practice checklist, which is contained in the annexe to the Audit Office report, was sent to all of our SROs and our data quality officers. To that extent, people know the standard; they know what they are trying to do.

115. Secondly, we have commissioned our internal auditors to carry out an in-depth review, specifically to establish whether DARD PSA data systems and reporting procedures are compliant with the good practice checklist. That has been commissioned, and it will report at the end of October. Thirdly, the guidance that was provided by OFMDFM in 2007 has been closely followed by DARD.

116. Mr Donaldson: It is good that we have guidance. Are you able to provide the Committee with a copy of the policies that you have to monitor the progress that is being made, particularly in relation to the standards that are expected from PSA data systems?

117. Mr Lavery: I am happy to do so.

118. Mr Donaldson: Although the report identified several weaknesses in the data systems that were used by your Department to support PSA targets in 2006-07, paragraph 1.12 outlines assurances from OFMDFM that things have improved significantly, and we heard the evidence from Mr McMillen about that. Can you explain to the Committee exactly how you have ensured that the data systems that support more recent PSA targets for your Department are robust?

119. Mr Lavery: All of the PSA targets were developed in the context of guidance that was set out by OFMDFM and circulated in the Department. They have all been the subject of reporting on progress to date. They have also benefited from the draft and final reports of the Audit Office in this investigation being circulated to the business areas named in the report, and from the good practice checklist that was circulated to all of the SROs and the data quality officers. Following the completion of the process, the memorandum of reply and the Public Accounts Committee’s conclusions will be circulated in the Department and specific lessons will be learned.

120. Mr Donaldson: Paragraphs 2.14-2.17 outline the need to involve professionals in the PSA process. Therefore, in reference to your new PSA, can you explain the extent to which your professional economists or statisticians were involved in designing your targets? What did your Department do as a result of their advice to ensure that each data system was fit for purpose? Going back to the point that was made by Mr Beggs, if you are falling well below a PSA target, do you involve your economists and statisticians to assess why that is the case and whether the target is robust?

121. Mr Lavery: The issue of involving specialists in the formulation and design of PSA targets and, indeed, in their operation is an area in which, in 2004, we did not perform as well as we should have. The reason for that is that, with the exception of the gross value added target, we relied heavily on administrative systems and management systems that collect information to measure out-turn. We did not see that the involvement of an economist or a statistician would have added significantly to those systems. In hindsight, that was wrong. Had we involved a statistician, we could have avoided many of the deficiencies and weaknesses that the report points to. We have taken that on board. This time, an economist has been involved from the start of the PSA dialogue through the design of the targets. In addition, the data quality officers all have access to an economist and a statistician to support them.

122. Mr Donaldson: Thank you for your honesty. Hopefully, in due course, the results of that will be seen in the PSA targets and the Department getting closer to meeting those targets.

123. Mr Lunn: Mr Sterling, you have had the benefit of listening to Mr Lavery answer a question on behalf of his Department. I ask you the same question. The report has identified weaknesses in the data systems used by the Department of Enterprise, Trade and Investment (DETI) to support the PSA targets that were in place during 2006-07.

124. In paragraph 1.12, OFMDFM assures us that things have improved significantly. On behalf of the Department of Enterprise, Trade and Investment, can you explain what has been done to ensure that the data systems supporting your Department’s more recent PSA targets are robust?

125. Mr David Sterling (Department of Enterprise, Trade and Investment): I will go back to 2007. The issue of the guidance was quite fundamental to the changes that we have put in place. In the autumn of 2007, that guidance was used to inform the development of PSAs within DETI, at the time that the PSAs and targets for the current Programme for Government were being developed. The intention was to comply with that guidance at that time.

126. In the Department, at that time, we left the responsibility for developing PSAs and targets with our economists. They took the lead in developing the draft PSAs, and were involved in negotiating those in the Department with our Minister and, ultimately, with OFMDFM and the Executive. Since then, further measures have been introduced. We have created a new division, the strategic planning, economics and statistics (SPES) division. We have brigaded our economists and statisticians, and cemented with that division the responsibility for the development of PSAs and targets, the maintenance of the data systems, the validation of the data systems, and the monitoring of performance. We have given that responsibility to our specialists.

127. The oversight of performance occurs at a variety of levels. Working from the bottom up, I can tell you that, on a quarterly basis, our SPES division —

128. Mr Lunn: What division?

129. Mr Sterling: The strategic planning, economics and statistics division; it is a bit of a mouthful.

130. That division engages with our various business areas and with our satellite organisations, such as the Tourist Board, Invest Northern Ireland, the Consumer Council, and the Health and Safety Executive. The division challenges the various business units within the Department, and the non-departmental public bodies, on their performance. There is then a formal oversight and liaison meeting with the various business units, and the non-departmental public bodies, at which senior management in the Department — including me, as senior responsible officer — engage with our counterparts in the various areas. The results and reports from those oversight and liaison meetings are reported to the departmental board.

131. Delivery boards are also in place for PSA 1 and PSA 5. Ultimately, we report our performance to the Enterprise, Trade and Investment Committee at the year-end. The Minister discussed Departments’ performance for 2008-09 at a Committee for Enterprise, Trade and Investment meeting in June. That is a broad overview of the framework that we have in place.

132. Mr Lunn: That sounds very comprehensive. Is that in line with what the Agriculture Department has done?

133. Mr Lavery: Yes. We are not quite twins, but we are closely related.

134. Mr Lunn: At paragraphs 2.23 to 2.26 it seems that limitations with your PSAs were not clearly disclosed. Why did you not consider it necessary to fully disclose limitations in the quality of the data, the methodology or the underlying systems for those important areas?

135. Mr Sterling: I distinguish between the previous Programme for Government and the arrangements that applied then, and the arrangements, processes and procedures that we have put in place for the current Programme for Government. I acknowledge all the deficiencies that have been identified.

136. We have sought in the technical notes for PSA 1 and PSA 5 to ensure that all limitations in our current data systems and procedures are identified. To the best of my knowledge, that is the case. However, to provide further satisfaction in that regard, our strategic planning, economics and statistics division will conduct a further review of the adequacy of the data systems in the run-up to the production of the next Programme for Government. We were satisfied, in a sense, with what was produced in 2007; however, we will validate that again before the process for developing the next Programme for Government starts next year.

137. Mr Lunn: Do you think that the quality of the data may still be deficient in some areas?

138. Mr Sterling: I do not think that we are perfect; however, we have addressed all the deficiencies that were identified in the report. Some issues will always be difficult to resolve entirely. One of the issues identified was the use of GVA as a measure of performance, because of the long time lag. There can be more than two years between performance and getting accurate GVA information from the Office for National Statistics (ONS).

139. Nonetheless, we still believe that GVA is a good measure of living standards and prosperity, because it is an internationally recognised standard. It is the right thing to be measuring. The fact that we do not have up-to-date information is a problem, but we are trying to address that. We have commissioned forecasts from an independent organisation. We will get the first set of forecasts in the next few weeks. Those forecasts will never be as accurate as the material from ONS. Nonetheless, they will be produced using a broadly similar methodology. We cannot address that limitation entirely; however, we believe that we are doing as much as we can do in the circumstances.

140. Mr Lunn: Do you accept that the financial information that was presented to the Assembly in 2007 should have carried a health warning or at least a caveat that it might not be reliable?

141. Mr Sterling: Yes.

142. Mr Lunn: What would you say about 2009?

143. Mr Sterling: Without going over what we did in 2007 again, we have sought to identify risks in the Department through our risk management processes. We have concluded at corporate level that our data systems do not pose a major risk. We have, therefore, concluded that our data systems are fit for purpose. We recognise that we will be challenged on that at some stage. We are carrying out an internal challenge, and risks have been identified in certain areas at lower levels.

144. For example, our statisticians have identified a risk with their role in dealing with ONS data. Through the Northern Ireland Statistics and Research Agency, they have engaged with ONS to ensure that we are satisfied that the quality of data is adequate.

145. Mr Lunn: The Department’s agency companies, particularly Invest NI, have received some unfortunate publicity in the past few weeks. I do not agree with all of it, but are the statistics upon which the criticism was based accurate?

146. Mr Sterling: Without going through the press commentary, I can say that we have put as much attention into the development of the data systems, targets and performance management arrangements for Invest NI as we have for the rest of the Department. The report identified deficiencies with some of Invest NI’s measures, for example, the use of GVA and the measures in relation to R&D expenditure. We believe that we have addressed those. On reflection, the comments about Invest NI’s performance in the press do not really include a criticism of its data systems. I hope that that answers your question.

147. Mr Lunn: That is a fair answer; thank you.

148. Ms Purvis: I shall follow on from the point that Trevor made. Mr Sterling, you spoke about your previous and continuing reliance on data sets from ONS. Paragraph 2.19b of the Audit Office report expresses reservations about using those data sets, particularly for the monitoring of progress against year-on-year PSA targets. You said that you were satisfied with what you produced in 2007, but, given the reservations that the report raises, how can any reliance be placed on the Department’s claims of achievement against the targets?

149. Mr Sterling: We acknowledge that the issue of the time lag on GVA-based measurements is a problem. We are trying to address that through the production of those forecasts, which will never be as accurate as the ONS statistics.

150. However, we use GVA as a measure of living standards and prosperity, and the overriding goal in the Programme for Government that we are working towards is the halving of the productivity gap between here and the rest of the United Kingdom by 2015, excluding the greater south-east of England. Northern Ireland’s productivity is currently 81% of that of the rest of the United Kingdom, excluding the greater south-east of England.

151. That target will not significantly change over time, and the independent review of economic policy acknowledged that significant improvements in our productivity will only occur over time. For that reason, using GVA is acceptable, because it looks at trends over a long time. I acknowledge that it would be good to have more accurate information more regularly, but it is difficult to do that.

152. Ms Purvis: Paragraphs 3.14 and 3.15 of the report mention the two-year time lag. Appendix 1 shows Departments’ measurements against PSAs. At the time, it was difficult to report on those. Can you confirm whether all those PSAs have been reported on, and what the outcomes were?

153. Mr Sterling: In each of those areas, we have either sought to address the limitations or, in some cases, agreed new targets. In part, that is to address the deficiencies that the report identifies.

154. Ms Purvis: Can you provide the Committee with that information?

155. Mr Sterling: I am happy to produce something that shows the way in which the targets have been improved or revised.

156. Ms Purvis: I appreciate that.

157. Mr Sterling: Our technical notes set out the current targets, but I understand that it would be helpful to be able to see how those have progressed. I am happy to provide that information to the Committee.

158. Mr McGlone: The discussion has become very technical, and data is whizzing about all over the place. Will you explain all these guidance issues, the memoranda and so on? Who enforces the guidance to ensure that the work is done? I was formerly the Chairperson of the Committee for the Environment. At the end of each year, the programme of work that had been carried out was compared with what had been presented at the beginning of the year. If the oversight and global management is not being done, how do you seriously expect to comply with the other PSA targets? If about 60% of the work programme is delivered through the Department and the Committee, how can everything else be expected to fall into place? Who has the oversight role to ensure that the work is done? How does the PSA activity fit in with that?

159. Mr Sterling: The guidance to which we have been working is DFP guidance that was produced in spring 2007. I have tried to explain how we have sought to ensure compliance with that guidance, and we have done that largely internally.

160. Mr McGlone: What if it is not being done? Where does the buck stop? Who ensures that it is done?

161. Mr Sterling: The buck stops with me.

162. Mr McGlone: I am not talking about your Department; I am talking about Mr McMillen’s Department, for example.

163. Mr McMillen: The key feature of the Programme for Government, which this report looks at, and where we are today is that the central challenge function has now been given to the team from OFMDFM, DFP Supply and PEDU.

164. Mr McGlone: Who is on that group, and when was it set up?

165. Mr McMillen: The group was officially set up after the Executive agreed the new Programme for Government monitoring system in March 2009. However, it has been doing administrative work behind the scenes for a couple of years. The group looks at the quarterly returns from Departments and challenges them to decide whether there is evidence to support the —

166. Mr McGlone: Challenging the returns and bringing home the bacon are two different things.

167. Mr McMillen: The group challenges the returns, and, if the evidence is not any different, it will present the report for the Executive. It is that report that informs the Executive of where Departments are having difficulty in delivering the Programme for Government targets. That information can be taken into accountability meetings between Ministers, and they can check on where Departments are having problems or not delivering the Programme for Government targets. Therefore, a system is in place that takes the matter right through to ministerial and Executive level.

168. Mr McGlone: I hope that we will see some product, then.

169. PSA 6 relates to the establishment of new businesses. Paragraph 3.18 of the report indicates that the PSA target did not square up with Invest NI’s corporate plan. I have looked at the Department of Enterprise, Trade and Investment’s justification of each of the targets. There are figures like 8,500 new businesses to be created and 3,000 new businesses in new targeting social need (TSN) areas. However, during the 2005-08 period, that changes to 10,000 new businesses, of which 40% were to be in new TSN areas. Does that mean that the target for new businesses has been reduced by 1,500?

170. Mr Sterling: Yes; the target was changed.

171. Mr McGlone: Can you understand how people such as me could be confused by that?

172. Mr Sterling: Yes.

173. Mr McGlone: In the Valence Technology report that the Committee produced last week, we recommended the development of a quantified target for Invest NI to ensure that the issues facing people from areas of economic and social disadvantage were addressed. One of the targets in PSA 6 relates to the establishment of 10,000 sustainable new businesses, of which 40% were to be in new TSN areas. I followed that target through to the 2008-11 Programme for Government and see that it has been revised to 6,500 new jobs from inward investment. Seventy per cent of the projects generating those jobs are to be within 10 miles of an area of economic disadvantage.

174. How does that square with your previous target? How different is it? Is there a disparity? If so, have you lowered or raised the target? Will you explain why the wording of the old target was changed?

175. Mr Sterling: I will explain that now, but I will also try to explain it in the note that I told Ms Purvis I would send, which will set out the reasons for changes so that we can compare the current Programme for Government with previous ones.

176. We accept the criticism in the Audit Office report that the use of the word “sustainable" was not sufficiently clear. However, for the sake of clarity and transparency, I should point out that we no longer have a target for the creation of new businesses. We are taking a different approach in which we are seeking to focus on the creation of wealth and prosperity, not just on starting businesses. Therefore, our targets are based on supporting people to export and to invest more in research and development. In other words, we are encouraging and supporting companies to take actions that generate more wealth. That is underpinning our current arrangements.

177. Mr McGlone: The targets are all for job-creation projects. The creation of wealth comes as a result of job-creation projects.

178. Mr Sterling: PSA 3 contains targets for job creation, and we work jointly with the Department for Employment and Learning on that. Therefore, we are still interested in job creation, but we are just as interested in taking steps that will generate wealth and prosperity. That is why there are targets for the type of earnings that we want to see in new jobs that are being promoted. We want to see companies being encouraged and supported to export and generate wealth in that way, and we want to encourage companies to market, sell abroad and invest in research and development. We have had careful discussions with the Enterprise, Trade and Investment Committee on those issues when we have been reporting on performance and targets.

179. Mr McGlone: Thank you. Did you just explain the change in the wording?

180. Mr Sterling: Yes, I did. However, as I said, it is only fair that I set it out in more detail in my letter.

181. Mr McGlone: You mentioned that you have been working to DFP guidance — it would be useful if we had a copy of that.

182. Why have you adopted a less challenging target for job creation?

183. Mr Sterling: I am not sure why there was a change in the target between the 2004-06 period and the later period. However, we do not now have a target for the creation of small businesses at all, and I have attempted to explain why that is. One of our main targets is to create 6,500 new jobs from inward investment, and there is a sub-target that 5,500 of those jobs should provide salaries above the Northern Ireland private sector median, which is about £17,000 per annum.

184. Mr McGlone: Of the projected 8,500 new businesses in the period 2004-07 and the 3,000 new businesses in new targeting social need areas, and also in the period 2005-08, how many businesses were actually established? That would be useful.

185. Mr Sterling: I will do that; I will give you a report that sets out what was achieved against the earlier targets. The delivery report, which I think you have looked at, shows the performance against the first year of the current Programme for Government targets.

186. Mr McGlone: Finally, will you describe the 10-mile issue?

187. Mr Sterling: It is an as-the-crow-flies measure.

188. Mr McLaughlin: Mr Pengelly, for the benefit of the Committee, will you outline your involvement in the process as head of the performance and efficiency delivery unit (PEDU) in DFP?

189. Mr Richard Pengelly (Department of Finance and Personnel): The role of PEDU is related to the preparation of the Programme for Government delivery report. Mr McMillen has mentioned that a central team was established to undertake an assessment of departmental performance against PSA targets, and that central team comprises officials from the economic policy unit of OFMDFM, supply officials from DFP and colleagues from PEDU. That team takes receipt of the submissions and returns from Departments on their progress against PSA targets. It carries out the initial assessments, plays those assessments back to the Departments and has a dialogue with the Departments. Ultimately, the team facilitates the preparation of advice and guidance, through Mr McMillen, to the First Minister and deputy First Minister and the Executive, with respect to overall performance.

190. Mr McLaughlin: Therefore, your involvement in that process is as the head of PEDU?

191. Mr Pengelly: Yes.

192. Mr McLaughlin: OK. You are probably aware that earlier this year DFP submitted a performance report to the Committee for Finance and Personnel, which outlined a number of measurements that had been achieved or substantially achieved — whatever that means in the context of a mid-year report. Through the questioning of departmental officials, it became apparent that those self-assessments could not be sustained, with the result that the report was sent back for review and resubmission.

193. The Committee for Finance and Personnel also discovered that the Programme for Government delivery report — which is produced by OFMDFM, and which you have just referred to — had differing assessments from the in-year report, and that is the theme that I want to explore. We have had discussions about the data sets that Departments are relying on, and I presume that it can be quite challenging to achieve a robust and uniform assessment across a number of Departments. Furthermore, I suspect that the vulnerability of those self-serving self-assessments is part of the problem.

194. We find that Departments are using one measurement tool or assessment scale, while the OFMDFM report relies on a completely different scale. With a relatively small region such as the North, why can uniformity not be achieved?

195. The whole purpose in having the PSAs is to have reliable and robust performance assessments which the general public can depend on. Their purpose is not to blind people with statistics or assessments, particularly when they do not have the same opportunity as a scrutiny Committee to drill down into those assessments. What is your involvement with respect to PEDU’s role in the assessments? Have you outlined or established some tangible improvements in the system?

196. Mr Pengelly: With respect to the specific problem that you have articulated, the short answer is no. One of the key differences between the central team and the Departments is that it carries out assessments on behalf of the Executive about cross-cutting performance by the Administration. That is one of the key changes —

197. Mr McLaughlin: It is a high-level perspective.

198. Mr Pengelly: Yes. It takes a very high-level view.

199. One of the fundamental changes between the PSAs that were the subject of the Audit Office report and the current crop of PSAs is that the departmental focus has now moved to a cross-cutting basis. However, some components of the PSAs still remain departmentally focused. Therefore, the central team has taken a strategic view of the PSAs, whereas the departmental assessments and the DFP business plan that you spoke about come from a departmental perspective, taking elements where they contribute to PSAs.

200. Because it is a departmental issue, it is used as an internal management tool by Departments, which establish their own criteria for assessing performance. DFP had a seven-point red/amber/green rating, whereas the central team uses a four-point rating. DFP has decided to move in parallel and adopt a four-point rating.

201. Mr McLaughlin: You will move to the OFMDFM system?

202. Mr Pengelly: Yes. That will be useful and will eliminate some of those differences in the future.

203. Mr McLaughlin: It is important that DFP has decided to address that problem of the differential in assessments by adopting a common assessment scale. Will that be applied across all Departments, or will the problem continue to exist elsewhere?

204. Mr Pengelly: Ultimately, that is an issue for individual Departments. The business plan is a departmental construct. It is for Departments and their respective Ministers to decide on what basis they want to assess and report performance of their own business entity.

205. The work that we do for the central team takes the Executive perspective, so we can be much more prescriptive about how Departments report for our purposes. However, Departments need that flexibility to report on their own business. Obviously, they will engage with their respective Committees on that matter.

206. Mr McLaughlin: Is that a formal position — that it is a matter for Departments? We are not talking about the European Commission here. All Departments can set their own monitoring system?

207. Mr McMillen: Departments have the independence to do that. However, you have raised an issue on which we need to reflect. It has caused confusion to members and, obviously, to the public that there are two monitoring systems. The first report, which was issued in June, was the first time that that had been exposed. Obviously, the Finance Committee has dealt with that. We need to reflect on that. On the other hand, it shows that there was a healthy challenge at the centre: people at the centre actually did not believe what Departments were saying.

208. Mr McLaughlin: That is good.

209. Mr McMillen: They said that they were willing to put it to the Executive, and to Ministers, that they did not believe what Departments were reporting. As we engage with Departments during the course of subsequent reports, we will be able to narrow the gap and to come up with a more consistent way to take that forward.

210. Mr McLaughlin: That is a helpful answer. Without giving you any more work to do, Richard, because I know that you cover quite a brief, is there not, in fact, a role for PEDU to consider reporting mechanisms? If the information is reliable — if every one of the Committees, MLAs and, in particular, the public can have the type of in-year, mid-term, and end-of-year performance reports that they can actually rely on — that will, ultimately, help everyone.

211. Mr Pengelly: It will. However — and I do not say this to be unhelpful — it goes back to the core rationale for the establishment of PEDU. The Finance Minister established it. I believe that PEDU has a role in that process. Fundamentally, however, PEDU’s role is to work with Departments to ensure delivery of individual priority targets. It is a small unit, and deliberately so — we do not want to create another overhead in the system.

212. Ministers might be reluctant to take it away from that sharp-edged focus on helping Departments to achieve successful delivery of targets to a more back-office assessment of the reporting framework. That sits best where the current policy responsibility lies, which is with our colleagues in OFMDFM. Certainly, because PEDU is in that delivery space, it can add value to that and can work with them. However, I am not sure —

213. Mr McLaughlin: Obviously, there is a fundamental weakness in the performance assessment process. It alarms me that your Department produced a report that was not robust. The Committee dismantled it in about 10 minutes. Officials had to accept that the report did not provide accurate and robust conclusions on their own performance. They had to take it back to your Department, review it, rewrite it, and bring it back to the Committee. Even subsequent to that — at another stage of the process — we found that the OFMDFM report came to different conclusions. That is hardly joined-up government, to say the least.

214. Mr Pengelly: I accept that. However, the debate that played out with the Finance Committee — unfortunately, I was not there for the first episode, although I had to be there for the second —

215. Mr McLaughlin: We were looking for you. [Laughter.]

216. Mr Pengelly: So I heard.

217. That was in parallel with the debate that played out when the central team was assessing DFP. I do not wish to justify differences of opinion, but some of the issues that have been very helpfully raised by the Audit Office about data systems and specificity on targets show that we are not there yet. We have made great strides since the publication of the report and the helpful good practice guide, but we are not there yet. For as long as we are not there, we are inherently in a place of subjective judgement. Every individual that you ask to offer a subjective view will offer a different one. I hope that it is a diminishing issue. However, I acknowledge that it is an issue, and one that we need to focus on and eliminate.

218. Mr McLaughlin: Appendix 4 fairly and accurately reflects improvements that have been made to the performance assessment, but section 3 of the report deals with difficulties in the compendium reports from the centre. Clearly, there is more to do.

219. Mr Pengelly: There is more to do. We talked about aligning the seven-point scale with the four-point scale, and we can do more with the timing. The first delivery report was a very difficult piece of work, particularly for Mr McMillen and his colleagues, in getting it turned round and approved by the Executive before the summer recess. It was very important to get it into the public domain before the recess.

220. There was slightly more of a time lag to the assessment of the departmental business plans. Many of the performance indicators are lag indicators, and there is always the risk that better, more informed information becomes available after the central delivery report, which may change the perspective. Those are the sorts of issues that we need to focus on and align better.

221. Mr McLaughlin: We will get a response to the report. You have a very significant and strategic role in helping to bring that together, and I would be very interested if the Committee could get your views on how the system can be improved.

222. Mr Pengelly: Yes, we will set that up.

223. The Chairperson: You will be glad to hear that you can take it easy now.

224. The PSA process has been set up to improve accountability in the public sector. It seems that there is more to do in some Departments to make that right with regard to information and performance. The session has given us an interesting insight into the quality of the performance information that has been reported by the Departments. We welcome the improvements introduced in 2007, but I urge OFMDFM and all the other Departments to comply fully with the best practice in this area. People have said that they will supply us with all the information and that may lead us to pose more questions after we reflect on today’s meeting. Thank you for coming in today.

Appendix 3

Correspondence

Correspondence of 1 October 2009
from the Committee for Finance and Personnel

DFP Logo

Assembly Section
Craigantlet Buildings
Stormont
BT4 3SX

Tel No: 02890 529147
Fax No: 02890 529148
email: Norman.Irwin@dfpni.gov.uk

Mr Shane McAteer
Clerk
Committee for Finance and Personnel
Room 419
Parliament Buildings
Stormont 1 October 2009

Dear Shane

DFP Performance Against 2008/09 PSA Targets

Following an evidence session with the Committee for Finance and Personnel on 16 September, DFP officials undertook to produce a further paper commenting on the reasons for differences between the status of 2008/09 PSA targets provided to the Committee in the Department’s revised paper of 29 June and those reported in the Programme for Government Delivery Report subsequently published by OFMDFM.

Of the 23 PSA targets for which DFP has responsibility, there are 5 where the assessment provided by DFP differs materially from that in the OFMDFM Delivery Report. In the Department’s view, there are 3 key reasons for these differences:

1. Difference in the Reporting Frameworks

The assessment scales used by DFP and the Central Assessment Team differ. DFP has been using a 7-point scale to report progress, using the terminology “Achieved”, “Substantially Achieved”, etc. On the other hand, the OFMDFM Report uses a 4-point scale: Red, Amber, Amber/Green and Green.

The differing scales clearly have the potential to produce different headline assessments so, for future reporting cycles, DFP will use the same reporting scale as the OFMDFM central team.

2. Assessment of Headline Status

Often a PSA target will encapsulate complex programmes with a range of actions. The process of synthesising the array of progress information into a single headline status indicator is subjective, requiring the exercise of judgement. This means that different parties can make different judgements.

The DFP report includes a commentary on the various targets, which provides additional background and justification in relation to the status that has been assessed. The OFMDFM report does not provide such a commentary to explain or expand on its assessment of progress. This makes it difficult to explain differences in the two assessments.

The department responsible for delivering the various PSAs will clearly have more information and detail about the progress of any particular target, given that it is more closely involved with the delivery. Accordingly, DFP will continue to ensure that it passes as much information as possible to the OFMDFM Central Team to inform its assessment.

3. Timing

The OFMDFM report became available after the DFP report was revised and forwarded to the Committee on 29 June. In future, it may be preferable for DFP to finalise its report after OFMDFM has published its end year report, to minimise the risk of variance.

We do hope these actions will keep to a minimum any variances between the DFP and the OFMDFM reports. However, there may be instances where we simply have to agree to differ; indeed there may not be much point in having an external assessment of DFP’s (and other Departments’) performance if the two were always, and by definition, exact duplicates of each other. In such instances, DFP is more than willing to justify its assessment by reference to detailed information on progress.

Yours sincerely,

signature - Norman Irwin

Norman Irwin

Chairperson’s Letter of 12 October 2009 to
Mr Gerry Lavery

NIA logo

Public Accounts Committee

Mr Gerry Lavery
Senior Finance Director
Department of Agriculture and Rural Development
Dundonald House
Upper Newtownards Road
Belfast
BT4 3SB

Room 371
Parliament Buildings
BELFAST BT4 3XX

Tel: (028) 9052 1208
Fax: (028) 9052 0366
E: pac.committee@niassembly.gov.uk

12 October 2009

Dear Gerry,

Evidence Session on Public Service Agreements

Thank you for attending the evidence session on Public Service Agreements.

As agreed at the meeting, the Committee would like you to provide a paper detailing the policies and guidance issued by DARD which specifically relate to the monitoring of progress and effectiveness of the data systems used by the Department.

I should appreciate your response by 2 November 2009.

I am also copying this letter to David Thomson, Treasury Officer of Accounts, for his information.

Yours sincerely,

signature - Paul Maskey

Paul Maskey

Chairperson
Public Accounts Committee

Chairperson’s Letter of 12 October 2009 to
Mr Richard Pengelly

NIA logo

Public Accounts Committee

Mr Richard Pengelly
Public Spending Directorate
Department of Finance and Personnel
Rathgael House
Balloo Road
BANGOR
BT19 7NA

Room 371
Parliament Buildings
BELFAST BT4 3XX

Tel: (028) 9052 1208
Fax: (028) 9052 0366
E: pac.committee@niassembly.gov.uk

12 October 2009

Dear Richard,

Evidence Session on Public Service Agreements

Thank you for attending the evidence session on Public Service Agreements.

The Committee agreed to request a detailed overview from you and your unit of improvements that would enhance the effectiveness of processes underpinning Public Service Agreements.

Your perspective will be considered by the Committee in formulating its report.

I should appreciate your response by 2 November 2009.

I am also copying this letter to David Thomson, Treasury Officer of Accounts, for his information.

Yours sincerely,

signature - Paul Maskey

Paul Maskey

Chairperson
Public Accounts Committee

Chairperson’s Letter of 12 October 2009 to
Mr David Sterling

NIA logo

Public Accounts Committee

Mr David Sterling
Deputy Secretary
Department of Enterprise, Trade and Investment
Netherleigh
Massey Avenue
Belfast
BT4 2JP

Room 371
Parliament Buildings
BELFAST BT4 3XX

Tel: (028) 9052 1208
Fax: (028) 9052 0366
E: pac.committee@niassembly.gov.uk

12 October 2009

Dear David,

Evidence Session on Public Service Agreements

Thank you for attending the evidence session on Public Service Agreements on 8 October 2009.

As agreed at the meeting, the Committee would be grateful if you could provide the following information:

1. An overview detailing which PSA targets for your Department were revised or improved, and the results achieved against both the original and the current targets, for comparison, including performance against targets for business and wealth creation in TSN areas.

2. A copy of the guidance issued by the Department of Finance and Personnel, used by the Department and referred to by you during the evidence session.

I should appreciate your response by 2 November 2009.

I am also copying this letter to David Thomson, Treasury Officer of Accounts, for his information.

Yours sincerely,

signature Paul Maskey

Paul Maskey

Chairperson
Public Accounts Committee

Chairperson’s Letter of 13 October 2009
to Mr Stephen Peover

NIA logo

Public Accounts Committee

Mr Stephen Peover
Accounting Officer
Department of Finance and Personnel
Rathgael House
Balloo Road
Bangor
BT19 7NA

Room 371
Parliament Buildings
BELFAST BT4 3XX

Tel: (028) 9052 1208
Fax: (028) 9052 0366
E: pac.committee@niassembly.gov.uk

13 October 2009

Dear Stephen,

Evidence Session on Public Service Agreements

I write with reference to the above evidence session on Public Service Agreements conducted by the Committee on 8 October 2009.

The Committee sought an answer to the following question which was put to Mr McMillen, Accounting Officer for OFMDFM:

“…the extent of weaknesses in data systems has led to real concern over the reliability and accuracy of performance achievement claims. Can you assure this Committee that reliance was not placed on any such data systems when determining senior manager bonuses for performance?"

Mr McMillen responded to the Committee that he was unable to provide an answer to this question as it did not fall within his remit but was a matter for permanent secretaries.

The Committee would be grateful if you could clarify this and provide a copy of the guidance issued by the department in relation to the issue.

I should appreciate your response by 2 November 2009.

I am also copying this letter to David Thomson, Treasury Officer of Accounts and Mr John McMillen, Accounting Officer for their information.

Yours sincerely,

signature - Paul Maskey

Paul Maskey

Chairperson, Public Accounts Committee

Correspondence of 15 October 2009
from Mr Stephen Peover

DFP logo

 

From the Permanent Secretary

Stephen Peover

Rathgael House
Balloo Road
BANGOR, BT19 7NA

Tel No: 028 91277601
Fax No: 028 9185 8184
E-mail: stephen.peover@dfpni.gov.uk

Mr Paul Maskey MLA
Chairperson
Northern Ireland Assembly
Public Accounts Committee
Room 371
Parliament Buildings
BELFAST
BT4 3XX

15 October 2009

Dear Mr Maskey

Evidence Session on Public Service Agreements

Thank you for your letter of 13 October in which you requested an answer to the question put to Mr John McMillen, Accounting Officer for OFMDFM, seeking assurance that reliance was not placed on the data systems used for assessing the delivery of Public Service Agreements when determining performance bonuses for senior managers.

In responding to the question, I believe it would be helpful for the Committee if I were to explain the performance management arrangements in place for senior civil servants in the Northern Ireland Civil Service (NICS) and how these relate to decisions on the payment of non-consolidated bonus awards.

The performance management arrangements for senior civil servants, as with other staff, involve a number of key points during the annual reporting cycle. At the start of the reporting period each senior civil servant is required to agree with his or her line manager a number of personal objectives, against which performance will be assessed during the year. Guidance is issued to senior civil servants on how such objectives should be constructed. Over recent years the personal objectives for senior civil servants have been required to cover corporate objectives (objectives relating to the individual’s contribution to the wider work of the department or the NICS); business objectives (the specific objectives to be delivered in the business area in which the individual works); and capacity related objectives (objectives relating to how the individual contributes to improved capacity or capability in his/her own business area or the wider department through, for example, efficiency measures, improvement in processes, learning and development, etc.). The personal objectives agreed for each of the approximately 220 senior civil servants in the NICS will differ according to the individual’s specific post and range of responsibilities.

At the mid point of the reporting period, line managers hold a mid-year review with individuals formally to assess progress to date against the personal objectives and to agree any in-year changes. At the end of the reporting period, a formal appraisal discussion is held between individuals and their line managers to assess performance during the year. A written Performance Appraisal Report is completed by the line manager which records the extent to which personal objectives have been met and the manner in which objectives have been met, in particular the leadership and professional skills that have been demonstrated.

Following the completion of end of year performance appraisals, departmental Pay Committees meet to make recommendations to Permanent Secretaries on annual pay awards to individual members of the senior civil service in accordance with annual senior civil service pay strategies, which are agreed by the Minister for Finance and Personnel. Permanent Secretaries make final recommendations on the pay awards for all the senior civil servants in their respective departments. These recommendations are moderated as appropriate by a NICS Senior Civil Service Pay Committee.

I enclose for information the Senior Civil Service Pay Strategy 2008 which describes how the process operated in relation to the 2007/08 reporting period and, in particular, the criteria used for awarding non-consolidated bonus payments. There were no non-consolidated bonus payments to senior civil servants in respect of the 2008/09 reporting period.

These performance management arrangements provide for detailed engagement between senior civil servants and their line managers in setting personal objectives, focusing very clearly on what is expected from individuals in relation to their specific jobs and ensuring that ample opportunity exists for discussion and assessment of whether and how personal objectives have been achieved. Obviously I am not privy to the individual personal objectives agreed between each of the 220 senior civil servants in the Northern Ireland Civil Service and their line managers, but I am confident that the performance management arrangements in place and decisions that have been taken on whether or not to award non-consolidated bonus payments are based on a robust performance management system and end of year performance appraisals. They are not based on the data systems underpinning the delivery of Public Service Agreements.

I hope this explanation is helpful to the Committee. I am copying this letter to David Thomson and John McMillen for their information.

Yours sincerely

signature - Stephen Peover

Stephen Peover

Northern Ireland Civil Service
Senior Civil Service Pay Strategy 2008

1. This pay strategy provides the basis for the Senior Civil Service (SCS) pay award in the Northern Ireland Civil Service for 2008. It reflects the recommendations of the Senior Salaries Review Body (SSRB) and the accompanying Cabinet Office guidelines where those guidelines are consistent with the business needs of the Northern Ireland Departments.

2. The aims of the Northern Ireland Civil Service SCS pay strategy are:

3. The Northern Ireland Civil Service generally shadows the broad framework of the pay arrangements in place for SCS staff in the Home Civil Service. In practice this means that the Northern Ireland Civil Service adopts the overall cost envelope and size of bonus pot as recommended by the SSRB and endorsed by Cabinet Office, as well as the pay band minima and maxima for SCS staff. The NICS will, however, determine for itself all other variables such as the base pay matrices, percentage of staff eligible for bonus payment and the level of bonus payments, reflecting local considerations.

4. The SCS pay award is effective from 1 April 2008 and payable to staff in post at that date. It rewards performance over the year 1 April 2007 to 31 March 2008. There will be no staging of any part of the award.

5. There are two core elements to the SCS pay award – a base pay award and a non-consolidated bonus. The base pay award is determined on an assessment of performance, growth in competence, job challenge relative to peers and an assessment of confidence in future performance. The award of a non-consolidated bonus is based on in-year delivery relative to objectives or other short-term personal contributions, as measured against specified criteria. Non-consolidated bonuses are not dependent on tranche allocation for the base pay award.

Pay Band Structure

6. The NICS operates a pay band structure for the SCS below Permanent Secretary as indicated below. Pay Band 1 covers Assistant Secretary (Grade 5) level posts. Pay Band 2 covers Deputy Secretary (Grade 3) posts. The pay bands effective from 1 April 2007 were:

1 April 2007
Pay Band   Minimum (£)   Progression Target Rate (PTR) (£)   Recruitment & Performance Ceiling (RPC) (£)
1          56,100        78,540                              116,000
2          81,600        N/A                                 160,000

7. With effect from 1 April 2008 the minimum of Pay Band 1 will be increased to £57,300 in line with SSRB recommendations. In line with Cabinet Office guidance, the Progression Target Rate (PTR) for Pay Band 1 will be abolished. There will be no uplift to the other elements of the Pay Bands. From 1 April 2008 the Pay Bands will therefore be as follows:

1 April 2008
Pay Band   Minimum (£)   Recruitment & Performance Ceiling (RPC) (£)
1          57,300        116,000
2          81,600        160,000

Base Pay Award

8. From April 2004 the Cabinet Office introduced greater flexibility in order to allow Departments more clearly to differentiate between levels of performance and reward within an overall cost envelope. In 2008 the cost envelope is 2.5% for base pay, which limits the scope for significant differentiation. This represents year one of a three-year pay settlement with an indicative cost envelope of 7% growth in the SCS paybill for the period 2008-2011, as recommended by the SSRB. For the 2008 award, the aim has been to get the majority of SCS staff as close as possible to the headline increase in the SCS paybill of 2.5%, while allowing for some limited differentiation for Tranche 1 and Tranche 3 performers. Accordingly, the following pay arrangements for Pay Bands 1 and 2 will apply for the 2008 SCS pay award.

Pay Bands 1 & 2

Performance Tranche Award
TRANCHE 1 2.75%
TRANCHE 2 2.5%
TRANCHE 3 1%

9. Any SCS member in Tranche 3 whose performance is deemed unsatisfactory will receive no base pay award.
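For illustration only, the tranche percentages at paragraph 8 and the rule at paragraph 9 can be expressed as a short calculation sketch (Python). The salary figure used in the example is hypothetical and does not appear in the strategy.

# Illustrative sketch only: tranche percentages from paragraph 8 applied to a
# hypothetical salary; the paragraph 9 rule for unsatisfactory Tranche 3
# performance is included. Not an official calculator.
TRANCHE_AWARDS = {1: 0.0275, 2: 0.025, 3: 0.01}

def base_pay_award(salary, tranche, unsatisfactory=False):
    """Return the consolidated base pay increase for the 2008 award."""
    if tranche == 3 and unsatisfactory:
        return 0.0  # paragraph 9: no base pay award
    return round(salary * TRANCHE_AWARDS[tranche], 2)

print(base_pay_award(70_000, 1))  # hypothetical Tranche 1 performer: 1925.0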

10. Base pay awards will be based on the individual’s value or contribution to the Department and more widely, marked by:

(a) an assessment of the individual’s performance;

(b) the individual’s overall growth in competence;

(c) the challenge associated with the job relative to peers; and

(d) confidence in the individual’s future performance, based on sustained past performance.

11. All base pay awards will be paid as a percentage of the individual’s salary. The overall targets for performance tranche distribution for SCS staff are:

Tranche 1 25%
Tranche 2 65-70%
Tranche 3 5-10%

12. The Progression Target Rate for Pay Band 2 was removed with effect from 1 April 2006 and it has been removed for Pay Band 1 from 1 April 2008. In practice this means that there will no longer be capping arrangements in place on consolidated pay progression for those staff who are not Tranche 1 performers.

Pay Decisions

13. Pay recommendations on tranche allocation for Grade 5 level will be considered at Departmental Pay Conferences comprising the Permanent Secretary and the line managers of SCS Grade 5 staff. The tranche allocation for staff at Grade 3 level will be determined by the Permanent Secretary. Each Permanent Secretary will then bring to the NICS SCS Pay Committee his or her proposed Tranche 1, 2 and 3 awards for both Grade 5 and Grade 3 level staff (in line with the distribution at paragraph 11 above).

NICS SCS Pay Committee

14. The role of the NICS Pay Committee is to ensure consistency of outcome with the terms of the agreed pay strategy and to provide a moderation role if required. The Committee will consist of all NICS Permanent Secretaries and will be chaired by the Permanent Secretary of the Department of Finance and Personnel. The Head of the Civil Service will not participate in moderation discussion.

15. Pay recommendations are not final until after any SCS Pay Committee moderation, and pay recommendations should therefore not be divulged to individuals until the Committee’s task is complete and each Permanent Secretary has been informed in writing by the Secretary of the Pay Committee (the Director of Central Personnel Group).

Bonuses

16. Eligibility for non-consolidated bonuses is distinct from the considerations for the base pay award. Permanent Secretaries have responsibility for determining the award of non-consolidated bonuses for both Grade 5 and Grade 3 level staff. In considering bonus allocations for Grade 5 level staff, Permanent Secretaries should take account of the views of the line managers of Grade 5 staff at the Departmental Pay Conference.

17. The SSRB Report 2008 recommended that the bonus pot for the SCS pay award should be increased by 1% to 8.6% of the SCS paybill. It has been determined that approximately 75% of NICS SCS staff should be eligible to receive a non-consolidated bonus, split equally across the three levels.

18. There will be three levels of Non-Consolidated Bonus Payments for SCS staff at Grade 5 and Grade 3 level.

Level Amount Distribution
Level 1 £10,500 25%
Level 2 £7,750 25%
Level 3 £5,000 25%
No Bonus 25%
Total £1,103,250 99.59% of Bonus Pot

19. Eligibility for a non-consolidated bonus should reflect achievements against performance objectives and the leadership and professional skills demonstrated in meeting these objectives. Specifically, the criteria to be used in considering the award of non-consolidated bonuses are:

20. The following definitions will determine the level of bonus to be paid to those who meet the eligibility criteria:

Bonus Level Definition
Level 1 Achieved objectives despite the most difficult political, operational and economic environment, displaying exceptional leadership in the business area and more widely.
Level 2 Achieved objectives despite moderately difficult political, operational and economic environment, displaying a high level of leadership in the business area and more widely.
Level 3 Achieved objectives encountering only some political, operational and economic obstacles, displaying leadership in the business area and more widely.

Pay Progression

21. It has been agreed that a special feature of the 2008 SCS pay award should be an attempt to address structural issues in the SCS pay system which have resulted in a lack of pay progression for those at or near the bottom of Pay Band 1, and in some staff in this category being paid less than staff in the grade below. Restrictions on the size of the SCS pay award mean that measures to address this issue will have to be tightly targeted.

22. For the purposes of this exercise the following rules will apply (an illustrative worked sketch follows the list):

(i) The base pay element of the 2008 pay award will be applied first.

(ii) Following the application of increases to the base pay award, SCS staff within the following parameters will be identified:-

a. Staff in Pay Band 1 who have been in the grade for 5 years or more on 1 April 2008; and

b. Staff with a salary of £62,407 (the maximum of Grade 6) or less on 1 April 2008 (following the award of the 2008 pay increases);

(iii) Each officer identified will receive an additional £1,000 increase to their base pay on a consolidated basis.

(iv) Officers with a salary between £62,408 and £63,407 who have 5 years or more in the grade will be considered for a consolidated payment of up to £999, to ensure that their pay is not overtaken by that of an officer benefiting from this special exercise.
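A minimal sketch (Python) of how the rules at paragraph 22 might be applied, using hypothetical officer details. The thresholds (£62,407, the £62,408 to £63,407 range, and five years in the grade) are taken from the rules above; the precise amount payable under rule (iv) is shown as one possible interpretation of "up to £999".

# Illustrative sketch of the paragraph 22 rules; officer details are hypothetical.
GRADE6_MAX = 62_407  # maximum of Grade 6, per rule (ii)(b)

def targeted_uplift(salary_after_2008_award, years_in_band1_at_april_2008):
    """Return the additional consolidated payment, if any, under rules (ii)-(iv)."""
    if years_in_band1_at_april_2008 < 5:
        return 0
    if salary_after_2008_award <= GRADE6_MAX:
        return 1_000  # rule (iii): flat 1,000 consolidated increase
    if GRADE6_MAX + 1 <= salary_after_2008_award <= GRADE6_MAX + 1_000:
        # rule (iv): up to 999, so pay is not overtaken by an officer who
        # received the 1,000 uplift (one possible reading of the rule).
        return min(999, GRADE6_MAX + 1_000 - salary_after_2008_award)
    return 0

print(targeted_uplift(61_500, 6))  # 1000
print(targeted_uplift(62_900, 7))  # 507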

Promotion Increases

23. Promotion shall be defined as a permanent move to a post in a higher pay band. On promotion an individual’s base pay will either increase to the minimum of the higher band or he/she will receive a 10% rise, whichever is the greater. The same principles will apply to staff on temporary promotion.
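The promotion rule at paragraph 23 can be illustrated with a short sketch (Python), using the 1 April 2008 band minima from paragraph 7; the current salaries shown are hypothetical.

# Illustrative sketch of the paragraph 23 rule: on promotion, base pay becomes
# the higher of the new band minimum and current pay plus 10%.
BAND_MINIMUM = {1: 57_300, 2: 81_600}  # 1 April 2008 figures from paragraph 7

def pay_on_promotion(current_pay, new_band):
    return max(BAND_MINIMUM[new_band], round(current_pay * 1.10))

print(pay_on_promotion(54_000, 1))  # 59,400 - the 10% rise exceeds the Band 1 minimum
print(pay_on_promotion(70_000, 2))  # 81,600 - the Band 2 minimum exceeds the 10% rise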

Pay Communication

24. Pay decisions by the Pay Committee will be communicated in writing to each member of the SCS by their Permanent Secretary.

Appeals

25. Staff can appeal against the outcome of the Pay Committee within 10 days of the date of their pay notification by the Permanent Secretary, using the NICS Grievance Procedures. The Pay Committee’s decision cannot be challenged on the basis that an individual simply disagrees with the assessment of their contribution in relation to that of their peers. An appeal must be based on the following grounds:

Department of Finance and Personnel

August 2008

Correspondence of 28 October 2009 from
Mr David Sterling

DETI Logo

From the Permanent Secretary

David Sterling

Netherleigh
Massey Avenue
BELFAST BT4 2JP

Telephone: (028) 9052 9441
Facsimile: (028) 9052 9545
Email: david.sterling@detini.gov.uk
janice.davison@detini.gov.uk

Our ref: NPS DETI 036/09
Your ref:

28 October 2009

Paul Maskey
Chairperson
Public Accounts Committee

Dear Paul

Evidence Session on Public Service Agreements

1. Thank you for your recent correspondence following the evidence session on Public Service Agreements on 8 October 2009. At the meeting, I agreed that I would respond in writing to a number of specific queries raised by members of the Committee. As you broadly indicated, these relate to:

i. the perceived change to the target to establish 10,000 new businesses;

ii. performance against the DETI / Invest NI targets referred to in the Northern Ireland Audit Office (NIAO) report;

iii. the definition of related targets in the 2008-11 Programme for Government (PfG); and

iv. progress to date on delivering against these related targets in the 2008-11 PfG.

2. I will address each of these issues in turn. Firstly, we had some discussion at the meeting on how the reporting of performance on the target to establish 10,000 new businesses had varied. As the NIAO report indicates, any confusion in reporting against PSA targets was a reflection of the complex nature of the previous PSA reporting process, which resulted in PSA targets not aligning with Invest NI’s Corporate Plan reporting periods and their associated targets. This is something DETI / Invest NI has sought to avoid in the current 2008-11 reporting period. However, I would like to take this opportunity to again confirm that while the reporting of the 2005-08 target may have varied in light of the specific time period considered, there was no change to the underpinning target of 10,000 business starts during the 2005-08 period as a whole. Secondly, members also commented on the time lag associated with some of DETI’s indicators and asked for an update on performance against the 2005-08 targets referred to in the NIAO report. This is provided in Table 1 in Annex A of this note. This information reveals that DETI / Invest NI have made good progress in delivering on 2005-08 commitments.

3. I also outlined to Members the shift in focus in the current PfG towards a greater emphasis on the support for value-added investment and business activity. This has naturally resulted in a similar change in the definition of targets in light of the priority that is now attached to improving private sector productivity in Northern Ireland. Table 2 in Annex A maps relevant targets from the current PSA framework against those 2005-08 targets referred to in the NIAO report for Members’ information. Further information on the specification of these targets, and the data sources used to monitor progress, is provided in the technical notes which are published on the DETI website (available at http://www.detini.gov.uk/deti-about-home/about-corporate-planning.htm).

4. The Department monitors progress on current PfG targets on a quarterly basis, and shares this information with the ETI Committee. The latest monitoring information available relates to Quarter 1 2009/10. Information from this report to the ETI Committee on those indicators relevant to our discussions is provided in Table 3.

5. Finally, I also attach (at Annex B) a copy of the guidance, Priorities & Budget 2007 - Guidance for Departments, issued by the Department of Finance and Personnel, which I referred to in my evidence. Section II of this document (pages 10-18) sets out the performance management arrangements for the current PfG.

6. I trust this information is helpful.

Yours sincerely

signature - David Sterling

David Sterling

Annex A: Performance Information

Table 1: Progress against 2005 - 2008 targets

Target Progress (To 31 March 2008)
PSA 4 – By March 2008 reduce the productivity gap (measured by GVA per hour worked) with the UK. Data published by the Office for National Statistics reveals that Gross Value Added (GVA) per hour worked in Northern Ireland increased from 81.2 per cent of the UK average in 2005 to 84.1 per cent in 2007. Figures for 2008 are due to be released in December 2009.
PSA 5 – By March 2008, business expenditure on R&D by Invest NI client companies to increase at a rate faster than that of comparable UK regions, so as to reduce the current gap in business intramural R&D expenditure as a percentage of GVA. Figures relating to 2007 indicate that NI BERD has increased at a faster rate than the average for the UK as a whole, and when expressed as a percentage of GVA, the gap has narrowed by three percentage points. Final 2008 figures will be available December 2009.
PSA 6 – During the period 2005-2008, support the establishment of 10,000 new businesses, of which 40% will be in New TSN areas. Over the period 1st April 2005 to 31st March 2008, there were 9,991 locally-owned start-up projects. The target of 40% was set including special status areas, and on this basis, results indicate an achievement of 36% in this regard.
PSA 7 – By March 2008, increase the level of exports as a percentage of total sales by Invest NI client companies (excluding the top 25 exporting clients in 2003) to 30%. Exports as a percentage of total sales by Invest NI client companies (excluding the top 25 exporting clients in 2002/03) was 31.9% in 2007/08.
PSA 8 – By March 2008, increase annual visitor spend to £518 million. The outturn figure for domestic and out of state tourism at the end of 2007 was £535 million. This figure is based on an assessment of the 2007 calendar year – Jan to Dec – and therefore suggests that the target figure of £518 million to the end of March 2008 had been exceeded. The total revenue in 2008 (calendar year) was £540 million – an increase of 5% over 2007. Together these figures suggest that the PSA 8 figure was surpassed.

Table 2: Mapping 2005-08 targets and related targets in the 2008-11 PfG

2005-08 Target Related 2008-11 Target
PSA 4 – By March 2008 reduce the productivity gap (measured by GVA per hour worked) with the UK. The 2008-2011 Programme for Government has identified a new, long-term goal to aim to halve the private sector productivity gap with the UK (excl GSE) by 2015.
PSA 5 – By March 2008, business expenditure on R&D (BERD) by Invest NI client companies to increase at a rate faster than that of comparable UK regions, so as to reduce the current gap in business intramural R&D expenditure as a percentage of GVA. Increase the BERD expenditure in Invest NI client companies with less than 250 employees by an 8% Compound Annual Growth Rate (CAGR)
Increase the BERD expenditure in Invest NI client companies with greater than 249 employees by a 5% CAGR
The level of export sales as a percentage of total sales by Invest NI client companies, excluding the Top 25 exporting companies, to increase by 3 percentage points
PSA 6 – During the period 2005-2008, support the establishment of 10,000 new businesses, of which 40% will be in New TSN areas. 6,500 new jobs from inward investment
6,500 new jobs from inward investment of which 5,500 will provide salaries above the Northern Ireland Private Sector Median
6,500 new jobs from inward investment of which 2,750 will have salaries at least 25% above the Northern Ireland Private Sector Median
70% of new FDI projects secured to locate within 10 miles of an area of economic disadvantage
Support 45 new start-ups exporting outside the UK and 300 exporting to GB
PSA 7 – By March 2008, increase the level of exports as a percentage of total sales by Invest NI client companies (excluding the top 25 exporting clients in 2003) to 30%. Maintain the CAGR in external sales per employee by Invest NI manufacturing clients at 6%
Increase in the CAGR in external sales per employee by Invest NI tradable services clients to 4%
The level of export sales as a percentage of total sales by Invest NI client companies, excluding the Top 25 exporting companies, to increase by 3 percentage points
PSA 8 – By March 2008, increase annual visitor spend to £518 million. Increase tourism revenue from out-of-state visitors to £520m by 2011 from a baseline of £370m in 2006 (Please note that the 2008-11 target relates to out-of-state visitors only, while the previous target also includes domestic visits)

Table 3: Progress against 2008-2011 targets

2008-11 Target Progress (To 30 June 2009)
Aim to halve the private sector productivity gap with the UK (excl GSE) by 2015. DETI has commissioned independent forecasts to assess performance against this goal. These are due in November 2009.
Increase the BERD expenditure in Invest NI client companies with less than 250 employees by an 8% CAGR Latest data from the 2007 Research and Development report from DETI provides provisional baseline statistics for these targets. In SMEs (0-249 employees) R&D expenditure was £102.7m in 2007. To achieve the target, expenditure is required to increase by almost £27m to £129.3m. Next update is available in March 2010. (A worked check of this compound growth arithmetic is shown after Table 3.)
Increase the BERD expenditure in Invest NI client companies with greater than 249 employees by a 5% CAGR Latest data from the 2007 Research and Development report from DETI provides provisional baseline statistics for these targets. In large businesses (250+ employees) R&D expenditure was £65m. To achieve the target, growth of £10m to £75m is required. Next update is available in March 2010.
6,500 new jobs from inward investment 4,055 jobs were supported in 2008/09, with a further 563 in Q1 2009/10. Provisional results based on project approvals in the first quarter of the year indicate that the agency is on track to achieve the 2009/10 targets. However, the speed with which clients will bring forward their projects cannot easily be predicted given the current depressed economic climate.
6,500 new jobs from inward investment of which 5,500 will provide salaries above the Northern Ireland Private Sector Median 2,342 jobs were promoted in 2008/09 offering salaries above the NI PSM, with a further 490 in Q1 2009/10. Provisional results based on project approvals in the first quarter of the year suggest that the agency may achieve the 2009/10 target. However, the speed with which clients will bring forward their projects cannot easily be predicted given the current depressed economic climate.
6,500 new jobs from inward investment of which 2,750 will have salaries at least 25% above the Northern Ireland Private Sector Median 1,125 jobs were promoted in 2008/09 offering salaries at least 25% above the NI PSM, with a further 471 in Q1 2009/10. Provisional results based on project approvals in the first quarter of the year suggest that the agency may achieve the 2009/10 target. However, the speed with which clients will bring forward their projects cannot easily be predicted given the current depressed economic climate.
70% of new FDI projects secured to locate within 10 miles of an area of economic disadvantage 82% of new FDI projects secured in 2008/09 located within 10 miles of an area of economic disadvantage, while this figure stood at 50% in Q1 2009/10. Given the nature of this target it is still very early in the year to determine, with any certainty, its likely outcome. Although the latest figures for Q1 suggest that the annual target will not be met, the agency remains confident that it will achieve the annual target by the year end and will meet the three year target.
Support 45 new start-ups exporting outside the UK and 300 exporting to GB 15 start-ups exporting outside the UK and 103 exporting to GB were supported in 2008/09. Q1 2009/10 saw a further 4 start-ups exporting outside the UK and 20 exporting to GB. Currently on course to achieve the year-end target. However, as with other targets, the economic downturn continues to present significant challenges moving forward and it is difficult to predict what the impact on this target will be.
Maintain the CAGR in external sales per employee by Invest NI manufacturing clients at 6% External sales per employee baseline at 2007/08 = £125,893 (provisional, still subject to change). Next update available March 2010. It is currently difficult to assess the impact of the current economic downturn on progress against this target as change in external sales per employee will be determined by both changes in external sales and employment. However, there is clear evidence that demand (sales) has reduced in many client companies and job losses are accumulating with uncomfortable regularity.
Increase in the CAGR in external sales per employee by Invest NI tradable services clients to 4% External sales per employee baseline at 2007/08 = £72,506 (provisional, still subject to change). Next update available March 2010. It is currently difficult to assess the impact of the current economic downturn on progress against this target as change in external sales per employee will be determined by both changes in external sales and employment. However, there is clear evidence that demand (sales) is reduced in many client companies and job losses are accumulating with uncomfortable regularity.
The level of export sales as a percentage of total sales by Invest NI client companies, excluding the Top 25 exporting companies, to increase by 3 percentage points Baseline at 2007/08 = 28.5% (provisional). Next update available March 2010. It is currently difficult to assess the impact of the current economic downturn on progress against this target as change in exports as a percentage of all sales will be determined by both changes in exports sales and all sales. However, there is clear evidence that demand (sales) is reduced in many client companies, including the Top 25 exporters, and job losses are accumulating with uncomfortable regularity.
Increase tourism revenue from out-of-state visitors to £520m by 2011 from a baseline of £370m in 2006 Data from 2008 (at December) indicates that annual tourism revenue stood at £396m. The global tourism industry continues to be hit hard by the worldwide recession and by extreme volatility across all major markets and, in line with this, travel to the island of Ireland is experiencing a decline. Latest available data from the Northern Ireland Passenger Survey (NIPS) reflects the impact on Northern Ireland as well as the tough year that tourism businesses across the island of Ireland are experiencing. The NIPS suggests that revenue from overseas visitors travelling direct to Northern Ireland decreased by 23% in the period January to April 2009 compared with the same four months in 2008. All market areas showed declines except for North America (where revenue is believed to have increased slightly by 1%). This position reflects the effects of the global recession, discounting by industry in a bid to win business in increasingly competitive markets and, also, lower spending and shorter stays by visitors while on the ground in Northern Ireland as they trim their personal spending. Latest available data from the UNWTO (United Nations World Tourism Organisation) suggests that, while undoubtedly suffering a decline in its tourism overall, the island of Ireland actually maintained and, in some cases, grew its share of available tourism business compared to some competitor destinations such as Great Britain, France and Germany. Based on data from the NI Hotels Federation and feedback from the industry, revenue from overseas visitors to NI (ie excluding ROI visitors) is expected to decline at a slower rate than the island of Ireland as a whole. Overseas revenue to NI in 2009 is currently expected to range between £323m (best case scenario), a decrease of 4%, and £288m (worst case scenario), a decrease of 14%, on 2008. When looking at the ROI element of total revenue there has been a 51% growth in revenue generated by the ROI market in Jan-March of this year (generating £14.3m). The revenue outlook is directly related to forecasted visitor volume and projected outturns but a continuation of the current economic climate suggests a risk to the delivery of this target.
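As a worked check of the compound annual growth arithmetic behind the BERD targets in Table 3, the sketch below (Python) grows the 2007 baselines quoted in the table at the stated CAGRs. The three-year horizon is an assumption consistent with the figures shown; small differences are due to rounding.

# Worked check of the CAGR arithmetic in Table 3 (baselines from the table;
# three-year horizon assumed).
def grow(baseline_millions, cagr, years=3):
    return baseline_millions * (1 + cagr) ** years

print(round(grow(102.7, 0.08), 1))  # 129.4 - close to the 129.3m quoted for SMEs
print(round(grow(65.0, 0.05), 1))   # 75.2  - close to the 75m quoted for large businesses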

Correspondence of 30 October 2009 from
Mr Gerry Lavery

From the Senior Finance Director

Gerry Lavery

Dundonald House
Upper Newtownards Road
Belfast BT4 3SB
Tel: 028 905 24638
Fax: 028 905 24813
Email: gerry.lavery@dardni.gov.uk

Mr Paul Maskey
Chairperson
Public Accounts Committee
Room 371
Parliament Buildings
BELFAST BT4 3XX

30 October 2009

Dear Paul

Evidence Session on Public Service Agreements

Policies and guidance issued by DARD on the monitoring of progress and effectiveness of the data systems used in PfG.

Thank you for your letter of 12 October, asking for more detail on the policies and guidance issued by DARD in respect of data systems to underpin our Public Service Agreements.

The Department’s policy has been to use the guidance on monitoring progress and ensuring effective data systems that was issued by OFMDFM. This guidance was forwarded to relevant staff with instructions that the guidance be followed. There are controls in place at different levels within the DARD structure to ensure that there is compliance with the guidance.

A list of guidance on the monitoring of progress and the effectiveness of data systems that was provided by OFMDFM and utilised by DARD is as follows:

1. Priorities and Budget 2007: Guidance for Departments (attached Annex A). This guidance dealt with data systems in section 2 and was issued to all SROs and Data Quality Officers.

2. PfG and PSA Monitoring – Guidance Quarter ending 31 March 2009 (attached Annex B). This guidance related to monitoring and included guidance on applying RAG ratings. Issued to all SROs and Data Quality Officers.

3. PfG and PSA Monitoring – Guidance Quarter ending 30 June 2009 (attached Annex C). This guidance related to monitoring and included guidance on applying RAG ratings. Issued to all SROs and Data Quality Officers.

4. Programme for Government End of Year Delivery Report – ‘RAG Ratings’ (attached Annex D). This guidance related to monitoring and included guidance on applying RAG ratings. Relevant issues raised by this guidance were addressed centrally and the guidance was not issued to DARD SROs.

5. Programme for Government Monitoring Quarter 1 2009/2010 ‘RAG Ratings’ (attached Annex E). This guidance related to monitoring and included guidance on applying RAG ratings. Relevant issues raised by this guidance were addressed centrally and the guidance was not issued to DARD SROs.

6. The ‘good practice checklists’ contained in the NIAO report PSA – Measuring Performance were sent to all Senior Responsible Officers and Data Quality Officers in January 2009 (attached Annex F). This guidance related to data quality.

I hope that this response is helpful. A copy goes to David Thomson, Treasury Officer of Accounts.

Yours sincerely

signature - Gerry Lavery

Gerry Lavery

Senior Finance Director
Enc

PfG & PSA Monitoring – Guidance

Introduction

The Monitoring Framework will collect and monitor information on the prospects for the targets and commitments within the Budget and Programme for Government being realised in full and within the resources allocated. The attached templates cover not just the 23 PSAs but also any key goals or commitments within the PfG that are not precisely covered by the 23 PSAs.

Returns must be made using the attached templates only.

Contributing departments should send their input directly to the lead department for consolidation and submission to OFMDFM/DFP. Only the lead department for each PSA should submit the final consolidated return to OFMDFM/DFP.

A PSA Monitoring Template should be completed for each PSA for which the Department is the lead. The template includes specified fields for the responses – these are embedded within the template and, where required, information should only be entered into these specified fields. In many cases relevant information from earlier monitoring rounds has been retained and will only need to be updated where circumstances have changed.

Equality Monitoring

This round of PfG Monitoring does not require departments to report on progress against addressing the issues identified in the EQIA carried out at a strategic level on the PfG. However, it is likely that this will form part of the next round of monitoring.

Section 1: Lead Department and Delivery Board

The front page of the template seeks information on the SRO for the target. As agreed by the Executive on 5 March 2009, a single lead SRO must be identified for each PSA.

The templates also seek information on those individuals who comprise the Governance arrangements (Delivery Board) currently in place to oversee progress and delivery against the targets. As agreed by the Executive, where the PSA or key goal or commitment requires the delivery of actions across a number of departments/divisions, the SRO must establish a Delivery Board, comprising accountable senior officials (at SCS level) from contributing departments/divisions.

Role of the SRO

The SRO is accountable for coordinating quarterly returns to the centre and will be the first point of contact where follow up action or additional information is required.

While the SRO is not accountable for the delivery of targets outside their area of responsibility, they are responsible for ensuring that appropriate reporting mechanisms, data systems, risk management frameworks and delivery structures/processes are in place across all targets. Where the SRO has concerns in this regard, these should be set out in the appropriate sections within the template.

Section 2: Measurement

As before, the Measurement section requests an update on each of the targets within the PSA as of 30 June 2009. Again, the update for each indicator will be in one of two forms – one for Measurable Indicators and one for Narrative Indicators.

Measurable Indicators cover those targets that might typically be described as “SMART” targets. An update is sought on each, and an illustration of how to complete the Measurable Indicators within the template is included in Figure 1 below. The tables also record the value of the indicators reported in earlier monitoring rounds. Again, previously provided information on, for example, baselines and milestone targets has been retained.

For Measurable Indicators a section is also included to enable a brief narrative to be added in support of the quantitative information on the indicators. In some cases we have used the narrative section to request some specific information in relation to a particular indicator – often in relation to the timing of the next available update for the indicator. A number of the targets within the PfG have indicators that have yet to be fully developed and departments will wish to address this and related issues as a matter of priority.

Narrative Indicators: Once again, for those targets not defined in particularly measurable terms – for example targets that involve completing a particular action by a point in time – a narrative update is sought. Departments should avoid responding solely with simplistic statements within the narrative such as “progress on track” but should rather include additional supporting commentary to briefly describe and illustrate what progress has been made in the time to date. For example, the narrative update might briefly describe what actions have been completed or progressed to date and how this compares to their target times.

In addition, a RAG summary section is provided for both measurable and narrative indicators which seeks a judgement, on behalf of the Department, on the current prospects of successful delivery against each of the targets. As a default each is currently set to “RED”, while the template now also includes the RAG rating outcome from the assessments as at 31 March 2009.

The box below provides a broad indication of how delivery is to be assessed against the RAG scale. This now also includes some bullet points for, and only for, those types of indicators (which are relatively few) that are about continuously meeting a particular level of service – as opposed to targeting a level of improvement by a point in time in the future. So it would apply to indicators structured like, for example, the following: “from April 2008 Public Service availability will be at 95% or better”.

The RAG Assessment:

Red:

Amber:

Green / Amber:

Green:

Departments should use the narrative update space, which is now provided for both narrative and measurable indicators, to briefly outline the rationale for the RAG rating assigned to each indicator. OFMDFM/DFP will review departmental assessments of the RAG status of targets to ensure that all assessments are consistent with the above framework.

Where Departments have assigned a RED or AMBER RAG rating to an indicator, they must outline, within the narrative, any remedial actions being taken to address performance on that indicator.

Where an update on progress is not provided, or data is not available, the RAG status WILL BE ASSESSED AND REPORTED AS RED.
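The default treatment described above (each indicator starts at RED, and a missing update or missing data is reported as RED) can be summarised in a minimal sketch (Python); the parameter names are illustrative and do not come from the template itself.

# Hypothetical sketch of the default RAG treatment described in the guidance.
def rag_status(update_provided, data_available, departmental_assessment=None):
    if not update_provided or not data_available:
        return "RED"  # no update or no data: assessed and reported as RED
    return departmental_assessment or "RED"  # default remains RED until assessed

print(rag_status(update_provided=False, data_available=True))         # RED
print(rag_status(True, True, departmental_assessment="GREEN/AMBER"))  # GREEN/AMBER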

Section 3: Governance

Not included in this round of monitoring.

Section 4: Risk & Delivery Management

Not included in this round of monitoring.

Section 5: Financial

Not included in this round of monitoring.

PfG Goals & Commitments

A number of Key Goals and Commitments within the PfG are not precisely covered within the 23 PSA targets. Typically there are about 3 or 4 per Department; these have been identified within a separate template, and Departments should report progress once again using the approach set out in Figure 1.

[Extract]

NIA Image

Departmental Guidance Referred to by
Mr David Sterling and Mr Gerry Lavery

Priorities and Budget 2007
Guidance for Departments

Department of Finance and Personnel
March 2007

Contents

Section 1 Overview and Timetable
Introduction
PE Context
Performance Management
Capital
Ministerial Priorities
Overall Aims and Objectives
Planned Outputs
Consultation
Timetable

Section 2 CSR Performance Framework
Introduction
DSO and PSA Priority Outcomes
PSA Delivery Agreements
Selection of Indicators

Section 3 Departmental Returns
Introduction
Review of Priorities and Financial Structures
Contents of Return

Section 4 Equality and Sustainable Development
Introduction
Statutory Obligations
Methodology
Verification and Quality Assurance

Annex A High-Level Impact Assessment (HLIA) Pro-forma

Annex B Summary of Equality Impacts Pro-forma

Section 1 – Overview and Timetable

Introduction

1.1 This guidance has been prepared by DFP to provide the background and approach to the Priorities and Budget process, which will set the improvements in public sector outcomes and associated expenditure plans for the period 2008-09 to 2010-11. It outlines:

1.2 Throughout the process, departments will have due regard to the objectives as outlined below including statutory duties concerning equality of opportunity and good relations as set out in Section 75 of the Northern Ireland Act 1998; they will also give full consideration to opportunities to address social need experienced by the most deprived people and areas, focusing particularly on opportunities to tackle the problems of unemployment and/or increase employability.

1.3 Departments are also reminded of the need to incorporate sustainable development into policy-making, in line with the commitments made in the Government’s sustainable development strategy and the related statutory duty on public authorities. To support this process, departments are encouraged to involve their own sustainable development champions in preparing their submissions.

PE Context

1.4 On 19 July 2005, the Chief Secretary to the Treasury announced the Government’s intention to conduct a second Comprehensive Spending Review (CSR), reporting in 2007, to identify what further investments and reforms are needed to equip the UK for the global challenges of the decade ahead.

1.5 Spending Reviews set firm and fixed three-year Departmental Expenditure Limits and, through Public Service Agreements (PSA), define the key improvements in outcomes that the public can expect from these resources. Successive Spending Reviews since 1997 have targeted significant increases in resources for the Government’s priorities, matched by far-reaching reforms, and have set ambitious targets for improvements in key public services. The 2007 CSR will set spending plans for 2008-09, 2009-10 and 2010-11. A decade on from the first CSR, the 2007 CSR will represent a long-term and fundamental review of government expenditure.

1.6 Although HM Treasury have forecast continued growth in public expenditure over the period covered by the 2007 CSR, following the significant increases in investment in public services in recent spending reviews, it is likely that this will be at a slower rate going forward. In order that funds can be made available to fund Ministerial spending priorities it is therefore essential that resources are released through efficiency improvements in the delivery and management of existing programmes.

1.7 The issue of constrained public expenditure growth at the UK level is particularly relevant for Northern Ireland, as public expenditure per capita in Northern Ireland is already ahead of UK levels[1]. The operation of the Barnett formula means that NI departments will experience slower growth in available resources than Whitehall equivalents over the CSR period.

1.8 In line with the approach adopted by Whitehall Departments as reaffirmed in the Budget 07 Report, NI Departments have been developing plans to deliver cumulative efficiency savings of 3% a year over the CSR planning period including an annual real reduction in civil service administration costs of 5%[2], in order to accommodate expected cost pressures, and at the same time deliver improvements in public service provision. All resources released by this work will remain within Northern Ireland, for reallocation to public services here. The savings in terms of administration will be reclassified as additional available resource spend.
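To illustrate what cumulative efficiency savings of 3% a year and a 5% annual real reduction in administration costs imply over the three CSR years, the sketch below (Python) uses a hypothetical £100m baseline and one common reading of "cumulative" (the savings target rises by 3% of the baseline each year); neither the baseline figure nor that reading is specified in the guidance.

# Illustrative sketch only: hypothetical 100m baseline; "cumulative" read as
# 3% of baseline added each year (3%, 6%, 9%); administration costs fall by
# 5% a year in real terms.
baseline = 100.0  # hypothetical resource baseline, in millions

for year in (1, 2, 3):
    efficiency_target = baseline * 0.03 * year       # 3.0, 6.0, 9.0
    admin_costs = baseline * (1 - 0.05) ** year      # 95.0, 90.25, 85.74
    print(f"Year {year}: savings target {efficiency_target:.1f}m, "
          f"administration costs {admin_costs:.2f}m (real terms)")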

1.9 In addition, we will also need to address whether the existing levels of over-commitment currently built into planned spending at the Block level are sustainable in the light of recent patterns of actual departmental spending as well as the recommendations from the PKF Review of Forecasting and Monitoring.

1.10 The key implication of this is that it will be essential that there is a strong focus on all departments delivering on their efficiency savings targets. Therefore Efficiency Delivery Plans will need to be finalised with monitoring and accountability mechanisms put in place to ensure that the provision of priority services is maintained with adjusted expenditure baselines. In addition, in terms of the reallocation of resources, given the more constrained fiscal environment, there should be even greater focus on those areas where there is greatest need, in particular front line services, and where additional investment will deliver maximum improvement in public service outcomes.

Performance Management

1.11 The Public Service Agreement (PSA) Framework will continue to form a crucial pillar of the Government’s approach to driving public service performance, strengthening accountability, and improving service quality. However a number of changes have been introduced as set out below:

(a) ‘Public Service Agreements’ (PSAs) will sit alongside Departmental Strategic Objectives in the framework but will not be constrained by departmental structures. The final PSA set (of around [30]) will represent a priority subset of Government’s objectives for the spending period. Departmental Strategic Objectives may map onto PSA Outcomes wholly or in part but PSAs are not intended to express ambition in relation to every aspect of Government business.

(b) A rationalised basket of key performance indicators will underpin each PSA. Targets or minimum standards of performance can be attached to one or more of these indicators only where such an approach is the most effective way to drive delivery. Departments will be expected to include indicators of user satisfaction or experience where appropriate.

(c) Each department will base their work on a set of ‘Departmental Strategic Objectives’ for the spending period, which should cover the full range of departmental business and form the ‘top line’ of the department’s business plan.

(d) Each PSA will be accompanied by a published ‘Delivery Agreement’ (DA). The DAs will strengthen accountability and confidence on delivery, as well as supporting our commitment to more responsive public services. Where PSA outcomes cut across departmental boundaries the DAs will clearly explain the contributions made by each department.

Capital

1.12 Departments’ capital allocations will be set through the concurrent development of the second iteration of the Investment Strategy for Northern Ireland (ISNI), which will also set indicative plans for the following seven years. In this process, Departments will be bidding for their entire capital budget on a project basis, where appropriate, and so will need to construct their capital bids from a zero base. In order to fund the increased capital need, an ambitious programme of asset disposals will be taken forward with the aim of going further than the target previously agreed with HM Treasury (£1 billion over the period 2005-06 to 2010-11).

1.13 The second iteration of ISNI will have a stronger focus on covering all the costs associated with the projects and in particular the resource consequentials. A number of scenarios based on the investment framework will be developed as regards the projects to be funded from the expected capital allocations over the 2008 to 2018 period, with the first three years being subject to a greater level of detailed work in the context of the 2007 Priorities and Budget process. The resource implications of these projects will be afforded high priority in terms of resource bids (thus reducing available resources for other current proposals) and hence there will be robust scrutiny to ensure that the level of consequentials is the minimum necessary to deliver the projects/programmes. Departments will also need to ensure consistency between their capital and resource proposals so that a low priority proposal with respect to resource is not approved on the basis of the associated capital project.

1.14 Departments will not be required to bid for capital as part of the Priorities and Budget process although there will still be a need to submit spending proposals for the associated resource consequentials (excluding consequentials with respect to administration budgets[3]). This will apply to all capital projects including those not taken forward in the ISNI 2 scenarios.

1.15 It is recognised that the 5% real reduction in administration costs at the NI Block level will cause significant difficulties for departments, particularly with respect to the associated administration costs of new policy proposals. CFG are currently considering the possibility of reallocating administration budgets between departments to ensure that critical needs are met, while still adhering to the 5% real reduction at Block level. Further guidance will be issued shortly.

Ministerial Priorities

Restored Executive Context

1.16 Upon restoration, a pressing issue for the Executive (and as reflected in the Committee on the Programme for Government CSR sub-group report) will be the consideration of priorities, which will form the context for the Budget process. However, it is the intention that the priorities for the restored Executive will emerge primarily from the Programme for Government process, which may not be sufficiently well defined until the early summer period. While this will cause some difficulties in terms of the identification of PSAs and associated spending bids, departments should proceed with work to identify potential Budget bids as well as taking forward some of the background work on impact assessments.

Direct Rule Context

1.17 Pending restoration, the Secretary of State’s note to the Ministerial team of October 2006 indicated that Departments should review all areas of expenditure to ensure they are tightly focused on delivering aspects of the government’s three key cross-cutting priority strategies:

1.18 In addition, equal attention should be paid by departments to ensuring that in all areas of policy we are maximising opportunities to deliver on the vision for Northern Ireland set out in A Shared Future. Expenditure plans should demonstrate a step change in how government uses public expenditure to reduce the costs of separation and foster a sense of shared community. These priorities follow on from those set out as part of the 2005 process, with overall priority given to health & social services and in particular reform of elective care:

1.19 We expect that the Ministerial team will further refine the statement of its priorities, and we will circulate further details as appropriate. For example, there will need to be clear expression as regards the key outcomes relating to the cross-cutting priority strategies. In addition, Ministers may wish to include other areas as key priorities, such as developing the economy, education/skills as well as health & social care.

Overall aims and objectives

1.20 Taking account of the above context, the overall aims of the Priorities and Budget 2007 exercise are:

The objectives are to:

(a) review and amend strategic priorities for public services and ensure the link between resource allocations and Ministers’ desired priority outcomes is further strengthened. These will be specified as PSAs/DSOs with associated indicators and targets providing comprehensive coverage across spending programmes;

(b) develop Delivery Agreements for each PSA with clear lines of accountability to ensure that targets are achieved;

(c) ensure that Efficiency Delivery Plans are further developed and finalised to build confidence that sufficient resources will be released to fund expected cost pressures, as well as new and existing spending commitments;

(d) provide resource allocations to those areas identified as high priority by Ministers, and seek to ensure effective delivery of public services; and

(e) determine the optimum balance between expenditure, revenue and borrowing to support strategic priorities in respect of the programmes and services covered by the Northern Ireland administration, to secure public services of the required standard; provide for necessary and appropriate investment; fulfil statutory obligations; and address key policy priorities.

Planned Outputs

1.21 The outputs from the Priorities and Budget 2007 exercise will include:

Consultation

1.22 DFP have completed a preliminary consultation exercise with representatives of the key social, economic and equality partners. The purpose of this engagement was to canvass views on:

1.23 Consultees raised a number of issues relevant to their particular sectors. There is concern that, although the strategies produced by departments appear sufficient to address the relevant need, there is insufficient consideration of implementation and, in particular, of whether a sufficient level of resources is secured. The representatives from the business community have highlighted the need to focus on “growth enablers” and in particular addressing deficiencies in education and skills. A short summary report has been sent to Ministers which departments should take into consideration, where appropriate, when developing spending proposals.

Timetable

1.24 The following table sets out the broad timetable for the Priorities and Budget process. Paragraph 3.9 below summarises the items that should be dealt with in departmental returns. It should be noted that the timing of certain elements may be subject to change given the more elongated Ministerial approval process under devolution.[4]

Activity Timing
Guidance issued to departments End March
Receipt of departmental returns – Departments’ proposed DSOs and PSAs End April
Receipt of departmental returns – Departments’ proposed spending plans, including linkages to PSAs and first draft Delivery Agreements End May
Draft Budget Scenarios produced End July
Draft Priorities and Budget document published, including Efficiency Delivery Plans, PSA Delivery Agreements, draft ISNI2 and a summary of HLIAs End September
Public Consultation Process on the draft document October to November
Revised proposals to Finance Minister for consideration Late November
Revised proposals to Ministers[4] for consideration Early December
Priorities and Budget document published mid December

Section 2 – CSR Performance Framework

Introduction

2.1 This section sets out the main changes to the Public Service Agreements (PSAs) and performance management arrangements in the 2007 CSR, enabling departments to begin work on initial drafts.

2.2 As part of the CSR, Public Service Agreements (PSAs) will continue to form a crucial pillar of the Government’s approach to driving public service performance, strengthening accountability, and improving service quality. However, a number of changes have been introduced, as set out below (an illustrative sketch of the resulting structure follows the list):

(a) Each department will base its work on a set of Departmental Strategic Objectives (DSOs) for the spending period. These should cover the full range of departmental business and form the ‘top line’ of the department’s business plan.

DSOs will sit alongside new (more cross-cutting) PSAs and cover the totality of the department’s business. They are intended to ensure that issues that do not feature in PSAs are not overlooked or downgraded. This will also ensure that cross-cutting priorities that do not feature in PSAs can be aligned with departments’ ‘day to day’ business more effectively. These changes to the arrangements are likely to lead to a significantly smaller number of PSAs during the CSR period and beyond.

(b) PSAs will sit alongside DSOs in the framework but will not be required to cover every aspect of Government business. Final PSAs will represent a priority subset of Government business for the spending period. DSOs may map onto PSA Outcomes wholly or in part, but this is not a requirement. Outcomes which remain a key aspect of departmental business but are not included in the top cross-government priorities for the spending period, or those which represent ‘business as usual’, are unlikely to be reflected in PSAs. PSAs will be expressed as ambitious outcomes and will not be directly linked to inputs and outputs.

(c) Each PSA will be accompanied by a published Delivery Agreement (DA). These DAs will strengthen accountability and confidence in delivery, and will support the Government’s commitment to more responsive public services. Where PSA outcomes cut across departmental boundaries, DAs should clearly explain the contributions made by each department.

(d) A rationalised basket of key performance indicators will underpin each PSA. Targets or minimum standards of performance can be attached to one or more of these indicators, but only where such an approach is the most effective way to drive delivery. Departments will be expected to include indicators of user satisfaction or experience where appropriate.
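To make the relationship between DSOs, PSAs, Delivery Agreements and indicator baskets concrete, the following is a minimal illustrative sketch in Python. It is not part of the guidance: the class and field names are hypothetical and simply restate the structure described in (a) to (d) above.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Indicator:
        """One indicator in the rationalised basket underpinning a PSA."""
        name: str
        target: Optional[str] = None       # a target or minimum standard is optional
        user_satisfaction: bool = False    # flags indicators of user satisfaction/experience

    @dataclass
    class DeliveryAgreement:
        """Published Delivery Agreement (DA) accompanying each PSA."""
        vision: str
        delivery_strategy: str
        contributing_departments: List[str]   # contributions where outcomes cut across boundaries

    @dataclass
    class PSA:
        """A cross-cutting priority outcome: a subset of Government business."""
        outcome: str
        indicators: List[Indicator] = field(default_factory=list)   # small basket of indicators
        delivery_agreement: Optional[DeliveryAgreement] = None

    @dataclass
    class DSO:
        """Departmental Strategic Objective: DSOs together cover all departmental business."""
        objective: str
        related_psas: List[PSA] = field(default_factory=list)   # may map onto PSA outcomes wholly or in part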

2.3 Figures 1 and 2 below set out the SR04 and CSR07 PSA frameworks to illustrate these changes:

Figure 1: SR04 PSA Framework

NIA Image

Figure 2: CSR 07 Framework for Priorities, PSAs and DSOs

NIA Image

2.4 The PSA section of submissions should include the following elements:

a) a summary list of DSOs and a list of the indicators that will be used by the department’s board for performance management purposes (and shared with DFP for monitoring purposes);

b) a summary table of the cross-Government PSA priority outcomes that the department will contribute to delivering;

c) a copy of the draft DA and Measurement Annex (which will replace the Technical Note) for each PSA where the department has specific responsibility for delivery. The draft DA should set out for each PSA outcome:

• a draft map of the delivery chain and a high level explanation of how the outcome will be achieved, with a clear explanation of how accountability, risk management and incentive structures will support achievement of the PSA outcome;

• the basket of indicators underpinning each PSA and the levels of any targets or standards attached; and

• an account of the approach taken to consulting and working with the delivery chain in producing the draft.

d) a summary of where and how user engagement and bottom-up accountability mechanisms will be extended and strengthened in services as key levers in the Government’s approach to driving delivery of improved outcomes.

2.5 Templates have been attached at the end of this section to assist departments in completing their draft DSOs, PSAs and DAs.

DSOs and PSA Priority Outcomes

2.6 In the CSR, the PSA framework will recognise the need to effectively manage performance in relation to the totality of each department’s business, whilst placing additional central focus on successful delivery of a streamlined set of key, Northern Ireland-wide priority outcomes. Through agreeing and publishing DSOs alongside a more focused set of high priority PSA outcomes, the Government will be able to clearly set out what it intends to achieve in return for the budgets agreed in the CSR, with a clear indication of the priorities for the spending review period. This will also ensure more effective alignment of cross-cutting priorities with the ‘day-to-day’ business of departments.

2.7 DSOs should:

2.8 They should be:

2.9 Each department’s Priorities and Budget submission must include the reporting template that will be used by the department’s board for performance management purposes (and shared with DFP for monitoring purposes). The template should enable rigorous financial management, and therefore:

2.10 Additionally, board-level reporting by departments should aim to provide sufficient information to assure senior management and Ministers that:

2.11 By April 2007, departments should aim to have detailed DSOs prepared and be in a position to demonstrate that they can deliver results, broadly applying the same principles as those adopted under past PSA arrangements.

2.12 Future PSA monitoring arrangements will be finalised in the coming months. In the interim, ongoing monitoring of current PSA targets will continue as normal, although these processes will be revisited as part of the Priorities and Budget process.

2.13 Many of the highest priority outcomes for the CSR period will cut across departmental boundaries. Where more than one department has a significant role to play in delivery of a PSA outcome, relevant departments should plan their delivery and consultation through coordinated processes. In such instances a single, shared and agreed Delivery Agreement will be included in the returns of all relevant departments.

PSA Delivery Agreements

2.14 Outcomes are more likely to be effectively delivered if:

2.15 Therefore, delivery planning and a greater focus on the role and requirements of service users will form an integral part of the PSA framework in the CSR. Publication of these DAs will ensure clarity and focus throughout the delivery chain, enhancing accountability and increasing the likelihood of delivery.

2.16 All PSA DAs should be informed by robust evidence and collaboration with stakeholders throughout the delivery system. This is crucial to ensure that:

2.17 Departments should ensure that the structure and content of DAs follow the guidelines set out in the table below (a minimal illustrative template follows the table):

Delivery Agreement: What should it include?

(i) Vision: A clear, concise statement of the Government’s ambition of what will be achieved in the spending period in relation to the priority outcome.

(ii) Measurement: A summary of the way performance in relation to the outcome will be measured, setting out key performance indicators and any targets or minimum standards.

(iii) Delivery strategy: This should include a map of the delivery chain for the PSA outcome and a high-level explanation of the strategy for delivery, with a clear explanation of how levers and incentive structures throughout the delivery chain will support achievement of the PSA outcome. It should also set out the way that the accountability framework reinforces these levers and incentives and make clear where in the delivery chain accountability for achievement of different elements of the delivery strategy lies.

(iv) Risk management strategy: This should summarise the key risks to successful delivery and the strategy for managing these risks. These risks should include the danger of conflicting incentives where that may be a problem.

(v) Consultation schedule: This should summarise how consultation has been carried out, potentially listing the key groups consulted.
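As a minimal illustrative template only (the keys simply mirror the five headings in the table above; nothing here is prescribed by the guidance), a DA return could be held as a simple record:

    # Illustrative only: a Delivery Agreement represented as a dictionary,
    # with one entry per section required by the table above.
    delivery_agreement_template = {
        "vision": "",                     # concise statement of the ambition for the outcome
        "measurement": "",                # key performance indicators, targets or minimum standards
        "delivery_strategy": "",          # delivery chain map, levers, incentives and accountability
        "risk_management_strategy": "",   # key risks (including conflicting incentives) and how they are managed
        "consultation_schedule": "",      # how consultation was carried out and the key groups consulted
    }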

Selection of indicators, data and measurement issues, target setting

2.18 The effective use of data (underpinned by timely indicators) to drive improved outcomes is fundamental to the new approach. Targets will be applied to those indicators selectively, but only where they offer the best approach to drive delivery. The approach taken by departments in developing proposals on these indicators - and the discipline with which departments and the centre judge those proposals - will be critical to ensuring that Government commitments in the Priorities and Budget are achievable. The Priorities and Budget process will also be used to tackle inefficient burdens and bureaucracy in the system.

2.19 A robust set of high-quality indicators is required for departments to monitor performance, support delivery and ensure accountability; a simple illustrative check against these criteria is sketched after the list below. In summary, indicators should:

(a) be outcome-focused;

(b) be specific;

(c) use robust data subject to quality controls;

(d) reflect departmental performance;

(e) allow comparisons over time;

(f) be sufficiently accurate and reliable as to enable decision-making; and

(g) be relevant to the PSA outcome.
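The sketch below illustrates how a proposed indicator might be checked against these criteria. It is illustrative only: the function and field names are hypothetical and the criteria are simply restated from the list above.

    # Illustrative sketch only: checking a proposed indicator against the
    # criteria in paragraph 2.19. Field names are hypothetical.
    CRITERIA = [
        ("outcome_focused", "be outcome-focused"),
        ("specific", "be specific"),
        ("quality_controlled_data", "use robust data subject to quality controls"),
        ("reflects_departmental_performance", "reflect departmental performance"),
        ("comparable_over_time", "allow comparisons over time"),
        ("accurate_and_reliable", "be sufficiently accurate and reliable for decision-making"),
        ("relevant_to_outcome", "be relevant to the PSA outcome"),
    ]

    def failed_criteria(indicator: dict) -> list:
        """Return the criteria that a proposed indicator does not meet."""
        return [description for key, description in CRITERIA if not indicator.get(key, False)]

    # Example: a proposal that meets every criterion except quality-controlled data
    proposal = {key: True for key, _ in CRITERIA}
    proposal["quality_controlled_data"] = False
    print(failed_criteria(proposal))   # ['use robust data subject to quality controls']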

2.20 Departments may wish to use more than one indicator to underpin a PSA. A small basket of indicators can help to capture the outcome and more comprehensively measure departmental performance, whereas the use of too many indicators could run the risk of undermining clarity about performance and management focus. Although the number will vary with the circumstances and availability of data for each PSA outcome, departments should aim for a small set of indicators, not usually exceeding five. All PSAs which relate to frontline, service delivery outcomes should clearly demonstrate a focus on user satisfaction/experience unless there is a substantive argument against doing so.

2.21 In selecting the indicators that will underpin PSAs, departments must consider four key areas:

(a) the indicator’s relevance in relation to the PSA outcome;

(b) the influence the department(s) has over the indicator;

(c) how accurate and usable the underpinning data is; and

(d) the balance of costs and benefits in collecting the necessary data from delivery units (where applicable).

2.22 Departments may wish to propose targets with end dates beyond the horizon of the period covered by the 2007 CSR. In such instances, departments will need to pay due regard to questions of affordability and to the fact that budgets will only be allocated for three years; they must also set an interim target as close as possible to the end of the CSR period.

2.23 The purpose of the Measurement Annex is:

2.24 Measurement Annexes should contain at least two sections for each indicator:

2.25 The indicator fact sheet must contain:

2.26 Departments must specify a Data Quality Officer (DQO) for each indicator underpinning a PSA Outcome. The DQO will be responsible for ensuring that risks to data quality are actively managed and that limitations to the data are adequately communicated both to departmental boards and in departments’ public reporting. Ultimate responsibility for data systems quality, including the results of any external validation, will rest with the DQO, who will be named in the Measurement Annex.

Equality Considerations

2.27 When developing PSAs, departments must take equality considerations into account and they will need to provide evidence of this in their High Level Impact Assessments. More information can be found in Section 4 – Equality and Sustainable Development.

Templates for completing PSA/DSO Submissions

A. Summary List of Departmental Strategic Objectives

Departmental Strategic Objectives | Key Indicators
Objective 1
Objective 2
Objective 3
Objective 4
Objective 5

B. Summary table of the Cross-Government PSA Priority Outcomes

PSA Outcomes | Key Indicators | Related Departmental Strategic Objectives | Which other departments will contribute to the achievement of this target?
PSA 1
PSA 2
PSA 3
PSA 4

C. Template for Delivery Agreements

PSA
(a) Vision
(b) Measurement
(c) Delivery strategy (including Delivery Chain Map)
(d) Risk Management strategy
(e) Consultation Schedule

Section 3 - Departmental Returns

Introduction

3.1 The departmental return should set out, in one document, the department’s proposed resource spending plans for the years 2008-09 to 2010-11, comprising a set of spending proposals with supporting evidence. In particular, the benefits that would be achieved as a result of each spending proposal being implemented should be clearly stated, with a clear linkage to PSA/DSO outputs/outcomes where appropriate.

3.2 In making recommendations to Ministers on the allocation of available resources, spending proposals will be assessed in terms of alignment with ministerial priorities, impact on PSA indicators as well as the extent to which the proposal relates to an unavoidable pressure (pre-commitment or statutory/legal requirement). In addition, a positive impact on equality, good relations, poverty, social inclusion or sustainable development would also increase the chances that a proposal will be recommended to Ministers. Therefore it is important that sufficient quality and quantity of evidence is provided, with the onus on the sponsor business area.

3.3 In light of the cross-departmental approach to the development of PSAs, the preference is that the lead department for each PSA puts forward a joint proposal on behalf of the relevant delivery partner departments. This avoids the risk of two departments bidding for the same funding, or of no department bidding at all. There should be clear identification of when a spending proposal is linked to others so that they can be considered collectively where appropriate. Although the lead department will have overall responsibility for the PSA, delivery partner departments will have responsibility, in terms of budget and target delivery, for the element of the delivery chain to which they contribute. For example, if the PSA target is to reduce childhood obesity levels, then DE would be responsible for the budget for improving physical activity levels in schools, with the associated responsibility for delivering on the relevant target.

3.4 Spending proposals are limited to the resource element of departmental expenditure, excluding administration costs. Outcomes for administration costs have already been determined by the Secretary of State’s commitment to 5% annual real reductions in this area of expenditure, although a restored Executive will have scope to adopt a different approach. In light of the recent NICS pay settlement, the associated nominal reductions in administration budgets will be 1.0%, 2.8% and 2.8% for 2008-09, 2009-10 and 2010-11 respectively[5]. Consideration is also being given to a reallocation exercise for administration budgets, within the same overall NI Block total, to ensure that the impact on front line service delivery is minimised.
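As a rough illustration of the real-to-nominal conversion described in footnote [2] below, a minimal sketch is set out here; it uses the simple approximation that the nominal reduction is the real reduction less assumed inflation, and the figures are those quoted in the guidance.

    # Illustrative arithmetic only, based on footnote [2]: a 5% annual real
    # reduction with inflation assumed at 2.2% gives roughly a 2.8% nominal reduction.
    real_reduction = 5.0        # per cent
    assumed_inflation = 2.2     # per cent
    nominal_reduction = real_reduction - assumed_inflation
    print(f"{nominal_reduction:.1f}%")   # 2.8%

    # Following the NICS pay settlement, the nominal reductions actually applied are:
    nominal_reductions = {"2008-09": 1.0, "2009-10": 2.8, "2010-11": 2.8}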

3.5 Spending proposals in terms of capital will be considered in the development of the second iteration of the Investment Strategy for Northern Ireland (ISNI 2). However, the resource consequentials of capital projects in ISNI 2 will still need to be included as separate spending proposals for the Priorities and Budget process, although they are expected to be afforded high priority.

3.6 Whilst the departmental return will reflect the department’s views, it will be more effective if it is based on early and substantive engagement with the relevant DFP Supply Team – particularly in view of the tight overall resource position and the desire to avoid nugatory work for departments in terms of preparing spending proposals. Supply will therefore continue to engage with departments to discuss the approach and proposed contents of returns.

3.7 The departmental resource (excluding administration) baselines to be used for the 2007 Priorities and Budget process have recently been provided. Only the baseline data provided by CED should be used by departments as the starting points for their returns. For the avoidance of doubt, spending proposals are only required where a department requires additional resources to those set out in the baselines.

Review of Priorities and Financial Structures

3.8 Prior to the development of departmental returns, it is essential that departments take time to review their priority outcomes and financial structure. These aspects represent the core structure for the development of both financial and business planning, and for the establishment of PSA/DSO objectives. The intention is that at least one Unit of Service should map to each DSO, although a Unit of Service cannot map to more than one DSO.

3.9 For the purposes of continuity and transparency it is important that, where possible, any change to these structures is undertaken in such a way that a transparent link can be made – in database terms - to the previous structure. This is to ensure that comparable expenditure data can be presented over a run of years. Any proposed changes to Spending Area structures within departments must be agreed with Central Expenditure Division, via the relevant Supply Team.

Contents of the Return

3.10 The structure for the departmental return is set out below:

3.11 The following paragraphs specify the requirements of each section. It is important that all sections are covered in departmental returns.

Summary

3.12 The departmental submission should begin with a short textual summary of the department’s proposals, followed by a table setting out the overall budgeting implications of the proposals. A pro forma for this (Table 1) is attached at the end of this section.

Spending Proposals and Supporting Evidence

3.13 For each spending proposal the departmental submission should complete Table 2 which sets out details of each proposal and supporting evidence:

a. Summary of Spending Proposal- setting out the main points of the proposal and in particular how it links with the over-arching Ministerial priorities. The Responsible Officer for the proposal will be contacted for clarification on the information provided in the return. Links to other proposals and the date the High Level Impact Assessment was completed should also be specified;

b. Resource Requirements - should be set out relative to the position in 2007-08, although this may not be appropriate for new policies/programmes. Although administration and capital are being considered separately, it will still be important to ensure consistency with the ISNI 2. In addition, the identification of administration pressures associated with the spending proposals will provide assurance that this aspect has been taken account of, even though it will still need to be funded from within the baseline position. In terms of the evidence that the level of resources proposed is the minimum necessary, departmental returns should refer to best-practice levels of marginal/average costs in supporting the case. In light of the more constrained fiscal position, it is also important to identify whether a reduced form of the bid might be possible (i.e. is it all or nothing);

c. Public Service Impact - the evidence presented will have a clear linkage with the Delivery Agreements in respect of PSA-related spending proposals. Where appropriate, the impact should be specified in terms of a quantifiable outcome/output relative to the 2007-08 position (or an earlier year if more appropriate). Any impact should be clear and unambiguous; references to, for example, improved staff morale, avoidance of industrial action and/or avoidance of public criticism should not be used;

d. Extent to which proposal is unavoidable - it is recognised that in some cases there will be significant costs associated with simply maintaining existing services (MES), with little or no impact on the delivery of public services. It is essential that sufficient evidence is presented that the proposal is unavoidable and that the quantum of cost involved is the minimum necessary; and,

e. Equality, Good Relations, Poverty/Social Inclusion and Sustainable Development Impact- while departments will be asked to provide DFP/OFMDFM with a summary of their HLIA consideration of their spending plans, as set out in Section 4, it is still important that specific positive impacts in these areas should be considered as part of the broader assessment of each proposal.

EU Related Bids

3.14 Northern Ireland has been allocated £XX million from the EU Structural Funds Programme for 2007-2013. Draft programmes are moving through the public consultation process. Changes to the treatment of EU income mean that future EU receipts will represent negative DEL and hence extra spending power for Northern Ireland, over and above allocations through the Barnett formula.

3.15 Departments should ensure that budget proposals that include spending on EU Programmes are submitted net of expected EU income. Proposals should include departments’ needs in respect of remaining expenditure under the 2000-06 programmes and for expenditure as set out in the latest version of each of the four draft programmes for 2007-13. Budget proposals should be drafted net of EU income using the pro forma in Table 2 and should be split into Capital and Resource bids. EUD will, if required, provide additional information on working assumptions that accountable departments may wish to include.
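By way of illustration only, the sketch below shows a bid being netted of expected EU income before submission. The figures are hypothetical and are not drawn from the guidance or the draft programmes.

    # Hypothetical figures for illustration only: netting expected EU receipts
    # off gross programme spending before submitting the bid (£000s).
    gross_programme_spend = {"2008-09": 12_000, "2009-10": 12_500, "2010-11": 13_000}
    expected_eu_income = {"2008-09": 4_000, "2009-10": 4_200, "2010-11": 4_400}

    net_bid = {year: gross_programme_spend[year] - expected_eu_income[year]
               for year in gross_programme_spend}
    print(net_bid)   # {'2008-09': 8000, '2009-10': 8300, '2010-11': 8600}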

3.16 For the Competitiveness and Employment programmes the final match element will be agreed with departments when the programmes are agreed with the European Commission. This is likely to be before any Budget settlement is agreed. In relation to the Cross border Co-operation Programme departments will be required to match fund any agreed activities from within their final budget settlement. For the PEACE III Programme EUD will bid for any necessary resources centrally in respect of the match fund element.

Priority Funding Packages

3.17 The 2007-08 allocations to departments with respect to the Secretary of State’s Priority Funding Packages (Children & Young People, Skills & Science, and Environment and Renewable Energy) will be removed from the rolled-forward departmental baselines. Under continued Direct Rule, the level of funding for each package will be maintained at least at the position for 2007-08, although departments will still be required to put forward spending proposals as part of the Priorities and Budget process, which will be subject to separate guidance.

Table 1- Summary of Departmental Spending Proposals – in rank order of priority

Rank | Spending Proposal | PSA/DSO | Resource (excluding Admin) Requirements (£000s): 2008-09 | 2009-10 | 2010-11
1
2
3
4
5
6
7
8
9
10
11
12
13

Table 2- Departmental Proposal Pro-Forma

1. Summary of Spending Proposal

Title:
Responsible Officer:
Spending Area & UoB:
Link to other bids:
Date HLIA completed:
EU Matched Fund bid:
Set out short summary of the main details of the spending proposal including alignment with Ministerial Priorities

2. Resource Requirements (£000’s)

Baseline 2007-08 | CSR Resource Requirements: 2008-09 | 2009-10 | 2010-11
Resource
For information only
Associated Admin
Associated Capital
Supporting evidence that level of resource requirement is the minimum necessary
Could reduced scale of bid be delivered? Yes/No

3. Public Service Impact on PSA Key Performance Indicators

PSA/DSO | Baseline 2007-08 | Projected Impact: 2008-09 | 2009-10 | 2010-11
DSO
PSA 1
PSA 2
PSA 3
How will spending proposal impact on PSAs and bring wider benefits to the public?

4. Extent to which costs are unavoidable

Unavoidable due to: | Yes/No | Details of why the pressure cannot be avoided and/or funded from within existing baselines?
Ministerial Pre-commitment
Legal/Statutory Obligation
Price Inflation
Other 1-
Other 2-
Provide details as to why the level of resources requested is the minimum necessary?

5. Positive Equality and Sustainable Development Impact

Will the spending proposal have a positive impact in terms of: Yes/No Detail
Equality
Good Relations
Poverty/Social Inclusion
Sustainable Development

Section 4 – Equality and Sustainable Development

Introduction

4.1 This section sets out the actions required of departments in assessing the potential equality, good relations, poverty, social inclusion and sustainable development impacts of spending proposals.

4.2 There have been some changes from the 2005 Priorities and Budget process. New Targeting Social Need (NTSN) has been replaced by Poverty and Social Inclusion. Public authorities also now need to take a new statutory duty regarding sustainable development into account, as they are required to act in the way best calculated to contribute to the achievement of sustainable development in Northern Ireland. This applies to all policy development, including the assessment of spending proposals.

4.3 In line with Equality Commission guidance that equality considerations should be mainstreamed into the policy decision process, there will be a greater onus on the individuals responsible for a spending proposal to ensure that the equality impact is considered in the appropriate manner.

Equality Obligations

4.4 Section 75 of, and Schedule 9 to, the Northern Ireland Act 1998 came into force on 1 January 2000 and placed a statutory obligation on public authorities to ensure that they carry out their various functions relating to Northern Ireland with due regard to the need to promote equality of opportunity between –

4.5 In addition, without prejudice to this obligation, Public Authorities are also required to have regard to the desirability of promoting good relations between persons of different religious belief, political opinion, and racial group. From January 2007 public authorities are also required to have due regard to the need to promote positive attitudes towards people with a disability and to encourage participation by people with a disability in public life.

4.6 The Government’s strategy to tackle poverty and social inclusion “Lifetime Opportunities" was published in November 2006. “Lifetime Opportunities" retains the principle of its predecessor New Targeting Social Need, which is to target resources and effort towards those in greatest objective need. These principles and considerations, which steered and informed priorities and budget proposals in the past, continue to apply. Departments should therefore identify and fully consider the anti-poverty and social inclusion implications on individuals, groups or areas of any proposed changes submitted.

Sustainable Development Obligations

4.7 In relation to sustainable development, from March 2007 public authorities (government departments and district councils) are required, in exercising their functions, to act in the way best calculated to contribute to the achievement of sustainable development in Northern Ireland. This applies to all aspects of their responsibilities, including policy and operations.

Methodology

4.8 The Priorities and Budget process deals with the totality of public expenditure and investment in Northern Ireland. Policies and programmes flowing from the Priorities and Budget are worked out in more detail by Departments, and implemented using the funding allocated in the Budget exercise. Even at this first stage, it is important to give appropriate consideration to the statutory obligations arising from the equality and good relations duties under Section 75, as well as sustainable development.

4.9 Consideration at this early stage is necessarily at a higher or strategic level, and officials will not have access to such a wealth of detailed data as at later stages. With the agreement of the Equality Commission we have termed this a High Level Impact Assessment (HLIA), as opposed to an EQIA. It is important that Departments attempt to assess the potential equality, good relations, poverty and social inclusion and sustainable development impacts of targeting and funding allocation proposals submitted as part of Priorities and Budget to allow appropriate consideration, and where appropriate, mitigations to take place.

4.10 Each of the seven stages of the EQIA process is considered and, as far as possible, woven into the HLIA process established for Priorities and Budget. Completion of these HLIAs by Departments is an important part of mainstreaming equality considerations into the Priorities and Budget process. By summarising the data and reasoning which the Department has used so far in having due regard to equality of opportunity and regard to good relations, Departments will provide the information which Ministers will use to ensure that they have proper regard to these factors in taking their decisions.

4.11 Ministers have also decided that sustainable development should underpin the development of the Priorities and Budget process. Departments must therefore demonstrate how their spending proposals contribute to this and again, fulfil the associated statutory duty.

4.12 Departments must provide the evidence of equality, good relations, poverty and social inclusion, and sustainable development impacts, and where appropriate, mitigations, on which Ministers can base their considerations. These should be provided for each spending proposal and efficiency savings option. Departments should already have produced HLIAs with respect to each efficiency option developed into respective Efficiency Delivery Plans as part of the Comprehensive Spending Review although this analysis should be further refined as more information becomes available.

4.13 HLIAs should be completed by filling in the pro forma attached at Annex A. Guidance notes are attached to the pro forma. The form seeks information at a high or strategic level. Information should be summarised into a few lines wherever possible. If a proposed target or allocation has only neutral impacts, a pro forma should still be completed, indicating this. Completion of the pro forma helps Departments to demonstrate that they have fulfilled their statutory duty at this stage of policy development. In completing the HLIA you should seek the advice of your departmental Equality Unit regarding the content of these forms.

4.14 It is important to identify the potential positive (as well as negative) impacts of proposals, as these will be important factors in Ministerial decision making. In particular, proposals that contribute to the achievement of the relevant equality, good relations, poverty and social inclusion and sustainable development strategies should be highlighted. However, it is perhaps more important to identify potential negative impacts. It should be noted that a negative impact does not necessarily mean that the proposal cannot go ahead. Mitigating measures, for example a skewing of resources towards a particular Section 75 group, can help to negate any adverse differential impact. However, Departments MUST show that mitigations have been considered. Even if these have not been worked out in detail, Departments must indicate what they know about them at this stage and explain why, despite any continuing negative impact, it is still considered appropriate to proceed with the policy. In some circumstances it will still be appropriate to proceed, but Departments and Ministers must be alert to the potential negative impacts. This information will have to be included in the consultation document, and must be robust and defensible.

4.15 If screening and equality impact assessment of relevant issues have not yet been carried out (and it will often be too early to do this), detailed data and research may not be available. This is a high level process and it is appropriate to set out as evidence the early-stage reasoning or even the hypothesis or qualitative evidence which leads the Department to the conclusions drawn. Where screening or an EQIA has not yet been carried out, Departments should ensure that this is factored into the Departments’ EQIA/Screening timetable for development and delivery. Departmental Equality Units should be informed of the timetable for EQIAs arising from proposals included in Priorities and Budget to enable inclusion in the annually published EQIA timetable. The Priorities and Budget documentation will include a list of all the proposed equality impact assessments (EQIAs) or screening exercises arising out of the allocations.

4.16 During the process of completing HLIAs, Departments will wish to consider the views of key stakeholders, including drawing on previous consultation exercises on relevant policies and previous Priorities and Budgets Plans. They can do this through engagement with key sectoral stakeholders relevant to their work. Initial consultation of this nature represents best practice in policy-making, especially in relation to Section 75.

4.17 In completing the section on sustainable development, departments will wish to refer to the screening tool (available from OFMDFM after []). This explains how policies and decisions can be screened and should be used to underpin the HLIAs. You will note that the tool poses questions about each of the three interconnected pillars (economy, society and environment) of sustainable development. Departments may wish to ensure that these are completed as necessary as they may be required to verify the scrutiny process.

Verification and Quality Assurance

4.18 Equality Commission guidance states that mainstreaming equality considerations must be central to public policy processes and uses the definition adopted by the Council of Europe: “the (re)organisation, improvement, development and evaluation of policy processes, so that a(n) … equality perspective is incorporated in all policies at all levels and at all stages, by the actors normally involved in policy making". Departments are therefore required to:

a. Ensure that individual HLIAs are completed by policy lead officials with respect to each spending proposal – this needs to be conducted in a robust and timely manner in order to inform Ministerial decisions;

b. Ensure that HLIAs comply with the specified guidance attached;

c. Ensure that they quality assure their HLIAs with advice from the departmental Equality Unit, and note that the Equality Unit has indicated that it is content; and

d. Provide DFP (which will consider the submissions jointly with OFMDFM) with a completed overall summary of the equality, good relations and poverty/social inclusion impacts of all of the department’s spending proposals. This includes a summary table of equality impacts as set out in Annex B.

4.19 It is critical that there is engagement with Departmental Equality Units as early as possible in the process of developing spending proposals.

4.20 Equality considerations form an important part of the formal consultation process when the draft Priorities and Budget document is published. The summaries produced by departments will form the basis of the equality, good relations, poverty and social inclusion and sustainable development analysis of the Priorities & Budget document, although consideration will also be given to publishing the summaries alongside the Departmental sections of the document.

Annex A

High-Level impact assessment pro forma – Priorities and Budget 2008-2011

Public Authority Date Contact person

1. Description of Funding proposal including aim/objective

 

2. Will the funding proposal impact differently on any group within the nine S75 equality categories?

Positive | Negative | Neutral | Description of impacts | Evidence used
Gender          
Religion          
Race          
Political Opinion          
Sexual Orientation          
Marital Status          
Disability          
Dependents          
Age          

3. (a) If the impacts identified are potentially positive or neutral please consider whether any further adjustments could be made that would increase equality of opportunity for groups within any of the Section 75 categories.

 

(b) If any of the impacts are negative you MUST provide evidence of consideration of all possible mitigations or policy alternatives that were considered, including data or evidence sources where available, and justification of any decision to proceed despite the negative impacts.

 

4. Will the funding proposal have impacts on individuals, groups or areas suffering from poverty and/or social exclusion?

Positive           Negative            No impact/neutral

Description of Impacts

5a. Does the funding proposal provide an opportunity to promote good relations between people of different:

Yes | No | Description of way(s) in which good relations is/could be promoted
Race      
Religion      
Political Opinion      
Sexual orientation      
Persons with a disability and persons without      

5b. Could the funding proposal inadvertently inhibit or damage good relations between groups within any of the above categories?

NO YES

(if yes, please provide description below, and justification for proceeding, notwithstanding the impact)

 

6. Will the funding proposal have impacts on the promotion of sustainable development and if so, how?

Positive          Negative             No impact/neutral

Description of Impacts:

Notes for completion of form:

1. Provide a brief description of the funding proposal you are considering, together with an explanation of the aim or objective it is designed to fulfil. A title alone will not suffice. Think SMART. Aim for a single, jargon-free sentence which will allow someone with no prior knowledge of the area to grasp what you intend to achieve through the allocation.

2. This is a high-level equality impact consideration known as a High Level Impact Assessment or HLIA which has been designed, and agreed by the Equality Commission, for the purposes of mainstreaming equality considerations arising from statutory equality duty in S75(1) of the Northern Ireland Act 1998 into the Priorities and Budget process. It does not use the detailed S75 EQIA methodology, as it is designed for use at a higher level and earlier in the process than a full EQIA. Where the topic under consideration is a policy, it MUST be subject to equality screening and, if necessary, impact assessment at a later stage. Where impacts are positive or negative, this high level form requires you to provide evidence of impacts or summarise the data considered so far where it is available. If data is not yet available, use this column to provide a brief explanation of your reasons for identifying the impacts you have noted in column 4 (such as an outline of your background knowledge or similarities to impacts in other policy areas). The availability of data will strengthen your case. In particular, some evidence or data will usually be needed to justify carrying on with a policy or allocation which has potential negative equality impacts. If data is not available in such a case, you will need extremely strong justification, and should indicate what you propose to do to obtain data, and when. The evidence base for your proposal should be summarised briefly on this form and copies kept with your own records. Departmental Equality Units will advise on your proposal’s classification into one of the following categories.

3 Major and very positive equality impacts supported by evidence.

2 Positive equality impacts, supported by evidence.

1 Minor positive equality impacts or assertion of positive impacts unsupported by any evidence.

0 Neutral equality impacts.

N Negative equality impacts which require justification.

Please note: a positive impact on the whole population is not a positive equality impact. The form seeks information about positive and negative differential impacts on equality, in other words, where one or more groups within equality categories will benefit more, or suffer a greater disadvantage than the population at large.
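Purely as an illustrative sketch (the category codes and descriptions are those listed above; the lookup itself is hypothetical), the classification could be recorded as follows:

    # Illustrative only: the HLIA classification categories listed above,
    # held as a simple lookup of code to description.
    HLIA_CATEGORIES = {
        "3": "Major and very positive equality impacts supported by evidence",
        "2": "Positive equality impacts, supported by evidence",
        "1": "Minor positive equality impacts or assertion of positive impacts unsupported by any evidence",
        "0": "Neutral equality impacts",
        "N": "Negative equality impacts which require justification",
    }

    def describe_classification(code: str) -> str:
        """Return the description for an HLIA classification code."""
        return HLIA_CATEGORIES.get(code.strip().upper(), "Unknown classification code")

    print(describe_classification("N"))   # Negative equality impacts which require justification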

3. Question 3a asks you to think creatively about whether you can further promote equality of opportunity for any other S75 group, via this policy/funding decision. If so, you should write up a brief explanatory note, for example outlining measures which could increase access amongst minority ethnic communities or by people with disabilities. You should make it clear whether these additional measures will be actioned as part of the proposal under consideration, or whether they are speculative.

Question 3b is one of the most important in the pro forma. A negative impact does not mean that you cannot proceed with the policy or proposed allocation. However, it must be mitigated or an alternative found, if at all possible. In the last resort, the negative impact must be weighed against all other factors and may be justified.

4. Tackling poverty and promoting social inclusion considerations (previously New TSN) requires the skewing of money and resources to those individuals (e.g. unemployed / long-term unemployed; low/no educational qualifications or skills), groups (e.g. people with a disability; lone parents; older people) and areas (e.g. those scoring high on measures of multiple deprivation) in greatest objective need. If the target or funding proposal has a positive impact in terms of tackling poverty and/or social inclusion, this will count in its favour when Ministerial decisions are made. Explain how and to what extent the measure will have an impact, and whether it is positive or negative.

5. S75 (2) of the N.I. Act 1998 requires a public authority to consider whether a proposal provides an opportunity to better promote good relations within 3 of the 9 S75 (1) equality criteria or may inadvertently inhibit or damage them. This is an opportunity to consider the wider context of the policy, and any collateral benefits, which may be gained. Given the current policy [A Shared Future, 21 March 2005] and the wider desirability of promoting good relations, this pro forma also asks whether there is an opportunity to promote good relations between persons within two other S75 (1) categories who may benefit from similar measures, namely people of different sexual orientation and people with and without disabilities. It is important to think proactively and creatively about this question, as promoting good relations requires fresh thinking and sensitivity to potential damage that may be caused. There will be positive S75(1) impacts where positive measures are identified but they should also be mentioned here.

6. You should note that failure to fill in questions 2, 3 and 5 (insofar as it reflects s. 75(2)) of this pro forma may demonstrate failure to comply with your statutory duties. It also prevents Ministers from effectively complying with their statutory duties in relation to your proposals. Accordingly, if this pro forma is not completed, your PSA, bid or other documentation will be returned to you without consideration, and no decision will be taken in relation to it. If you need any assistance in relation to this form, please contact your Departmental Equality Unit.

7. Similarly, failure to complete question 6 may be taken as an indication that you have not complied with your statutory duty in terms of sustainable development. It will be necessary to strike a reasonable and proportionate balance between economic, social and environmental considerations in responding to identified impacts, but sufficient justification of conclusions should be included to make the rationale behind the judgement clear.

Annex B

Department [INSERT NAME]

Departmental HLIA Summary -
Equality, Good Relations and Poverty/Social Inclusion impacts of proposals

Spending Proposal | Equality Impacts (Gender, Religion, Race, Political Opinion, Sexual Orientation, Marital Status, Disability, Dependents, Age) | Anti-Poverty/Social Inclusion Impact | Good Relations Impacts (Religion, Race, Political Opinion, Sexual Orientation, Disability) | Date HLIA Approved by Equality Unit | Comments
List here 1? 1? 1? 1? 1? 1? 1? 1? 1? +? +? +? +? +? +? 25/05/07  

[1] PESA 2006 indicates that identifiable public expenditure per head of population in Northern Ireland was 34.3% higher than in England in 2005-06 or 36.1% higher if public order and safety, and social protection are excluded.

[2] The 5% real reduction in administration costs was originally to be implemented on the basis of 2.2% inflation and hence a 2.8% nominal reduction. However, in light of the impact of the NICS settlement, the nominal savings rate for 2008-09 only has been reduced to 1%, with the overall 3% savings rate remaining constant.

[3] Whilst it is appreciated that both capital and resource projects/programmes require some form of administrative support, this will be at a lower level due to the 3% savings target while efficiencies can be derived from administrative support functions.

[4] “Ministers" should read as either “the Executive" or “Secretary of State" depending on the local political situation.

[5] The recent NICS pay settlement applies up until August 2009. Future NICS pay negotiations will need to be consistent in terms of affordability with the 2.8% nominal reductions in administration budgets for 2009-10 and 2010-11.

PfG & PSA Monitoring – Guidance

Introduction

The Monitoring Framework will collect and monitor information on the prospects for the targets and commitments within the Budget and Programme for Government being realised in full and within the resources allocated. The attached templates cover not just the 23 PSAs but also any key goals or commitments within the PfG that are not precisely covered by the 23 PSAs.

Returns must be made using the attached templates only.

Contributing departments should send their input directly to the lead department for consolidation and submission to OFMDFM/DFP. Only the lead department for each PSA should submit the final consolidated return to OFMDFM/DFP.

A PSA Monitoring Template should be completed for each PSA for which the Department is the lead. The template includes specified fields for the responses – these are embedded within the template and, where required, information should only be entered into these specified fields. In many cases relevant information from earlier monitoring rounds is retained and will only need to be updated where there have been developments.

Equality Monitoring

The Executive have agreed that departments should report on progress against addressing the issues identified in the EQIA carried out at a strategic level on the PfG, Budget and Investment Strategy. In line with the Executive’s decision, a separate template is attached setting out key issues/differentials identified and requesting an update from departments on how these have been considered in the delivery and/or development of policies.

Each department should complete the Equality template and return direct to OFMDFM/DFP.

Section 1: Lead Department and Delivery Board

The front page of the template seeks information on the Senior Responsible Owner (SRO) for the target. As agreed by the Executive on 5 March 2009, a single lead SRO must be identified for each PSA.

The templates also seek information on those individuals who comprise the Governance arrangements (Delivery Board) currently in place to oversee progress and delivery against the targets. As agreed by the Executive, where the PSA or key goal or commitment requires the delivery of actions across a number of departments/divisions, the SRO must establish a Delivery Board comprising accountable senior officials (at SCS level) from contributing departments/divisions.

There is also an opportunity, in the Governance section of the return, to elaborate further (if necessary) on the Governance arrangements themselves. Typically the relevant governance information from earlier monitoring rounds is retained and will only need to be updated if, for example, there have been any changes.

Role of the SRO

The SRO is accountable for coordinating quarterly returns to the centre and will be the first point of contact where follow up action or additional information is required.

While the SRO is not accountable for the delivery of targets outside their area of responsibility, they are responsible for ensuring that appropriate reporting mechanisms, data systems, risk management frameworks and delivery structures/processes are in place across all targets. Where the SRO has concerns in this regard, these should be set out in the appropriate sections within the template.

Section 2: Measurement

As before, the Measurement section requests an update on each of the targets within the PSA as of 31 March 2009. The update for each indicator will take one of two forms – one for Measurable Indicators and one for Narrative Indicators.

Measurable Indicators cover those targets that might typically be described as “SMART" targets. An update is sought on each, and an illustration of how to complete the Measurable Indicators within the template is included in Figure 1 below. The tables also record the value of the indicators reported in earlier monitoring rounds. Again, previously provided information on, for example, baselines and milestone targets has been retained.

For Measurable Indicators a section is also included to enable a brief narrative to be added in support of the quantitative information on the indicators. In some cases we have used the narrative section to request some specific information in relation to a particular indicator – often in relation to the timing of the next available update for the indicator. A number of the targets within the PfG have indicators that have yet to be fully developed. Recognising that the results of this monitoring round will be published, departments will wish to address this and related issues as a matter of priority.

Narrative Indicators: Once again, for those targets not defined in readily measurable terms – for example, targets that involve completing a particular action by a point in time – a narrative update is sought. Departments should avoid responding solely with simplistic statements such as “progress on track" but should instead include additional supporting commentary to briefly describe and illustrate what progress has been made to date. For example, the narrative update might briefly describe what actions have been completed or progressed to date and how this compares with the target timescales.
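A minimal sketch of how a single indicator update within a monitoring return might be structured is shown below. It is illustrative only: the field names and values are hypothetical, not fields from the actual template.

    # Illustrative sketch only: one measurable and one narrative indicator update.
    measurable_update = {
        "type": "measurable",
        "baseline": 62.0,            # value established in an earlier monitoring round
        "milestone_target": 66.0,    # interim milestone for the current year
        "latest_value": 64.5,        # position as at the end of the quarter
        "narrative": "Next update available with the annual survey in the autumn.",
    }

    narrative_update = {
        "type": "narrative",
        "narrative": ("Consultation completed in May; implementation plan drafted "
                      "and on course for approval against a December target date."),
    }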

In addition, a RAG summary section is provided for both measurable and narrative indicators which seeks a judgement, on behalf of the Department, on the current prospects of successful delivery against each of the targets. As a default, each is currently set to “RED". The box below provides a broad indication of how delivery is to be assessed against the RAG scale.

The RAG Assessment:

Red:

  • Where little or no progress has been observed;
  • Where the measured rate of progress is highly unlikely to lead to the achievement of the targeted outcome;
  • Where delivery of the targeted outcome is likely to be achieved, but with significant delay;
  • Where confirmed baselines and/or milestones have not been established;
  • Where data on progress is not yet available or is not provided.

Amber:

  • Where there is a lack of robust information on progress, or the rate of progress is less than planned, against the targeted outcome;
  • Where some measurable progress has been made but the rate of progress is less than anticipated or falling appreciably short of interim milestones;
  • Where there is significant doubt around the achievement of the target outcomes in the targeted timeframe.

Green / Amber:

  • Where progress is broadly on track and is broadly meeting interim milestones, perhaps with small but redeemable deviations from plan;
  • Progress has been good but there is diminished confidence around sustaining future progress towards the targets;
  • There is significant confidence around the prospects of getting close to the targeted outcome.

Green:

  • Where targets have already been met (and, if relevant, should continue to be met);
  • Where progress is on track and interim milestones are being achieved or exceeded;
  • Where there is significant confidence, drawing on robust monitoring systems/data, around the prospects for delivering the targeted outcome on schedule.

Departments should use the narrative update space, which is now provided for both narrative and measurable indicators, to briefly outline the rationale for the RAG rating assigned to each indicator. OFMDFM/DFP will review departmental assessments of the RAG status of targets to ensure that all assessments are consistent with the above framework.

Where Departments have assigned a RED or AMBER RAG rating to an indicator, they must outline any remedial actions being taken to address performance on that indicator.

Where an update on progress is not provided, or data is not available, the RAG status WILL BE ASSESSED AND REPORTED AS RED.
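A minimal sketch of the default rule described above is set out below. It is illustrative only: the function is hypothetical and simply encodes the rule that a missing update or missing data is reported as RED, and that RED or AMBER ratings must be accompanied by remedial actions.

    # Illustrative only: defaulting the RAG status to RED where no update or
    # data is provided, and requiring remedial actions for RED or AMBER ratings.
    RAG_SCALE = ("RED", "AMBER", "GREEN/AMBER", "GREEN")

    def report_rag(update_provided: bool, data_available: bool,
                   departmental_rating: str = "RED", remedial_actions: str = "") -> str:
        if not (update_provided and data_available):
            return "RED"
        rating = departmental_rating.upper()
        if rating not in RAG_SCALE:
            return "RED"
        if rating in ("RED", "AMBER") and not remedial_actions:
            raise ValueError("RED or AMBER ratings must outline remedial actions being taken")
        return rating

    print(report_rag(update_provided=False, data_available=False))   # RED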

Section 3: Governance

In the Governance section, a response to a number of questions is sought to provide information on the Governance arrangements that are in place in relation to the monitoring and delivery of PSA Targets. Departments have the opportunity to confirm whether an SRO and PSA Board are in place – or the embedded fields can be used to describe the arrangements that are in place. Typically the information reported in earlier returns (or an abridged version) will have been retained in the template and, as a result, Departments will only need to update this section should changes have occurred since the last return.

Section 4: Risk & Delivery Management

In relation to Risk & Delivery Management, a response is sought to a number of questions to provide confirmation that adequate mechanisms are in place to identify and deal with risks, to progress actions critical to meeting the PSA target and to manage the realisation of benefits. Once again, in order to minimise the administrative burden, responses are sought on a Y or N basis – although additional information is sought on any actions to support delivery of the targeted outcome that are not proceeding as planned.

Section 5: Financial

In the Financial section Departments / SROs are asked to confirm that the actions to support the delivery of the targeted outcome have robust estimates of the costs and benefits involved – and that costs remain within budget and spending has not slipped behind schedule. Where this cannot be confirmed Departments / SROs should provide additional information – for example on any relevant projects or programmes where there are delays or escalating costs, or on projects where costings and business cases have yet to be developed. Finally Departments / SROs are also asked to set out any approvals that have yet to be sought or secured – or alternatively confirm that all relevant approvals have already been secured.

Departments / SROs should also briefly set out the key Upcoming Actions that are intended to be progressed over the next six months to advance delivery of the PSA targets. Finally, in the Emerging Issues section, Departments / SROs should provide brief information on any emerging or potential threats to the delivery of the PSA targets. This might include such things as rising costs, project delays, a lack of cooperation from stakeholders, unexpected events, industrial action, ICT problems and so forth.

PFG Goals & Commitments

A number of Key Goals and Commitments within the PfG are not precisely covered within the 23 PSA targets. Typically there are three or four per Department. These have been identified within a separate template, and Departments should report progress once again using the approach set out in Figure 1.

[Extract]

NIA Image


ofmdfm logo
FROM: JOHN MCMILLEN

DATE: 10 JUNE 2009

TO: PSA SROs Copy Distribution List Below

Programme for Government End-Year Delivery Report – ‘RAG’ Ratings

1. Thank you for submitting your completed Programme for Government (PFG) and PSA monitoring pro forma, which provides an update on progress as at the end of March 2009. As you are aware, the information provided in these monitoring returns will form the basis of an end-year Delivery Report which will be tabled at the Executive meeting on 25 June.

2. You will recall that in the monitoring returns you were asked to award a ‘RAG’ rating against the PFG goals and commitments and the PSA indicators. The purpose of the ‘RAG’ assessment was to provide a clear and transparent assessment of progress which would allow comparison across PSAs, goals and commitments and highlight areas where we are doing well, as well as areas of concern.

3. While clear guidance was issued on how delivery should be assessed against the ‘RAG’ scale, there is scope for inconsistency in the application of ‘RAG’ ratings across and within departments. Recognising that the Delivery Report is likely to be subject to considerable scrutiny, it is imperative that there is consistency in the RAG assessment of targets and that these provide a robust and accurate reflection of progress in line with the guidance issued to departments.

4. With this in mind, a central team comprising officials from EPU, Supply and PEDU was commissioned to provide its assessment of goals, commitments and PSA indicators against the RAG scale, based on the information provided in the monitoring returns. I have attached a spreadsheet which lists the central team’s assessment of the RAG status for the indicators provided by SROs.

5. I would be grateful if you could review the spreadsheet and complete the template at Annex B, indicating whether you agree with the central team’s assessment or providing supplementary information in support of your department’s original rating where appropriate. In the absence of additional information or clarification, the central team’s assessment will stand.

6. For ease of reference, I have also attached guidance on RAG assessments at Annex C.


John McMillen

[Extract]

Annex B

RAG ASSESSMENTS

Ref Number | Key Goal | Dept’s RAG | Central Team’s RAG | Agree with Central Team Rating (yes/no) | If no, please provide information to support departmental rating

[Blank rows provided for completion]

 

PSA Number | Indicator | Dept’s RAG | Central Team’s RAG | Agree with Central Team Rating (yes/no) | If no, please provide information to support departmental rating

[Blank rows provided for completion]

Annex C

RAG Guidance

The box below provides a broad indication of how delivery is to be assessed against the RAG scale.

The RAG Assessment:

Red:

  • Where little or no progress has been observed;
  • Where the measured rate of progress is highly unlikely to lead to the achievement of the targeted outcome;
  • Where delivery of the targeted outcome is likely to be achieved, but with significant delay;
  • Where confirmed baselines and/or milestones have not been established;
  • Where data on progress is not yet available or is not provided.

Amber:

  • Where there is a lack of robust information on progress against the targeted outcome, or the rate of progress is less than planned;
  • Where some measurable progress has been made but the rate of progress is less than anticipated or falling appreciably short of interim milestones;
  • Where there is significant doubt around the achievement of the target outcomes in the targeted timeframe.

Green / Amber:

  • Where progress is broadly on track and is broadly meeting interim milestones, perhaps with small but redeemable deviations from plan;
  • Progress has been good but there is diminished confidence around sustaining future progress towards the targets;
  • There is significant confidence around the prospects of getting close to the targeted outcome.

Green:

  • Where targets have already been met (and, if relevant, should continue to be met);
  • Where progress is on track and interim milestones are being achieved or exceeded;
  • Where there is significant confidence, drawing on robust monitoring systems/data, around the prospects for delivering the targeted outcome on schedule

OFMDFM Logo


FROM: JOHN MCMILLEN

DATE: 28 September 2009

TO: PSA SROs                

Copy Distribution List Below

Programme for Government Monitoring Quarter 1 2009/2010 - ‘RAG’ Ratings

1. Thank you for submitting your completed Programme for Government (PfG) and PSA monitoring pro forma, which provides an update on progress as at the end of June 2009.

2. You will recall that in the monitoring returns you were asked to award a ‘RAG’ rating against the PFG goals and commitments and the PSA indicators. The purpose of the ‘RAG’ assessment is to provide a clear and transparent assessment of progress which allows comparison across PSAs, goals and commitments and highlights areas where we are doing well in addition to areas of concern.

3. While clear guidance was issued on how delivery should be assessed against the ‘RAG’ scale, there is scope for inconsistency in the application of ‘RAG’ ratings across and within departments. It is imperative that there is consistency in the RAG assessment of targets and that these provide a robust and accurate reflection of progress in line with the guidance issued to departments.

4. With this in mind, a central team comprising officials from EPU, Supply and PEDU was commissioned to provide its assessment of goals, commitments and PSA indicators against the RAG scale, based on the information provided in the monitoring returns. I have attached a spreadsheet which lists the central team’s assessment of the RAG status for the indicators provided by SROs.

5. I should be grateful if you would review the spreadsheet and complete the template at Annex B indicating whether you accept the central team’s assessment or providing supplementary information in support of your department’s original rating where appropriate. In the absence of additional information or clarification, the central team’s assessment will stand.

6. For ease of reference, I have also attached guidance on RAG assessments at Annex C.

Signature: John McMillen

John McMillen

[Extract]

Annex B

RAG ASSESSMENTS

Ref Number | Key Goal | Dept’s RAG | Central Team’s RAG | Agree with Central Team Rating (yes/no) | If no, please provide information to support departmental rating

[Blank rows provided for completion]

 

PSA Number | Indicator | Dept’s RAG | Central Team’s RAG | Agree with Central Team Rating (yes/no) | If no, please provide information to support departmental rating

[Blank rows provided for completion]

Annex C

RAG Guidance

The box below provides a broad indication of how delivery is to be assessed against the RAG scale.

The RAG Assessment:

Red:

  • Where little or no progress has been observed;
  • Where the measured rate of progress is highly unlikely to lead to the achievement of the targeted outcome;
  • Where delivery of the targeted outcome is likely to be achieved, but with significant delay;
  • Where confirmed baselines and/or milestones have not been established;
  • For level of service indicators, also:
      • The current level of service is some distance away from the level targeted, in excess of 10% (as opposed to 10 percentage points);
      • The service standard is within 10% of being met but is not expected to exhibit any real improvement in the future.

Amber:

  • Where there is a lack of robust information on progress against the targeted outcome, or the rate of progress is less than planned;
  • Where some measurable progress has been made but the rate of progress is less than anticipated or falling appreciably short of interim milestones;
  • Where there is significant doubt around the achievement of the target outcomes in the targeted timeframe.
  • For level of service indicators, also:
      • There is a lack of information on the current level of service;
      • The target level of service is within 10% of being met;
      • There is confidence around improving performance against the standard in the (near) future.

Green / Amber:

  • Where progress is broadly on track and is broadly meeting interim milestones, perhaps with small but redeemable deviations from plan;
  • Progress has been good but there is diminished confidence around sustaining future progress towards the targets;
  • There is significant confidence around the prospects of getting close to the targeted outcome.
  • For level of service indicators, also:
      • The target level of service is currently very close to being met (and no more than 5% away from the target);
      • There is confidence around meeting the target level of service in the future.

Green:

  • Where targets have already been met (and, if relevant, should continue to be met);
  • Where progress is on track and interim milestones are being achieved or exceeded;
  • Where there is significant confidence, drawing on robust monitoring systems/data, around the prospects for delivering the targeted outcome on schedule;
  • For level of service indicators, also:
      • The target level of service is currently being met.
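The level of service thresholds above lend themselves to a simple decision rule. The sketch below is an editorial illustration only, not part of the official guidance: the function name, the treatment of borderline and missing-data cases, and the measurement of the gap as a proportion of the target (a percentage rather than percentage points) are all assumptions made for the example.

    def level_of_service_rag(current: float, target: float,
                             improving: bool, data_available: bool = True) -> str:
        """Illustrative reading of the Annex C level-of-service thresholds.

        'current' and 'target' are service levels on the same scale. The gap is
        measured as a proportion of the target (i.e. "in excess of 10%" rather
        than 10 percentage points). How borderline and missing-data cases are
        handled here is an assumption, not a statement of the official guidance.
        """
        if not data_available:
            return "Amber"        # lack of information on the current level of service
        if current >= target:
            return "Green"        # the target level of service is currently being met
        shortfall = (target - current) / target
        if shortfall <= 0.05 and improving:
            return "Green/Amber"  # very close to the target, with confidence of meeting it
        if shortfall <= 0.10:
            # within 10% of being met: Amber if improvement is expected, Red if not
            return "Amber" if improving else "Red"
        return "Red"              # more than 10% away from the targeted level of service

In practice, of course, any such numeric reading would be applied alongside the qualitative criteria set out above.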
NIA Image

Correspondence of 4 November 2009 from Mr Richard Pengelly

DFP Logo

Richard Pengelly
Public Spending Director
Central Finance Group
Room P6
Rathgael House
Balloo Road

BANGOR BT19 7NA

Tel No: 028 91858240 (x 68240)
Fax No: 028 91858175
email: richard.pengelly@dfpni.gov.uk
and christine.mills@dfpni.gov.uk

Copy Distribution List Below:

Paul Maskey
Chair of the Public Accounts Committee
Northern Ireland Assembly
Room 371
Parliament Buildings
BELFAST BT4 3XX

4 November 2009

Dear Paul

Evidence Session on Public Service Agreements

Your letter of 12 October refers, in which you requested the PEDU perspective on potential improvements that could enhance the effectiveness of the processes underpinning Public Service Agreements.

In terms of the effectiveness of processes underpinning Public Service Agreements – which we have taken as referring to the processes around the setting, monitoring and reporting of PSA progress – I have concentrated on the three elements which are key to effective scrutiny.

Target Setting

The foundations of effective scrutiny will always lie in effective and focussed Target Setting. In relation to the overall position, there is a need to choose a balanced basket of output and outcome indicators that properly captures the broad intention of the objectives and aims of a PSA. In doing so, where output measures are included, it is important that there is a clear understanding of the linkages and influence between the output and the ultimate outcome being sought.

In relation to the individual targets within each PSA, there is also a need to ensure that there is not a heavy dependence on indicators which are prone to data weaknesses. In particular, and related to issues touched upon in the NIAO Report, there is a need to ensure there is no undue reliance on data with disproportionate sampling errors (compared to the targeted change), long lags or infrequent reporting, since these are some of the more difficult data issues to resolve.

Baselines & Milestones

For performance management, monitoring and reporting, it is imperative that the starting point is clear if there is to be confidence around the degree of any improvement that might be demonstrated. For targets within PSAs, therefore, the dataset should be capable of establishing a robust baseline (a known and robust baseline is, of course, also relevant to target setting). Beyond baselines there is also a need to set milestones that reflect a plausible path to the desired target and that take account of any information that exists on the relationship between inputs and outputs. Taken together, robust baselines and milestones are essential ingredients in establishing an effective means of performance managing the delivery of objectives and targets.
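By way of illustration only (an editorial sketch, not part of the correspondence), a plausible path from a baseline to a target can be expressed as a set of interim milestones. The example below assumes, purely for simplicity, a straight-line trajectory; the function and its parameters are hypothetical.

    from datetime import date

    def linear_milestones(baseline: float, target: float,
                          start: date, end: date,
                          checkpoints: list) -> dict:
        """Interpolate interim milestone values on a straight-line path from the
        baseline (at 'start') to the target (at 'end'). A straight line is an
        assumption for illustration; a real milestone path should reflect what is
        known about the relationship between inputs and outputs."""
        span_days = (end - start).days
        return {cp: baseline + ((cp - start).days / span_days) * (target - baseline)
                for cp in checkpoints}

    # Hypothetical example: a baseline of 80.0 rising to a target of 90.0 over a
    # three-year PSA period, with milestones at the end of each intervening year.
    milestones = linear_milestones(80.0, 90.0, date(2008, 4, 1), date(2011, 3, 31),
                                   [date(2009, 3, 31), date(2010, 3, 31)])

A straight line may of course understate or overstate the progress realistically achievable in the early years, which is why the letter stresses that milestones should draw on what is known about inputs and outputs.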

Monitoring & Reporting

Data collection might appear a mundane activity, but it is crucial to monitoring and improving performance. Data is collected to provide information; that information is used to improve knowledge and understanding; and with better understanding and knowledge, better decisions can be made and more targeted action can be taken.

Monitoring and reporting progress is about using the data to improve our information and knowledge. An effective approach to performance management helps ensure that improved information and knowledge also encourage and facilitate decision making and the taking of action, in particular corrective action. The approach employed within the PfG Monitoring Framework, built upon the interim system developed by DFP during 2008, therefore aims not just to collect information for the Executive but also to improve and encourage better performance. Through greater transparency and the central challenge function it also aims to promote realistic self-assessment of progress by departments, which in turn should encourage the available information to be used for decisions on corrective action where that is needed.

Beyond this, I would also like to see organisations make better use of their data to understand what it reveals about the efficiency and effectiveness of their processes and their approach to delivery. This is undoubtedly a longer term aim of our current approach to monitoring and reporting PfG performance to the Executive.

I hope these comments are helpful.

Richard Pengelly

Richard Pengelly

Copy Distribution List:
David Thomson
Fiona Hamill

Correspondence of 13 November 2009
from Mr David Sterling

From the Permanent Secretary
David Sterling

Ms Aoibhinn Treanor
Clerk to the Public Accounts Committee
Room 371
Parliament Buildings
Stormont
BT4 3XX

Netherleigh
Massey Avenue
BELFAST BT4 2JP
Telephone: (028) 9052 9441
Facsimile: (028) 9052 9545
Email: david.sterling@detini.gov.uk
janice.davison@detini.gov.uk

Our ref: NPS DETI 054/09
Your ref

13 November 2009

Dear Aoibhinn

PAC Evidence Session 8 October 2009

Thank you for your letter of 10 November.

During the 8 October evidence session I stated, in response to a question from Ms Purvis, that Northern Ireland’s productivity is currently 81% of the United Kingdom level. On reviewing the Hansard transcript, I realised that I was referring to the gap in economic prosperity between Northern Ireland and the UK (currently estimated to be 82%). The productivity gap between Northern Ireland and the United Kingdom (excluding the Greater South East of England) is currently estimated to be 94%. I apologise for referring to the estimates for the NI / UK prosperity gap when responding to a question relating to productivity. I should be grateful if the correction could be placed on record by publishing this letter along with the report.

Yours sincerely

DAVID STERLING

DAVID STERLING

[Extract]

Appendix 4

List of Witnesses
Who Gave Oral Evidence
to the Committee


1. Mr John McMillen, Accounting Officer, Office of the First Minister and deputy First Minister

2. Mr Damian Prince, Head of Economic Policy Unit, Office of the First Minister and deputy First Minister

3. Mr Gerry Lavery, Senior Finance Director, Department of Agriculture and Rural Development

4. Mr David Sterling, Accounting Officer, Department of Enterprise, Trade and Investment

5. Mr Kieran Donnelly, Comptroller and Auditor General

6. Ms Fiona Hamill, Deputy Treasury Officer of Accounts, Department of Finance and Personnel