Newcastle Urban Transformation and Transport Program

Transport
Planning
Compliance
Infrastructure
Management and administration
Procurement
Project management

The urban renewal projects on former railway land in the Newcastle city centre are well targeted to support the objectives of the Newcastle Urban Transformation and Transport Program (the Program), according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the government. However, the evidence that the cost of the light rail will be justified by its contribution to the Program is not convincing.

The Newcastle Urban Transformation and Transport Program (the Program) is an urban renewal and transport program in the Newcastle city centre. The Hunter and Central Coast Development Corporation (HCCDC) has led the Program since 2017. UrbanGrowth NSW led the Program from 2014 until 2017. Transport for NSW has been responsible for delivering the transport parts of the Program since the Program commenced. All references to HCCDC in this report relate to both HCCDC and its predecessor, the Hunter Development Corporation. All references to UrbanGrowth NSW in this report relate only to its Newcastle office from 2014 to 2017.

This audit had two objectives:

  1. To assess the economy of the approach chosen to achieve the objectives of the Program.
  2. To assess the effectiveness of the consultation and oversight of the Program.

We addressed the audit objectives by answering the following questions:

a) Was the decision to build light rail an economical option for achieving Program objectives?
b) Has the best value been obtained for the use of the former railway land?
c) Was good practice used in consultation on key Program decisions?
d) Did governance arrangements support delivery of the Program?

Conclusion
1. The urban renewal projects on the former railway land are well targeted to support the objectives of the Program. However, there is insufficient evidence that the cost of the light rail will be justified by its contribution to Program objectives.

The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the Government. HCCDC, and previously UrbanGrowth NSW, identified and considered options for land use that would best meet Program objectives. Required probity processes were followed for developments that involved financial transactions. Our audit did not assess the achievement of these objectives because none of the projects have been completed yet.

Analysis presented in the Program business case and other planning documents showed that the light rail would have small transport benefits and was expected to make a modest contribution to broader Program objectives. Analysis in the Program business case argued that despite this, the light rail was justified because it would attract investment and promote economic development around the route. The Program business case referred to several international examples to support this argument, but did not make a convincing case that these examples were comparable to the proposed light rail in Newcastle.

The audited agencies argue that the contribution of light rail cannot be assessed separately because it is part of a broader Program. The cost of the light rail makes up around 53 per cent of the total Program funding. Given the cost of the light rail, agencies need to be able to demonstrate that this investment provides value for money by making a measurable contribution to the Program objectives.

2. Consultation and oversight were mostly effective during the implementation stages of the Program. There were weaknesses in both areas in the planning stages.

Consultations about the urban renewal activities from around 2015 onward followed good practice standards. These consultations were based on an internationally accepted framework and met their stated objectives. Community consultations on the decision to close the train line were held in 2006 and 2009. However, the final decision in 2012 was made without a specific community consultation. There was no community consultation on the decision to build a light rail.

The governance arrangements that were in place during the planning stages of the Program did not provide effective oversight. This meant there was not a single agreed set of Program objectives until 2016 and roles and responsibilities for the Program were not clear. Leadership and oversight improved during the implementation phase of the Program. Roles and responsibilities were clarified and a multi-agency steering committee was established to resolve issues that needed multi-agency coordination.

The light rail is not justified by conventional cost-benefit analysis and there is insufficient evidence that the indirect contribution of light rail to achieving the economic development objectives of the Program will justify the cost.
Analysis presented in Program business cases and other planning documents showed that the light rail would have small transport benefits and was expected to make a modest contribution to broader Program objectives. Analysis in the Program business case argued that despite this, the light rail was justified because it would attract investment and promote economic development around the route. The Program business case referred to several international examples to support this argument, but did not make a convincing case that these examples were comparable to the proposed light rail in Newcastle.
The business case analysis of the benefits and costs of light rail was prepared after the decision to build light rail had been made and announced. Our previous reports, and recent reports by others, have emphasised the importance of completing thorough analysis before announcing infrastructure projects. Some advice provided after the initial light rail decision was announced was overly optimistic. It included benefits that cannot reasonably be attributed to light rail and underestimated the scope and cost of the project.
The audited agencies argue that the contribution of light rail cannot be assessed separately because it is part of a broader Program. The cost of the light rail makes up around 53 per cent of the total Program funding. Given the high cost of the light rail, we believe agencies need to be able to demonstrate that this investment provides value for money by making a measurable contribution to the Program objectives.

Recommendations
For future infrastructure programs, NSW Government agencies should support economical decision-making on infrastructure projects by:
  • providing balanced advice to decision makers on the benefits and risks of large infrastructure investments at all stages of the decision-making process
  • providing scope and cost estimates that are as accurate and complete as possible when initial funding decisions are being made
  • making business cases available to the public.

The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the government.

The planned uses of the former railway land align with the objectives of encouraging people to visit and live in the city centre, creating attractive public spaces, and supporting growth in employment in the city. The transport benefits of the activities are less clear, because the light rail is the major transport project and this will not make significant improvements to transport in Newcastle.

The processes used for selling and leasing parts of the former railway land followed industry standards. Options for the former railway land were identified and assessed systematically. Competitive processes were used for most transactions and the required assessment and approval processes were followed. The sale of land to the University of Newcastle did not use a competitive process, but required processes for direct negotiations were followed.

Recommendation
By March 2019, the Hunter and Central Coast Development Corporation should:
  • work with relevant stakeholders to explore options for increasing the focus on the heritage objective of the Program in projects on the former railway land. This could include projects that recognise the cultural and industrial heritage of Newcastle.

Consultations about the urban renewal activities followed good practice standards, but consultation on transport decisions for the Program did not.

Consultations focusing on urban renewal options for the Program included a range of stakeholders and provided opportunities for input into decisions about the use of the former railway land. These consultations received mostly positive feedback from participants. Changes and additions were made to the objectives of the Program and specific projects in response to feedback received. 

There had been several decades of debate about the potential closure of the train line, including community consultations in 2006 and 2009. However, the final decision to close the train line was made and announced in 2012 without a specific community consultation. HCCDC states that consultation with industry and business representatives constitutes community consultation because industry representatives are also members of the community. This does not meet good practice standards because it is not a representative sample of the community.

There was no community consultation on the decision to build a light rail. There were subsequent opportunities for members of the community to comment on the implementation options, but the decision to build it had already been made. A community and industry consultation was held on which route the light rail should use, but the results of this were not made public. 

Recommendation
For future infrastructure programs, NSW Government agencies should consult with a wide range of stakeholders before major decisions are made and announced, and report publicly on the results and outcomes of consultations. 

The governance arrangements that were in place during the planning stages of the Program did not provide effective oversight. Project leadership and oversight improved during the implementation phase of the Program.

Multi-agency coordination and oversight were ineffective during the planning stages of the Program. Examples include: multiple versions of Program objectives being in circulation; unclear reporting lines for project management groups; and poor role definition for the initial advisory board. Program ownership was clarified in mid-2016 with the appointment of a new Program Director with clear accountability for the delivery of the Program. This was supported by the creation of a multi-agency steering committee that was more effective than previous oversight bodies.

The limitations that existed in multi-agency coordination and oversight had some negative consequences in important aspects of project management for the Program. This included whole-of-government benefits management and the coordination of work to mitigate impacts of the Program on small businesses.

Recommendations
For future infrastructure programs, NSW Government agencies should: 

  • develop and implement a benefits management approach from the beginning of a program to ensure responsibility for defining benefits and measuring their achievement is clear
  • establish whole-of-government oversight early in the program to guide major decisions. This should include:
    • agreeing on objectives and ensuring all agencies understand these
    • clearly defining roles and responsibilities for all agencies
    • establishing whole-of-government coordination for the assessment and mitigation of the impact of major construction projects on businesses and the community.

By March 2019, the Hunter and Central Coast Development Corporation should update and implement the Program Benefits Realisation Plan. This should include:

  • setting measurable targets for the desired benefits
  • clearly allocating ownership for achieving the desired benefits
  • monitoring progress toward achieving the desired benefits and reporting publicly on the results.

Appendix one - Response from agencies    

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #310 - released 12 December 2018

Unsolicited proposal process for the lease of Ausgrid

Premier and Cabinet
Asset valuation
Infrastructure
Internal controls and governance
Management and administration
Procurement
Project management
Service delivery
Shared services and collaboration

In October 2016, the NSW Government accepted an unsolicited proposal from IFM Investors and AustralianSuper to lease 50.4 per cent of Ausgrid for 99 years. The deal followed the Federal Government’s rejection of two bids from foreign investors, for national security reasons.

A performance audit of the lease of Ausgrid has found shortcomings in the unsolicited proposal process. Releasing the audit findings today, the Auditor-General for New South Wales, Margaret Crawford said ‘this transaction involved a $20 billion asset owned by the people of New South Wales. As such, it warranted strict adherence to established guidelines’.

Ausgrid is a distributor of electricity to eastern parts of Sydney, the Central Coast, Newcastle and the Hunter Region.

In June 2014, the then government announced its commitment to lease components of the state's electricity network as part of the Rebuilding NSW plan. Implementation of the policy began after the government was re-elected in 2015. Between November 2015 and August 2016, the NSW Government held a competitive tender process to lease 50.4 per cent of Ausgrid for 99 years. The NSW Government abandoned the process on 19 August 2016 after the Australian Treasurer rejected two bids from foreign investors, for national security reasons. That day, the Premier and Treasurer released a media statement clarifying the government's objective to complete the transaction via a competitive process in time to include the proceeds in the 2017–18 budget.

On 31 August 2016, the state received an unsolicited proposal from IFM Investors and AustralianSuper to acquire an interest in Ausgrid under the same terms proposed by the state during the tender process. In October 2016, the government accepted the unsolicited proposal. 

This audit examined whether the unsolicited proposal process for the partial long-term lease of Ausgrid was effectively conducted and in compliance with the government’s 2014 Unsolicited Proposals: Guide for Submission and Assessment (Unsolicited Proposals Guide or the Guide). 

The audit focused on how the government-appointed Assessment Panel and Proposal Specific Steering Committee assessed key requirements in the Guide that unsolicited proposals must be demonstrably unique and represent value for money. 

Conclusion

The evidence available does not conclusively demonstrate the unsolicited proposal was unique, and there were some shortcomings in the negotiation process, documentation and segregation of duties. That said, before the final commitment to proceed with the lease, the state obtained assurance that the proposal delivered value for money. 

It is particularly important to demonstrate unsolicited proposals are unique, in order to justify the departure from other transaction processes that offer greater competition, transparency and certainty about value for money.

The Assessment Panel and the Proposal Specific Steering Committee determined the Ausgrid unsolicited proposal was unique, primarily on the basis that the proponent did not require foreign investment approval from the Australian Treasurer, and the lease transaction could be concluded earlier than through a second tender process. However, the evidence that persuaded the Panel and Committee did not demonstrate that no other proponent could conclude the transaction in time to meet the government’s deadline. 

It is not appropriate to determine an unsolicited proposal is unique because it delivers an earlier outcome than possible through a tender process. The Panel and Committee did not contend, and it is not evident, that the unsolicited proposal was the only way to meet the government’s transaction deadline.

The evidence does not demonstrate that the proponent was the only party that would not have needed foreign investment approval to participate in the transaction. It also does not demonstrate that the requirement for foreign investment approval would have reduced the pool of foreign buyers to the degree that it would be reasonable to assume none would emerge. 

The Panel, Committee and financial advisers determined that the final price represented value for money, and that retendering offered a material risk of a worse financial outcome. However, an acceptable price was revealed early in the negotiation process, and doing so made it highly unlikely that the proponent would offer a higher price than that disclosed. The Department of Premier and Cabinet (DPC) and NSW Treasury were not able to provide a documented reserve price, bargaining strategy or similar which put the negotiations in context. It is not evident that the Panel or Committee authorised, justified or endorsed negotiations in advance. 

Key aspects of governance recommended by the Guide were in place. Some shortcomings relating to role segregation, record keeping and probity assurance weakened the effectiveness of the unsolicited proposal process adopted for Ausgrid.

The reasons for accepting that the proposal and proponent were unique are not compelling.

The Unsolicited Proposals Guide says the 'unique benefits of the proposal and the unique ability of the proponent to deliver the proposal' must be demonstrated. 

The conclusion reached by the Panel and Committee that the proposal offered a ‘unique ability to deliver (a) strategic outcome’ was primarily based on the proponent not requiring foreign investment approval from the Australian Treasurer, and allowing the government to complete the lease transaction earlier than by going through a second tender process. 

It is not appropriate to determine an unsolicited proposal is unique because it delivers an earlier outcome than possible through a tender process. The Panel and Committee did not contend, and it is not evident, that the unsolicited proposal was the only way to meet the government’s transaction deadline.

The evidence does not demonstrate that the proponent was the only party that would not have needed foreign investment approval to participate in the transaction. Nor does it demonstrate that the requirement for foreign investment approval would have reduced the pool of foreign buyers to the degree that it would be reasonable to assume none would emerge. 

That said, the Australian Treasurer’s decision to reject the two bids from the previous tender process created uncertainty about the conditions under which he would approve international bids. The financial advisers engaged for the Ausgrid transaction informed the Panel and Committee that:

  • it was not likely another viable proponent would emerge soon enough to meet the government’s transaction deadline
  • the market would be unlikely to deliver a better result than offered by the proponent
  • going to tender presented a material risk of a worse financial result. 

The Unsolicited Proposals Guide says that a proposal to directly purchase or acquire a government-owned entity or property will generally not be unique. The Ausgrid unsolicited proposal fell into this category. 

Recommendations:
DPC should ensure future Assessment Panels and Steering Committees considering a proposal to acquire a government business or asset:

  • recognise that when considering uniqueness they should: 
    • require very strong evidence to decide that both the proponent and proposal are the only ones of their kind that could meet the government’s objectives 
    • give thorough consideration to any reasonable counter-arguments against uniqueness.
  • rigorously consider all elements of the Unsolicited Proposals Guide when determining whether a proposal should be dealt with as an unsolicited proposal, and document these deliberations and all relevant evidence
  • do not use speed of transaction compared to a market process as justification for uniqueness.

The process to obtain assurance that the final price represented value for money was adequate. However, the negotiation approach reduced assurance that the bid price was maximised.

The Panel and Committee concluded the price represented value for money, based on peer-reviewed advice from their financial advisers and knowledge acquired from previous tenders. The financial advisers also told the Panel and Committee that there was a material risk the state would receive a lower price than offered by the unsolicited proposal if it immediately proceeded with a second market transaction. 

The state commenced negotiations on price earlier than the Guide says it should have. Early disclosure of a price that the state would accept reduced the likelihood of achieving a price greater than this. DPC says the intent of this early meeting was to establish quickly whether the proponents could meet the state’s benchmark, rather than spend more time and resources on a proposal which had no prospect of proceeding.

DPC and NSW Treasury were not able to provide a documented reserve price, negotiation strategy or similar which put the negotiations and price achieved in context. It was not evident that the Panel or Committee authorised, justified or endorsed negotiations in advance. However, the Panel and Committee endorsed the outcomes of the negotiations. 

The negotiations were informed by the range of prices achieved for similar assets and the specific bids for Ausgrid from the earlier market process.

Recommendations:
DPC should ensure any future Assessment Panels and Steering Committees considering a proposal to acquire a government business or asset:

  • document a minimum acceptable price, and a negotiating strategy designed to maximise price, before commencing negotiations
  • do not communicate an acceptable price to the proponent before the negotiation stage of the process, and then only as part of a documented bargaining strategy.

Key aspects of governance recommended by the Guide were in place, but there were some shortcomings around role segregation, record keeping and probity assurance.

The state established a governance structure in accordance with the Unsolicited Proposals Guide, including an Assessment Panel and Proposal Specific Steering Committee. The members of the Panel and Steering Committee were senior and experienced officers, as befitted the size and nature of the unsolicited proposal. 

The separation of negotiation, assessment and review envisaged by the Guide was not maintained fully. The Chair of the Assessment Panel and a member of the Steering Committee were involved in negotiations with the proponent. 

DPC could not provide comprehensive records of some key interactions with the proponent or a documented negotiation strategy. The absence of such records means the Department cannot demonstrate engagement and negotiation processes were authorised and rigorous. 

The probity adviser reported there were no material probity issues with the transaction. The probity adviser also provided audit services. This is not good practice. The same party should not provide both advisory and audit services on the same transaction.

Recommendations:
DPC should ensure any future Assessment Panels and Steering Committees considering a proposal to acquire a government entity or asset:
  • maintain separation between negotiation, assessment and review in line with the Unsolicited Proposals Guide
  • keep an auditable trail of documentation relating to the negotiation process
  • maintain separation between any probity audit services engaged and the probity advisory and reporting services recommended in the current Guide.

Mobile speed cameras

Transport
Compliance
Financial reporting
Information technology
Internal controls and governance
Management and administration
Regulation
Service delivery

Key aspects of the state’s mobile speed camera program need to be improved to maximise road safety benefits, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Mobile speed cameras are deployed in a limited number of locations, and a small number of these locations are used frequently. This, along with decisions to limit the hours that mobile speed cameras operate and to use multiple warning signs, has reduced the broad deterrence of speeding across the general network - the main policy objective of the mobile speed camera program.

The primary goal of speed cameras is to reduce speeding and make the roads safer. Our 2011 performance audit on speed cameras found that, in general, speed cameras change driver behaviour and have a positive impact on road safety.

Transport for NSW published the NSW Speed Camera Strategy in June 2012 in response to our audit. According to the Strategy, the main purpose of mobile speed cameras is to reduce speeding across the road network by providing a general deterrence through anywhere, anytime enforcement and by creating a perceived risk of detection across the road network. Fixed and red-light speed cameras aim to reduce speeding at specific locations.

Roads and Maritime Services and Transport for NSW deploy mobile speed cameras (MSCs) in consultation with NSW Police. The cameras are operated by contractors authorised by Roads and Maritime Services. MSC locations are stretches of road that can be more than 20 kilometres long. MSC sites are specific places within these locations that meet the requirements for an MSC vehicle to be able to operate there.

This audit assessed whether the mobile speed camera program is effectively managed to maximise road safety benefits across the NSW road network.

Conclusion

The mobile speed camera program requires improvements to key aspects of its management to maximise road safety benefits. While camera locations have been selected based on crash history, the limited number of locations restricts network coverage. It also makes enforcement more predictable, reducing the ability to provide a general deterrence. Implementation of the program has been consistent with government decisions to limit its hours of operation and use multiple warning signs. These factors limit the ability of the mobile speed camera program to effectively deliver a broad general network deterrence from speeding.

Many locations are needed to enable network-wide coverage and to ensure MSC sessions are randomised and not predictable. However, there are not enough available locations that meet the strict criteria for crash history, operator safety, signage and technical requirements. MSC performance would be improved if there were more locations.

A scheduling system is meant to randomise MSC location visits to ensure they are not predictable. However, a relatively small number of locations have been visited many times, making deployment more predictable in these places. The allocation of MSCs across the time of day, day of week and across regions is prioritised based on crash history, but the frequency of location visits does not correspond with the crash risk for each location.
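
A minimal sketch of the kind of scheduling the paragraph above describes: if session locations were drawn at random with probabilities weighted by each location's crash risk, visit frequency would track risk while individual visits remained unpredictable. The locations, risk weights and session counts below are hypothetical and for illustration only, not drawn from the audit.

    import random

    # Hypothetical crash-risk weights for four MSC locations (higher = riskier).
    crash_risk = {"Location A": 8.0, "Location B": 4.0, "Location C": 2.0, "Location D": 1.0}

    def schedule_sessions(num_sessions, risk, seed=None):
        """Draw enforcement sessions at random, weighted by crash risk, so that
        visit frequency tracks risk without making individual visits predictable."""
        rng = random.Random(seed)
        locations = list(risk)
        weights = [risk[loc] for loc in locations]
        return [rng.choices(locations, weights=weights, k=1)[0] for _ in range(num_sessions)]

    sessions = schedule_sessions(1000, crash_risk, seed=1)
    for loc in crash_risk:
        share = sessions.count(loc) / len(sessions)
        print(f"{loc}: {share:.0%} of sessions (risk weight {crash_risk[loc]})")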

There is evidence of a reduction in fatal and serious crashes at the 30 best-performing MSC locations. However, there is limited evidence that the current MSC program in NSW has led to a behavioural change in drivers by creating a general network deterrence. While the overall reduction in serious injuries on roads has continued, fatalities have started to climb again. Compliance with speed limits has improved at the sites and locations where MSCs operate, but the results of overall network speed surveys vary, with recent improvements in some speed zones but not others.

There is no supporting justification for the number of hours of operation for the program. The rate of MSC enforcement (hours per capita) in NSW is lower than in Queensland and Victoria. The government decision to use multiple warning signs has made it harder to identify and maintain suitable MSC locations, and has impeded their use for enforcement in both traffic directions and in school zones.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #308 - released 18 October 2018

Progress and measurement of the Premier's Priorities

Premier and Cabinet
Compliance
Internal controls and governance
Management and administration
Project management
Risk
Service delivery
Shared services and collaboration
Workforce and capability

The Premier’s Implementation Unit uses a systematic approach to measuring and reporting progress towards the Premier’s Priorities performance targets, but public reporting needs to improve, according to a report released today by the Auditor-General of NSW, Margaret Crawford.

The Premier of New South Wales has established 12 Premier’s Priorities. These are key performance targets for government.

The 12 Premier's Priorities
  • 150,000 new jobs by 2019

  • Reduce the volume of litter by 40 per cent by 2020

  • 10 key projects in metro and regional areas to be delivered on time and on budget, and nearly 90 local infrastructure projects to be delivered on time

  • Increase the proportion of NSW students in the top two NAPLAN bands by eight per cent by 2019

  • Increase the proportion of women in senior leadership roles in the NSW Government sector from 33 to 50 per cent by 2025 and double the number of Aboriginal and Torres Strait Islander people in senior leadership roles in the NSW Government sector, from 57 to 114

  • Increase the proportion of young people who successfully move from Specialist Homelessness Services to long-term accommodation to more than 34 per cent by 2019

  • 61,000 housing completions on average per year to 2021

  • Reduce the proportion of domestic violence perpetrators reoffending by 25 per cent by 2021

  • Improve customer satisfaction with key government services every year, this term of government to 2019

  • Decrease the percentage of children and young people re-reported at risk of significant harm by 15 per cent by 2020

  • 81 per cent of patients through emergency departments within four hours by 2019

  • Reduce overweight and obesity rates of children by five percentage points by 2025


Source: Department of Premier and Cabinet, Premier’s Priorities website.

Each Premier’s Priority has a lead agency and minister responsible for achieving the performance target.

The Premier’s Implementation Unit (PIU) was established within the Department of Premier and Cabinet (DPC) in 2015. The PIU is a delivery unit that supports agencies to measure and monitor performance, make progress toward the Premier’s Priorities targets, and report progress to the Premier, key ministers and the public.

This audit assessed how effectively the NSW Government is progressing and reporting on the Premier's Priorities.

The Premier’s Implementation Unit (PIU) is effective in assisting agencies to make progress against the Premier’s Priorities targets. Progress reporting is regular, but transparency to the public is weakened by the lack of information about specific measurement limitations and the lack of clarity about the relationship of the targets to broader government objectives.

The PIU provides a systematic approach to measuring performance and reporting progress towards the Premier's Priorities performance targets. Public reporting would be improved with additional information about the rationale for choosing specific targets to report on broader government objectives. The data used to measure the Premier’s Priorities comes from a variety of government and external datasets, some of which have known limitations. These limitations are not revealed in public reporting, and only some are revealed in progress reported to the Premier and ministers. This limits the transparency of reporting.

The PIU assists agencies to avoid unintended outcomes that can arise from prioritising particular performance measures over other areas of activity. The PIU has adopted a collaborative approach to assisting agencies to analyse performance using data, and helping them work across organisational silos to achieve the Premier’s Priorities targets.

Data used to measure progress for some of the Premier’s Priorities has limitations which are not made clear when progress is reported. This reduces transparency about the reported progress. Public reporting would also be improved with additional information about the relationship between specific performance measures and broader government objectives.

The PIU is responsible for reporting progress to the Premier, key ministers and the public. Agencies provide performance data and some play a role in preparing progress reports for the Premier and ministers. For 11 of the Premier's Priorities, progress is reported against measurable and time-related performance targets. For the infrastructure priority, progress is reported against project milestones.

Progress of some Priorities is measured using data that has known limitations, which should be noted wherever progress is reported. For example, the data used to report on housing completions does not take housing demolitions into account, and therefore overstates the contribution of this performance measure to housing supply. This known limitation is not explained in progress reports or on the public website.
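
As a hypothetical illustration of that limitation: the net addition to housing supply is completions less demolitions, so reporting completions alone overstates the measure. The figures below are invented for illustration and are not drawn from the audit.

    # Hypothetical annual figures, for illustration only.
    completions = 61000   # dwellings completed (the reported measure)
    demolitions = 9000    # dwellings demolished (not captured by the measure)

    net_additions = completions - demolitions
    overstatement = completions - net_additions

    print(f"Reported completions:     {completions:,}")
    print(f"Net additions to supply:  {net_additions:,}")
    print(f"Overstatement:            {overstatement:,} dwellings")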

Data used to measure progress is sourced from a mix of government and external datasets. Updated progress data for most Premier’s Priorities is published on the Premier’s Priorities website annually, although reported to the Premier and key ministers more frequently. The PIU reviews the data and validates it through fieldwork with front line agencies. The PIU also assists agencies to avoid unintended outcomes that can arise from prioritising single performance measures. Most, but not all, agencies use additional indicators to check for misuse of data or perverse outcomes.

We examined the reporting processes and controls for five of the Premier’s Priorities. We found that there is insufficient assurance over the accuracy of the data on housing approvals.

The relationship between performance measures and broader government objectives is not always clearly explained on the Premier’s Priorities website, which is the key source of public information about the Premier’s Priorities. For example, the Premier’s Priority to reduce litter volumes is communicated as “Keeping our Environment Clean.” While the website explains why reducing litter is important, it does not clearly explain why that particular target has been chosen to measure progress in keeping the environment clean.

By December 2018, the Department of Premier and Cabinet should:

  1. improve transparency of public reporting by:
    • providing information about limitations of reported data and associated performance
    • clarifying the relationship between the Premier’s Priorities performance targets and broader government objectives.
  2. ensure that processes to check and verify data are in place for all agency data sources
  3. encourage agencies to develop and implement additional supporting indicators for all Premier’s Priority performance measures to prevent and detect unintended consequences or misuse of data.

The Premier's Implementation Unit is effective in supporting agencies to deliver progress towards the Premier’s Priority targets.

The PIU promotes a systematic approach to monitoring and reporting progress against a target, based on a methodology used in delivery units elsewhere in the world. The PIU undertakes internal self-evaluation, and commissions regular reviews of methodology implementation from the consultancy that owns the methodology and helped to establish the PIU. However, the unit lacks periodic independent reviews of its overall effectiveness. The PIU has adopted a collaborative approach and assists agencies to analyse performance using data, and to work across organisational silos to achieve the Premier’s Priorities targets.

Agency representatives recognise the benefits of being responsible for a Premier's Priority and speak of the value of being held to account and having the attention of the Premier and senior ministers.

By June 2019, the Department of Premier and Cabinet should:

  1. establish routine collection of feedback about PIU performance including:
    • independent assurance of PIU performance
    • opportunity for agencies to provide confidential feedback.

Managing Antisocial behaviour in public housing

Community Services
Asset valuation
Infrastructure
Regulation
Service delivery
Workforce and capability

The Department of Family and Community Services (FACS) has not adequately supported or resourced its staff to manage antisocial behaviour in public housing, according to a report released today by the Deputy Auditor-General for New South Wales, Ian Goodwin.

In recent decades, policy makers and legislators in Australian states and territories have developed and implemented initiatives to manage antisocial behaviour in public housing environments. All jurisdictions now have some form of legislation or policy to encourage public housing tenants to comply with rules and obligations of ‘good neighbourliness’. In November 2015, the NSW Parliament changed legislation to introduce a new approach to manage antisocial behaviour in public housing. This approach is commonly described as the ‘strikes’ approach. 

When introduced in the NSW Parliament, the ‘strikes’ approach was described as a means to:

  • improve the behaviour of a minority of tenants engaging in antisocial behaviour 
  • create better, safer communities for law abiding tenants, including those who are ageing and vulnerable.

FACS has a number of tasks as a landlord, including a responsibility to collect rent and organise housing maintenance. FACS also has a role to support tenants with complex needs and manage antisocial behaviour. These roles have some inherent tensions. The FACS antisocial behaviour management policy aims are: 

to balance the responsibilities of tenants, the rights of their neighbours in social housing, private residents and the broader community with the need to support tenants to sustain their public housing tenancies.

This audit assessed the efficiency and effectiveness of the ‘strikes’ approach to managing antisocial behaviour in public housing environments.

We examined whether:

  • the approach is being implemented as intended and leading to improved safety and security in social housing environments
  • FACS and its partner agencies have the capability and capacity to implement the approach
  • there are effective mechanisms to monitor, report and progressively improve the approach.
Conclusion

FACS has not adequately supported or resourced its staff to implement the antisocial behaviour policy. FACS antisocial behaviour data is incomplete and unreliable. Accordingly, there is insufficient data to determine the nature and extent of the problem and whether the implementation of the policy is leading to improved safety and security.

FACS management of minor and moderate incidents of antisocial behaviour is poor. FACS has not dedicated sufficient training to equip frontline housing staff with the relevant skills to apply the antisocial behaviour management policy. At more than half of the housing offices we visited, staff had not been trained to:

  • conduct effective interviews to determine whether an antisocial behaviour complaint can be substantiated

  • de-escalate conflict and manage complex behaviours when required

  • properly manage the safety of staff and tenants

  • establish information sharing arrangements with police

  • collect evidence that meets requirements at the NSW Civil and Administrative Tribunal

  • record and manage antisocial behaviour incidents using the information management system HOMES ASB.

When frontline housing staff are informed about serious and severe illegal antisocial behaviour incidents, they generally refer them to the FACS Legal Division. Staff in the Legal Division are trained and proficient in managing antisocial behaviour in compliance with the policy, and therefore the more serious incidents are managed effectively using HOMES ASB.


FACS provides housing services to most remote townships via outreach visits from the Dubbo office. In remote townships, the policy is not being fully implemented due to insufficient frontline housing staff. There is very limited knowledge of the policy in these areas and FACS data shows few recorded antisocial behaviour incidents in remote regions. 


The FACS information management system (HOMES ASB) is poorly designed and has significant functional limitations that impede the ability of staff to record and manage antisocial behaviour. Staff at most of the housing offices we visited were unable to accurately record antisocial behaviour matters in HOMES ASB, making the data incorrect and unreliable.

Matching skills training with market needs

Industry
Compliance
Internal controls and governance
Management and administration
Risk
Service delivery
Workforce and capability

The NSW Department of Industry targets subsidies towards training programs delivering skills most needed in New South Wales. However, the Department still provides subsidies to qualifications that the market may no longer need, according to a report released by Margaret Crawford, Auditor-General for New South Wales. 

In 2012, governments across Australia entered into the National Partnership Agreement on Skills Reform. Under the National Partnership Agreement, the Australian Government provided incentive payments to States and Territories to move towards a more contestable Vocational Education and Training (VET) market. The aim of the National Partnership Agreement was to foster a more accessible, transparent, efficient and high quality training sector that is responsive to the needs of students and industry. 

The New South Wales Government introduced the Smart and Skilled program in response to the National Partnership Agreement. Through Smart and Skilled, students can choose a vocational course from a list of approved qualifications and training providers. Students pay the same fee for their chosen qualification regardless of the selected training provider and the government covers the gap between the student fee and the fixed price of the qualification through a subsidy paid to their training provider. 
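
A simple sketch of that funding arrangement: the subsidy paid to the chosen training provider is the fixed price of the qualification less the standard student fee, so it is the same whichever approved provider the student selects. The price and fee figures below are hypothetical, not actual Smart and Skilled amounts.

    def provider_subsidy(fixed_price, student_fee):
        """Government subsidy = fixed qualification price - standard student fee."""
        return fixed_price - student_fee

    # Hypothetical qualification, for illustration only.
    fixed_price = 8000   # fixed price set for the qualification
    student_fee = 1500   # fee paid by the student, identical across providers

    print(f"Subsidy paid to the chosen provider: ${provider_subsidy(fixed_price, student_fee):,}")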

Smart and Skilled commenced in January 2015, with the then Department of Education and Communities having primary responsibility for its implementation. Since July 2015, the NSW Department of Industry (the Department) has been responsible for VET in New South Wales and the implementation of Smart and Skilled. 

The NSW Skills Board, comprising nine part-time members appointed by the Minister for Skills, provides independent strategic advice on VET reform and funding. In line with most other States and Territories, the Department maintains a 'Skills List' which contains government subsidised qualifications to address identified priority skill needs in New South Wales.

This audit assessed the effectiveness of the Department in identifying, prioritising, and aligning course subsidies to the skill needs of NSW. To do this we examined whether:

  • the Department effectively identifies and prioritises present and future skill needs 
  • Smart and Skilled funding is aligned with the priority skill areas
  • skill needs and available VET courses are effectively communicated to potential participants and training providers.

Smart and Skilled is a relatively new and complex program, and is being delivered in the context of significant reform to VET nationally and in New South Wales. A large scale government funded contestable market was not present in the VET sector in New South Wales before the introduction of Smart and Skilled. This audit's findings should be considered in that context.
 

Conclusion

The Department effectively consults with industry, training providers and government departments to identify skill needs, and targets subsidies to meet those needs. However, the Department does not have a robust, data-driven process to remove subsidies from qualifications which are no longer a priority. There is a risk that some qualifications are being subsidised which do not reflect the skill needs of New South Wales.

The Department needs to better use the data it has, and collect additional data, to support its analysis of priority skill needs in New South Wales, and direct funding accordingly.

In addition to subsidising priority qualifications, the Department promotes engagement in skills training by:
  • funding scholarships and support for disadvantaged students
  • funding training in regional and remote areas
  • providing additional support to deliver some qualifications that the market is not providing.

The Department needs to evaluate these funding strategies to ensure they are achieving their goals. It should also explore why training providers are not delivering some priority qualifications through Smart and Skilled.

Training providers compete for funding allocations based on their capacity to deliver. The Department successfully manages the budget by capping funding allocated to each Smart and Skilled training provider. However, training providers have only one year of funding certainty at present. Training providers that are performing well are not rewarded with greater certainty.

The Department needs to improve its communication with prospective students to ensure they can make informed decisions in the VET market.

The Department also needs to communicate more transparently to training providers about its funding allocations and decisions about changes to the NSW Skills List. 

The NSW Skills List is unlikely to be missing high priority qualifications, but may include lower priority qualifications because the Department does not have a robust process to identify and remove these qualifications from the list. The Department needs to better use available data, and collect further data, to support decisions about which qualifications should be on the NSW Skills List.

The Department relies on stakeholder proposals to update the NSW Skills List. Stakeholders include industry, training providers and government departments. These stakeholders, particularly industry, are likely to be aware of skill needs, and have a strong incentive to propose qualifications that address these needs. The Department’s process of collecting stakeholder proposals helps to ensure that it can identify qualifications needed to address material skill needs. 

It is also important that the Department ensures the NSW Skills List only includes priority qualifications that need to be subsidised by government. The Department does not have robust processes in place to remove qualifications from the NSW Skills List. As a result, there is a risk that the list may include lower priority skill areas. Since the NSW Skills List was first created, new additions to the list have outnumbered those removed by five to one.

The Department does not always validate information gathered from stakeholder proposals, even when it has data to do so. Further, its decision making about what to include on, or delete from, the NSW Skills List is not transparent because the rationale for decisions is not adequately documented. 

The Department is undertaking projects to better use data to support its decisions about what should be on the NSW Skills List. Some of these projects should deliver useful data soon, but some can only provide useful information when sufficient trend data is available. 

Recommendation

The Department should: 

  • by June 2019, increase transparency of decisions about proposed changes to the NSW Skills List and improve record-keeping of deliberations regarding these changes
  • by December 2019, use data more effectively and consistently to ensure that the NSW Skills List only includes high priority qualifications.

The Department funds training providers that deliver qualifications on the NSW Skills List. Alignment of funding to skill needs relies on the accuracy of the NSW Skills List, which may include some lower priority qualifications.

Only qualifications on the NSW Skills List are eligible for subsidies under Smart and Skilled. As the Department does not have a robust process for removing low priority qualifications from the NSW Skills list, some low priority qualifications may be subsidised. 

The Department allocates the Smart and Skilled budget through contracts with Smart and Skilled training providers. Training providers that meet contractual obligations and perform well in terms of enrolments and completion rates are rewarded with renewed contracts and more funding for increased enrolments, but these decisions are not based on student outcomes. The Department reduces or removes funding from training providers that do not meet quality standards, breach contract conditions or that are unable to spend their allocated funding effectively. Contracts are for only one year, offering training providers little funding certainty. 

Smart and Skilled provides additional funding for scholarships and for training providers in locations where the cost of delivery is high or to those that cater to students with disabilities. The Department has not yet evaluated whether this additional funding is achieving its intended outcomes. 

Eight per cent of the qualifications that have been on the NSW Skills List since 2015 are not delivered under Smart and Skilled anywhere in New South Wales. A further 14 per cent of the qualifications that are offered by training providers have had no student commencements. The Department is yet to identify the reasons that these high priority qualifications are either not offered or not taken up by students.

Recommendation

The Department should:

  • by June 2019, investigate why training providers do not offer, and prospective students do not enrol in, some Smart and Skilled subsidised qualifications 
  • by December 2019, evaluate the effectiveness of Smart and Skilled funding which supplements standard subsidies for qualifications on the NSW Skills List, to determine whether it is achieving its objectives
  • by December 2019, provide longer term funding certainty to high performing training providers, while retaining incentives for them to continue to perform well.

The Department needs to improve its communication, particularly with prospective students.

In a contestable market, it is important for consumers to have sufficient information to make informed decisions. The Department does not provide some key information to prospective VET students to support their decisions, such as measures of provider quality and examples of employment and further education outcomes of students completing particular courses. Existing information is spread across numerous channels and is not presented in a user friendly manner. This is a potential barrier to participation in VET for those less engaged with the system or less ICT literate.

The Department conveys relevant information about the program to training providers through its websites and its regional offices. However, it could better communicate some specific information directly to individual Smart and Skilled training providers, such as reasons their proposals to include new qualifications on the NSW Skills List are accepted or rejected. 

While the Department is implementing a communication strategy for VET in New South Wales, it does not have a specific communications strategy for Smart and Skilled which comprehensively identifies the needs of different stakeholders and how these can be addressed. 

Recommendation

By December 2019, the Department should develop and implement a specific communications strategy for Smart and Skilled to:

  • support prospective student engagement and informed decision making
  • meet the information needs of training providers.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #305 - released 26 July 2018

HealthRoster benefits realisation

Health
Compliance
Information technology
Management and administration
Project management
Workforce and capability

The HealthRoster system is delivering some business benefits but Local Health Districts are yet to use all of its features, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. HealthRoster is an IT system designed to more effectively roster staff to meet the needs of Local Health Districts and other NSW health agencies.

The NSW public health system employs over 100,000 people in clinical and non-clinical roles across the state. With increasing demand for services, it is vital that NSW Health effectively rosters staff to ensure high quality and efficient patient care, while maintaining good workplace practices to support staff in demanding roles.

NSW Health is implementing HealthRoster as its single state-wide rostering system to more effectively roster staff according to the demands of each location. Between 2013–14 and 2016–17, our financial audits of individual Local Health Districts reported issues with rostering and payroll processes and systems.

NSW Health grouped all Local Health Districts (LHDs), and other NSW Health organisations, into four clusters to manage the implementation of HealthRoster over four years. Refer to Exhibit 4 for a list of the NSW Health entities in each cluster.

  • Cluster 1 implementation commenced in 2014–15 and was completed in 2015–16.
  • Cluster 2 implementation commenced in 2015–16 and was completed in 2016–17.
  • Cluster 3 began implementation in 2016–17 and was underway during the conduct of the audit.
  • Cluster 4 began planning for implementation in 2017–18.

Full implementation, including capability for centralised data and reporting, is planned for completion in 2019.

This audit assessed the effectiveness of the HealthRoster system in delivering business benefits. In making this assessment, we examined whether:

  • expected business benefits of HealthRoster were well-defined
  • HealthRoster is achieving business benefits where implemented.

The HealthRoster project has a timespan from 2009 to 2019. We examined the HealthRoster implementation in LHDs, and other NSW Health organisations, focusing on the period from 2014, when eHealth assumed responsibility for project implementation, to early 2018.

Conclusion

The HealthRoster system is realising functional business benefits in the LHDs where it has been implemented. In these LHDs, financial control of payroll expenditure and rostering compliance with employment award conditions has improved. However, these LHDs are not measuring the value of broader benefits such as better management of staff leave and overtime.

NSW Health has addressed the lessons learned from earlier implementations to improve later implementations. Business benefits identified in the business case were well defined and are consistent with business needs identified by NSW Health. Three of four cluster 1 LHDs have been able to reduce the number of issues with rostering and payroll processes. LHDs in earlier implementations need to use HealthRoster more effectively to ensure they are getting all available benefits from it.

HealthRoster is taking six years longer, and costing $37.2 million more, to fully implement than originally planned. NSW Health attributes the increased cost and extended timeframe to the large scale and complexity of the full implementation of HealthRoster.

Business benefits identified for HealthRoster accurately reflect business needs.

NSW Health has a good understanding of the issues in previous rostering systems and has designed HealthRoster to adequately address these issues. Interviews with frontline staff indicate that HealthRoster facilitates rostering which complies with industrial awards. This is a key business benefit that supports the provision of quality patient care. We saw no evidence that any major business needs or issues with the previous rostering systems are not being addressed by HealthRoster.
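
As a purely illustrative sketch of the kind of award-compliance check a rostering system performs, the following compares one employee's shifts against two hypothetical rules (a minimum break between shifts and a weekly hours cap). The rule values and function name are assumptions for illustration only, not taken from HealthRoster or any specific NSW Health award.

```python
from datetime import datetime, timedelta

# Hypothetical award rules for illustration; real awards vary by classification.
MIN_BREAK_BETWEEN_SHIFTS = timedelta(hours=10)
MAX_HOURS_PER_WEEK = 38

def check_roster(shifts):
    """Return a list of rule breaches for one employee's weekly roster.

    `shifts` is a list of (start, end) datetime pairs, assumed sorted by start time.
    """
    breaches = []

    # Rule 1: minimum rest break between consecutive shifts.
    for (_, prev_end), (next_start, _) in zip(shifts, shifts[1:]):
        if next_start - prev_end < MIN_BREAK_BETWEEN_SHIFTS:
            breaches.append(f"Insufficient break before shift starting {next_start}")

    # Rule 2: total rostered hours must not exceed the weekly cap.
    total_hours = sum((end - start).total_seconds() / 3600 for start, end in shifts)
    if total_hours > MAX_HOURS_PER_WEEK:
        breaches.append(f"Weekly hours {total_hours:.1f} exceed cap of {MAX_HOURS_PER_WEEK}")

    return breaches

# Example: two shifts only eight hours apart trigger the rest-break rule.
roster = [
    (datetime(2018, 3, 5, 7, 0), datetime(2018, 3, 5, 15, 30)),
    (datetime(2018, 3, 5, 23, 30), datetime(2018, 3, 6, 7, 30)),
]
print(check_roster(roster))
```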

In the period examined in this audit since 2015, NSW Health has applied appropriate project management and governance structures to ensure that risks and issues are well managed during HealthRoster implementation.

HealthRoster has had two changes to its budget and timeline. Overall, the capital cost of the project has increased from $88.6 million to $125.6 million (42 per cent), and expected project completion has been delayed by four years, from 2015 to 2019. NSW Health attributes the increased cost and extended timeframe to the large scale and complexity of the full implementation of HealthRoster.

NSW Health has established appropriate governance arrangements to ensure that HealthRoster is successfully implemented and that it will achieve business benefits in the long term. During implementation, local steering committees monitor risks and resolve implementation issues. Risks or issues that cannot be resolved locally are escalated to the state-wide steering committee.

NSW Health has grouped local health districts, and other NSW Health organisations, into four clusters for implementation. This has enabled NSW Health to apply lessons learnt from each implementation to improve future implementations.

NSW Health has a benefits realisation framework, but it is not fully applied to HealthRoster.

NSW Health can demonstrate that HealthRoster has delivered some functional business benefits, including rosters that comply with a wide variety of employment awards.

NSW Health is not yet measuring and tracking the value of business benefits achieved. It did not have benefits realisation plans, with baseline measures defined, for LHDs in clusters 1 and 2 before implementation, and without baseline measures it cannot quantify the business benefits achieved. However, analysis of post-implementation reviews and interviews with frontline staff indicate that benefits are being achieved. NSW Health now includes defining baseline measures and setting targets as part of LHD implementation planning, and has created a benefits realisation toolkit to assist this process from cluster 3 implementations onwards.
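
To illustrate why baseline measures matter, the sketch below quantifies a single hypothetical benefit, reduced overtime, against a pre-implementation baseline and a target. All names and figures are invented for illustration and are not drawn from NSW Health's benefits realisation toolkit.

```python
# Hypothetical figures: average monthly overtime hours per FTE in one LHD.
baseline_overtime_hours = 6.2   # measured before implementation (the baseline)
current_overtime_hours = 5.1    # measured after implementation
target_overtime_hours = 4.5     # target set in the benefits realisation plan

reduction = baseline_overtime_hours - current_overtime_hours
progress_to_target = reduction / (baseline_overtime_hours - target_overtime_hours)

print(f"Overtime reduced by {reduction:.1f} hours per FTE per month "
      f"({progress_to_target:.0%} of the way to target)")
# Without the baseline measurement, neither the reduction nor the progress
# toward target could be calculated.
```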

NSW Health conducted post-implementation reviews for clusters 1 and 2 and found that LHDs in these clusters were not using HealthRoster to realise all the benefits that HealthRoster could deliver.

By September 2018, NSW Health should:

  1. Ensure that Local Health Districts undertake benefits realisation planning according to the NSW Health benefits realisation framework
  2. Regularly measure benefits realised, at state and local health district levels, from the state-wide implementation of HealthRoster
  3. Review the use of HealthRoster in Local Health Districts in clusters 1 and 2 and assist them to improve their HealthRoster related processes and practices.

By June 2019, NSW Health should:

  1. Ensure that all Local Health Districts are effectively using demand-based rostering.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #301 - released 7 June 2018

Published

Actions for Managing risks in the NSW public sector: risk culture and capability

Managing risks in the NSW public sector: risk culture and capability

Finance
Health
Justice
Treasury
Internal controls and governance
Management and administration
Risk
Workforce and capability

The Ministry of Health, NSW Fair Trading, NSW Police Force, and NSW Treasury Corporation are taking steps to strengthen their risk culture, according to a report released today by the Auditor-General, Margaret Crawford. 'Senior management communicates the importance of managing risk to their staff, and there are many examples of risk management being integrated into daily activities', the Auditor-General said.

However, we found that three of the agencies we examined could strengthen their culture so that all employees feel comfortable speaking openly about risks. To support innovation, senior management could also do better at communicating to their staff the levels of risk they are willing to accept.

Effective risk management is essential to good governance, and supports staff at all levels to make informed judgements and decisions. At a time when government is encouraging innovation and exploring new service delivery models, effective risk management is about seizing opportunities as well as managing threats.

Over the past decade, governments and regulators around the world have increasingly turned their attention to risk culture. It is now widely accepted that organisational culture is a key element of risk management because it influences how people recognise and engage with risk. Neglecting this ‘soft’ side of risk management can prevent institutions from managing risks that threaten their success and lead to missed opportunities for change, improvement or innovation.

This audit assessed how effectively NSW Government agencies are building risk management capabilities and embedding a sound risk culture throughout their organisations. To do this we examined whether:

  • agencies can demonstrate that senior management is committed to risk management
  • information about risk is communicated effectively throughout agencies
  • agencies are building risk management capabilities.

The audit examined four agencies: the Ministry of Health, the NSW Fair Trading function within the Department of Finance, Services and Innovation, NSW Police Force and NSW Treasury Corporation (TCorp). NSW Treasury was also included as the agency responsible for the NSW Government's risk management framework.

Conclusion
All four agencies examined in the audit are taking steps to strengthen their risk culture. In these agencies, senior management communicates the importance of managing risk to their staff. They have risk management policies and funded central functions to oversee risk management. We also found many examples of risk management being integrated into daily activities.
That said, three of the four case study agencies could do more to understand their existing risk culture. As good practice, agencies should monitor their employees’ attitude to risk. Without a clear understanding of how employees identify and engage with risk, it is difficult to tell whether the 'tone' set by the executive and management is aligned with employee behaviours.
Our survey of risk culture found that three agencies could strengthen a culture of open communication, so that all employees feel comfortable speaking openly about risks. To support innovation, senior management could also do better at communicating to their staff the levels of risk they are willing to accept.
Some agencies are performing better than others in building their risk capabilities. Three case study agencies have reviewed the risk-related skills and knowledge of their workforce, but only one agency has addressed the gaps the review identified. In three agencies, staff also need more practical guidance on how to manage risks that are relevant to their day-to-day responsibilities.
NSW Treasury provides agencies with direction and guidance on risk management through policy and guidelines. Its principles-based approach to risk management is consistent with better practice. Nevertheless, there is scope for NSW Treasury to develop additional practical guidance and tools to support a better risk culture in the NSW public sector. NSW Treasury should encourage agency heads to form a view on the current risk culture in their agencies, identify desirable changes to that risk culture, and take steps to address those changes. 

In assessing an agency’s risk culture, we focused on four key areas:

Executive sponsorship (tone at the top)

In the four agencies we reviewed, senior management is communicating the importance of managing risk. They have endorsed risk management frameworks and funded central functions tasked with overseeing risk management within their agencies.

That said, we found that three case study agencies do not measure their existing risk culture. Without clear measures of how employees identify and engage with risk, it is difficult for agencies to tell whether employees' behaviours are aligned with the 'tone' set by the executive and management.

For example, in some agencies we examined we found a disconnect between risk tolerances espoused by senior management and how these concepts were understood by staff.

Employee perceptions of risk management

Our survey of staff indicated that while senior leaders have communicated the importance of managing risk, more could be done to strengthen a culture of open communication so that all employees feel comfortable speaking openly about risks. We found that senior management could better communicate to their staff the levels of risk they should be willing to accept.

Integration of risk management into daily activities and links to decision-making

We found examples of risk management being integrated into daily activities. On the other hand, we also identified areas where risk management deviated from good practice. For example, we found that corporate risk registers are not consistently used as a tool to support decision-making.

Support and guidance to help staff manage risks

Most case study agencies are monitoring risk-related skills and knowledge of their workforce, but only one agency has addressed the gaps it identified. While agencies are providing risk management training, surveyed staff in three case study agencies reported that risk management training is not adequate.

NSW Treasury provides agencies with direction and guidance on risk management through policy and guidelines. In line with better practice, NSW Treasury's principles-based policy acknowledges that individual agencies are in a better position to understand their own risks and design risk management frameworks that address those risks. Nevertheless, there is scope for NSW Treasury to refine its guidance material to support a better risk culture in the NSW public sector.

Recommendation

By May 2019, NSW Treasury should:

  • Review the scope of its risk management guidance, and identify additional guidance, training or activities to improve risk culture across the NSW public sector. This should focus on encouraging agency heads to form a view on the current risk culture in their agencies, identify desirable changes to that risk culture, and take steps to address those changes.

Published

Actions for Detecting and responding to cyber security incidents

Detecting and responding to cyber security incidents

Finance
Cyber security
Information technology
Internal controls and governance
Management and administration
Workforce and capability

A report released today by the Auditor-General for New South Wales, Margaret Crawford, found there is no whole-of-government capability to detect and respond effectively to cyber security incidents. There is very limited sharing of information on incidents amongst agencies, and some agencies have poor detection and response practices and procedures.

The NSW Government relies on digital technology to deliver services, organise and store information, manage business processes, and control critical infrastructure. The increasing global interconnectivity between computer networks has dramatically increased the risk of cyber security incidents. Such incidents can harm government service delivery and may include the theft of information, denial of access to critical technology, or even the hijacking of systems for profit or malicious intent.

This audit examined cyber security incident detection and response in the NSW public sector. It focused on the role of the Department of Finance, Services and Innovation (DFSI), which oversees the Information Security Community of Practice, the Information Security Event Reporting Protocol, and the Digital Information Security Policy (the Policy).

The audit also examined ten case study agencies to develop a perspective on how they detect and respond to incidents. We chose agencies that are collectively responsible for personal data, critical infrastructure, financial information and intellectual property.

Conclusion
There is no whole‑of‑government capability to detect and respond effectively to cyber security incidents. There is limited sharing of information on incidents amongst agencies, and some of the agencies we reviewed have poor detection and response practices and procedures. There is a risk that incidents will go undetected longer than they should, and opportunities to contain and restrict the damage may be lost.
Given current weaknesses, the NSW public sector’s ability to detect and respond to incidents needs to improve significantly and quickly. DFSI has started to address this by appointing a Government Chief Information Security Officer (GCISO) to improve cyber security capability across the public sector. Her role includes coordinating efforts to increase the NSW Government’s ability to respond to and recover from whole‑of‑government threats and attacks.

Some of our case study agencies had strong processes for detection and response to cyber security incidents but others had a low capability to detect and respond in a timely way.

Most agencies have access to an automated tool for analysing logs generated by their IT systems. However, coverage of these tools varies. Some agencies do not have an automated tool and only review logs periodically or on an ad hoc basis, meaning they are less likely to detect incidents.
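
As an illustration of what automated log analysis adds over periodic manual review, the sketch below flags repeated failed logins from a single source address. The log format, field names and threshold are assumptions for illustration, not a description of any agency's actual tooling.

```python
from collections import Counter

# Assumed threshold: flag any source IP with ten or more failed logins.
FAILED_LOGIN_THRESHOLD = 10

def detect_bruteforce(log_lines):
    """Count failed-login events per source IP and flag likely brute-force attempts.

    Assumes log lines in the form: '<timestamp> FAILED_LOGIN user=<name> src=<ip>'.
    """
    failures = Counter()
    for line in log_lines:
        if "FAILED_LOGIN" in line:
            src = next((f.split("=", 1)[1] for f in line.split() if f.startswith("src=")), None)
            if src:
                failures[src] += 1
    return {ip: count for ip, count in failures.items() if count >= FAILED_LOGIN_THRESHOLD}

# Example with synthetic log lines.
sample = ["2018-03-01T10:00:00 FAILED_LOGIN user=admin src=203.0.113.7"] * 12
print(detect_bruteforce(sample))   # {'203.0.113.7': 12}
```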

Few agencies have contractual arrangements in place for IT service providers to report incidents to them. If a service provider elects not to report an incident, the agency's response will be delayed, which may result in increased damage.

Most case study agencies had procedures for responding to incidents, although some lack guidance on who to notify and when. Some agencies do not have response procedures at all, limiting their ability to minimise the business damage that may flow from a cyber security incident. Few agencies could demonstrate that they have trained their staff in incident detection or response procedures, and most could provide little information on staff roles and responsibilities for detection and response.

Most agencies’ incident procedures contain limited information on how to report an incident, who to report it to, when this should occur and what information should be provided. None of our case study agencies’ procedures mentioned reporting to DFSI, highlighting that even though reporting is mandatory for most agencies, their procedures do not require it.

Case study agencies provided little evidence to indicate they are learning from incidents, meaning that opportunities to better manage future incidents may be lost.

Recommendations

The Department of Finance, Services and Innovation should:

  • assist agencies by providing:
    • better practice guidelines for incident detection, response and reporting to help agencies develop their own practices and procedures
    • training and awareness programs, including tailored programs for a range of audiences such as cyber professionals, finance staff, and audit and risk committees
    • role requirements and responsibilities for cyber security across government, relevant to size and complexity of each agency
    • a support model for agencies that have limited detection and response capabilities
       
  • revise the Digital Information Security Policy and Information Security Event Reporting Protocol by:
    • clarifying what security incidents must be reported to DFSI and when
    • extending mandatory reporting requirements to those NSW Government agencies not currently covered by the policy and protocol, including State owned corporations.

DFSI lacks a clear mandate or capability to provide effective detection and response support to agencies, and there is limited sharing of information on cyber security incidents.

DFSI does not currently have a clear mandate and the necessary resources and systems to detect, receive, share and respond to cyber security incidents across the NSW public sector. It does not have a clear mandate to assess whether agencies have an acceptable detection and response capability. It is aware of deficiencies in agencies and across whole‑of‑government, and has begun to conduct research into this capability.

Intelligence gathering across the public sector is also limited, meaning agencies may not respond to threats in a timely manner. DFSI has not allocated resources for gathering of threat intelligence and communicating it across government, although it has begun to build this capacity.

Incident reporting to DFSI is mandatory for most agencies. However, most of our case study agencies do not report incidents to DFSI, reducing the likelihood of containing an incident if it spreads to other agencies. When incidents have been reported, DFSI has not provided dedicated resources to assess them and coordinate the public sector’s response. There are currently no formal requirements for DFSI to respond to incidents and no guidance on what it is meant to do if an incident is reported. The lack of central coordination in incident response risks delays and increased damage to multiple agencies.

DFSI's reporting protocol is weak and does not clearly specify what agencies should report and when. This makes agencies less likely to report incidents. The lack of a standard format for incident reporting, and of a consistent method for assessing an incident, including the level of risk associated with it, also makes it difficult for DFSI to determine an appropriate response.

There are limited avenues for sharing information amongst agencies after incidents have been resolved, meaning the public sector may be losing valuable opportunities to improve its protection and response.

Recommendations

The Department of Finance, Services and Innovation should:

  • develop whole‑of‑government procedure, protocol and supporting systems to effectively share reported threats and respond to cyber security incidents impacting multiple agencies, including follow-up and communicating lessons learnt
  • develop a means by which agencies can report incidents more effectively, such as a secure online template that allows for early warnings, standardised details of incidents and remedial advice (a minimal sketch of one possible template structure follows these recommendations)
  • enhance NSW public sector threat intelligence gathering and sharing including formal links with Australian Government security agencies, other states and the private sector
  • direct agencies to include standard clauses in contracts requiring IT service providers to report all cyber security incidents within a reasonable timeframe
  • provide assurance that agencies have appropriate reporting procedures and report to DFSI as required by the policy and protocol by:
    • extending the attestation requirement within the DISP to cover procedures and reporting
    • reviewing a sample of agencies' incident reporting procedures each year.
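
To make the 'standardised details of incidents' recommendation above concrete, the sketch below shows one possible structure for a reported incident. The field names and severity scale are assumptions for illustration only, not a DFSI-endorsed schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class IncidentReport:
    """One possible shape for a standardised incident report (illustrative only)."""
    agency: str
    detected_at: datetime
    incident_type: str        # e.g. phishing, malware, unauthorised access
    severity: str             # e.g. low / medium / high / critical
    systems_affected: list
    actions_taken: str
    assistance_required: bool

report = IncidentReport(
    agency="Example Agency",
    detected_at=datetime(2018, 2, 14, 9, 30),
    incident_type="phishing",
    severity="medium",
    systems_affected=["staff email"],
    actions_taken="Compromised account disabled; passwords reset.",
    assistance_required=False,
)

# A consistent structure like this allows early warnings to be issued and
# incidents to be compared and assessed centrally.
print(json.dumps(asdict(report), default=str, indent=2))
```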

Published

Actions for Council reporting on service delivery

Council reporting on service delivery

Local Government
Compliance
Internal controls and governance
Management and administration
Service delivery

New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also did not report on the cost of services, making it difficult for communities to see how efficiently services are being delivered. Councils are also not consistently publishing targets to demonstrate what they are striving for.

I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.

My new mandate supports the Parliament’s objectives to:

  • strengthen governance and financial oversight in the local government sector
  • improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.

Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.

For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.

In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.

Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.

Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in decisions about how and to whom they provide services, so it is important that local communities have access to information about how well services are being delivered and meeting community needs. Ultimately, councils should aim to ensure that their performance reporting is subject to quality controls designed to provide independent assurance.

Conclusion
While councils report on outputs, reporting on outcomes and performance over time can be improved. Improved reporting would include objectives with targets against which performance can be tracked over time. This would help communities understand what services are being delivered, how efficiently and effectively they are being delivered, and what improvements are being made.
To ensure greater transparency on service effectiveness and efficiency, the Office of Local Government (OLG) should work with councils to develop guidance principles to improve reporting on service delivery to local communities. This audit identified an interest amongst councils in improving their reporting and broad agreement with the good practice principles developed as part of the audit.
The Integrated Planning and Reporting Framework (the Framework), which councils are required to use to report on service delivery, is intended to promote better practice. However, the Framework is silent on efficiency reporting and provides limited guidance on how long-term strategic documents link with annual reports produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.
OLG should also work with state agencies to reduce the overall reporting burden on councils by consolidating state agency reporting requirements. 

Councils report extensively on the things they have done, but minimally on the outcomes of that effort, on efficiency, and on performance over time.

Councils could improve reporting on service delivery by more clearly relating the resources used to the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.
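
As a simple illustration of the kind of efficiency reporting this would involve, the calculation below relates the cost of one hypothetical service to its outputs and a published target. The service and all figures are invented for illustration.

```python
# Hypothetical annual figures for one council service (library lending).
service_cost = 1_250_000      # total annual cost of the service, in dollars
outputs = 480_000             # library loans for the year
target_cost_per_loan = 2.50   # target published in the council's delivery program

cost_per_loan = service_cost / outputs
print(f"Cost per loan: ${cost_per_loan:.2f} (target ${target_cost_per_loan:.2f})")
# Reporting the unit cost against a target, year on year, shows efficiency trends
# rather than just the volume of activity delivered.
```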

Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.

The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.

Recommendation

By mid-2018, OLG should:

  • assist rural councils to develop their reporting capability.

The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery, and provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

Recommendation

By mid-2018, OLG should:

  • issue additional guidance on good practice in council reporting, with specific information on:
    • reporting on performance against targets
    • reporting on performance against outcome
    • assessing and reporting on efficiency and cost effectiveness
    • reporting performance over time
    • clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
    • defining reporting terms to encourage consistency.

The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports

The guidelines produced by OLG in 2013 to assist councils to implement their Framework requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This weakness was identified in both the 2012 performance audit report and the Local Government Reform Panel's review of council reporting on service delivery.

The Framework and supporting documents provide limited guidance on reporting

Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:

  • good practice in reporting to the community
  • how the annual report links with other plans and reports required by the Framework.

Further, the Framework includes both Annual Reports and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the Annual Reports produced during a council’s term. The relationship between the two types of report is not clear.

OLG is reviewing the Framework and guidance

OLG commenced a review of the Framework in 2013, but this was deferred and work restarted in 2017. The revised guidelines and manual were expected to be released in late 2017.

OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports

The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.

Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.

Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.

Recommendations

By mid-2018, OLG should:

  • commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
  • progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.

Streamlining the reporting burden would help councils improve reporting

The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 draft IPART ‘Review of reporting and compliance burdens on Local Government’ noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.

Tailored surveys of their communities may be more beneficial than a state-wide survey in helping rural councils define outcome indicators

Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results from these are used by councils to set service delivery targets and report on outcomes. This helps to drive service delivery in line with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils which generally have the resources needed to run them.

OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn from a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state level survey is not sufficiently detailed or specific enough to be used as a tool in setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in these areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. These could potentially be achieved through regional cooperation between groups of similar councils or regional groups.

Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities

The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.

OLG commenced work in 2012 to build a new performance measurement framework for councils, which aimed to move away from compliance reporting. This work was also strongly influenced by the approach used in Victoria, which requires councils to report on a set of 79 indicators published on the Victorian 'Know your council' website. OLG’s work did not fully progress at the time, and several other local government representative bodies have since commenced work to establish performance measurement frameworks. OLG advised us it has recently recommenced its work on this project.

Our consultation identified some desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishment of a set of indicators that councils can use to measure performance and drive improvement in service delivery.

Several councils we spoke with were concerned that current approaches to comparative reporting did not adequately acknowledge that councils need to tailor their service types, levels and mix to the needs of their community. Comparative reporting approaches tend to focus on output measures, such as the number of applications processed, annual library loans and opening hours for sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where councils have made a decision based on community consultation, local priorities and available resources. To mitigate this, it is important to:

  • adopt a partnership approach to the development of indicators
  • ensure indicators measure performance, not just level of activity
  • compare performance between councils that are similar in terms of size and location.

It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.

Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:

  • consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
  • adopting a partnership approach to development of common indicators with groups of like councils.