
Newcastle Urban Transformation and Transport Program

Transport
Planning
Compliance
Infrastructure
Management and administration
Procurement
Project management

The urban renewal projects on former railway land in the Newcastle city centre are well targeted to support the objectives of the Newcastle Urban Transformation and Transport Program (the Program), according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the government. However, the evidence that the cost of the light rail will be justified by its contribution to the Program is not convincing.

The Newcastle Urban Transformation and Transport Program (the Program) is an urban renewal and transport program in the Newcastle city centre. The Hunter and Central Coast Development Corporation (HCCDC) has led the Program since 2017. UrbanGrowth NSW led the Program from 2014 until 2017. Transport for NSW has been responsible for delivering the transport parts of the Program since the Program commenced. All references to HCCDC in this report relate to both HCCDC and its predecessor, the Hunter Development Corporation. All references to UrbanGrowth NSW in this report relate only to its Newcastle office from 2014 to 2017.

This audit had two objectives:

  1. To assess the economy of the approach chosen to achieve the objectives of the Program.
  2. To assess the effectiveness of the consultation and oversight of the Program.

We addressed the audit objectives by answering the following questions:

a) Was the decision to build light rail an economical option for achieving Program objectives?
b) Has the best value been obtained for the use of the former railway land?
c) Was good practice used in consultation on key Program decisions?
d) Did governance arrangements support delivery of the program?

Conclusion
1. The urban renewal projects on the former railway land are well targeted to support the objectives of the Program. However, there is insufficient evidence that the cost of the light rail will be justified by its contribution to Program objectives.

The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the Government. HCCDC, and previously UrbanGrowth NSW, identified and considered options for land use that would best meet Program objectives. Required probity processes were followed for developments that involved financial transactions. Our audit did not assess the achievement of these objectives because none of the projects have been completed yet.

Analysis presented in the Program business case and other planning documents showed that the light rail would have small transport benefits and was expected to make a modest contribution to broader Program objectives. Analysis in the Program business case argued that despite this, the light rail was justified because it would attract investment and promote economic development around the route. The Program business case referred to several international examples to support this argument, but did not make a convincing case that these examples were comparable to the proposed light rail in Newcastle.

The audited agencies argue that the contribution of light rail cannot be assessed separately because it is a part of a broader Program. The cost of the light rail makes up around 53 per cent of the total Program funding. Given the cost of the light rail, agencies need to be able to demonstrate that this investment provides value for money by making a measurable contribution to the Program objectives.

2. Consultation and oversight were mostly effective during the implementation stages of the Program. There were weaknesses in both areas in the planning stages.

Consultations about the urban renewal activities from around 2015 onward followed good practice standards. These consultations were based on an internationally accepted framework and met their stated objectives. Community consultations on the decision to close the train line were held in 2006 and 2009. However, the final decision in 2012 was made without a specific community consultation. There was no community consultation on the decision to build a light rail.

The governance arrangements that were in place during the planning stages of the Program did not provide effective oversight. This meant there was not a single agreed set of Program objectives until 2016 and roles and responsibilities for the Program were not clear. Leadership and oversight improved during the implementation phase of the Program. Roles and responsibilities were clarified and a multi-agency steering committee was established to resolve issues that needed multi-agency coordination.
The light rail is not justified by conventional cost-benefit analysis and there is insufficient evidence that the indirect contribution of light rail to achieving the economic development objectives of the Program will justify the cost.

Analysis presented in Program business cases and other planning documents showed that the light rail would have small transport benefits and was expected to make a modest contribution to broader Program objectives. Analysis in the Program business case argued that despite this, the light rail was justified because it would attract investment and promote economic development around the route. The Program business case referred to several international examples to support this argument, but did not make a convincing case that these examples were comparable to the proposed light rail in Newcastle.

The business case analysis of the benefits and costs of light rail was prepared after the decision to build light rail had been made and announced. Our previous reports, and recent reports by others, have emphasised the importance of completing thorough analysis before announcing infrastructure projects. Some advice provided after the initial light rail decision was announced was overly optimistic. It included benefits that cannot reasonably be attributed to light rail and underestimated the scope and cost of the project.

The audited agencies argue that the contribution of light rail cannot be assessed separately because it is part of a broader Program. The cost of the light rail makes up around 53 per cent of the total Program funding. Given the high cost of the light rail, we believe agencies need to be able to demonstrate that this investment provides value for money by making a measurable contribution to the Program objectives.

Recommendations
For future infrastructure programs, NSW Government agencies should support economical decision-making on infrastructure projects by:
  • providing balanced advice to decision makers on the benefits and risks of large infrastructure investments at all stages of the decision-making process
  • providing scope and cost estimates that are as accurate and complete as possible when initial funding decisions are being made
  • making business cases available to the public.
The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the government.

The planned uses of the former railway land align with the objectives of encouraging people to visit and live in the city centre, creating attractive public spaces, and supporting growth in employment in the city. The transport benefits of the activities are less clear, because the light rail is the major transport project and this will not make significant improvements to transport in Newcastle.

The processes used for selling and leasing parts of the former railway land followed industry standards. Options for the former railway land were identified and assessed systematically. Competitive processes were used for most transactions and the required assessment and approval processes were followed. The sale of land to the University of Newcastle did not use a competitive process, but required processes for direct negotiations were followed.

Recommendation
By March 2019, the Hunter and Central Coast Development Corporation should:
  • work with relevant stakeholders to explore options for increasing the focus on the heritage objective of the Program in projects on the former railway land. This could include projects that recognise the cultural and industrial heritage of Newcastle.
Consultations about the urban renewal activities followed good practice standards, but consultation on transport decisions for the Program did not.

Consultations focusing on urban renewal options for the Program included a range of stakeholders and provided opportunities for input into decisions about the use of the former railway land. These consultations received mostly positive feedback from participants. Changes and additions were made to the objectives of the Program and specific projects in response to feedback received. 

There had been several decades of debate about the potential closure of the train line, including community consultations in 2006 and 2009. However, the final decision to close the train line was made and announced in 2012 without a specific community consultation. HCCDC states that consultation with industry and business representatives constitutes community consultation because industry representatives are also members of the community. This does not meet good practice standards because it is not a representative sample of the community.

There was no community consultation on the decision to build a light rail. There were subsequent opportunities for members of the community to comment on the implementation options, but the decision to build it had already been made. A community and industry consultation was held on which route the light rail should use, but the results of this were not made public. 

Recommendation
For future infrastructure programs, NSW Government agencies should consult with a wide range of stakeholders before major decisions are made and announced, and report publicly on the results and outcomes of consultations. 

The governance arrangements that were in place during the planning stages of the Program did not provide effective oversight. Project leadership and oversight improved during the implementation phase of the Program.

Multi-agency coordination and oversight were ineffective during the planning stages of the Program. Examples include: multiple versions of Program objectives being in circulation; unclear reporting lines for project management groups; and poor role definition for the initial advisory board. Program ownership was clarified in mid-2016 with the appointment of a new Program Director with clear accountability for the delivery of the Program. This was supported by the creation of a multi-agency steering committee that was more effective than previous oversight bodies.

The limitations that existed in multi-agency coordination and oversight had some negative consequences in important aspects of project management for the Program. This included whole-of-government benefits management and the coordination of work to mitigate impacts of the Program on small businesses.

Recommendations
For future infrastructure programs, NSW Government agencies should: 

  • develop and implement a benefits management approach from the beginning of a program to ensure responsibility for defining benefits and measuring their achievement is clear
  • establish whole-of-government oversight early in the program to guide major decisions. This should include:
    • agreeing on objectives and ensuring all agencies understand these
    • clearly defining roles and responsibilities for all agencies
    • establishing whole-of-government coordination for the assessment and mitigation of the impact of major construction projects on businesses and the community.

By March 2019, the Hunter and Central Coast Development Corporation should update and implement the Program Benefits Realisation Plan. This should include:

  • setting measurable targets for the desired benefits
  • clearly allocating ownership for achieving the desired benefits
  • monitoring progress toward achieving the desired benefits and reporting publicly on the results.

Appendix one - Response from agencies    

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #310 - released 12 December 2018


Matching skills training with market needs

Industry
Compliance
Internal controls and governance
Management and administration
Risk
Service delivery
Workforce and capability

The NSW Department of Industry targets subsidies towards training programs delivering skills most needed in New South Wales. However, the Department still provides subsidies for qualifications that the market may no longer need, according to a report released by Margaret Crawford, Auditor-General for New South Wales.

In 2012, governments across Australia entered into the National Partnership Agreement on Skills Reform. Under the National Partnership Agreement, the Australian Government provided incentive payments to States and Territories to move towards a more contestable Vocational Education and Training (VET) market. The aim of the National Partnership Agreement was to foster a more accessible, transparent, efficient and high quality training sector that is responsive to the needs of students and industry. 

The New South Wales Government introduced the Smart and Skilled program in response to the National Partnership Agreement. Through Smart and Skilled, students can choose a vocational course from a list of approved qualifications and training providers. Students pay the same fee for their chosen qualification regardless of the selected training provider and the government covers the gap between the student fee and the fixed price of the qualification through a subsidy paid to their training provider. 
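To illustrate the subsidy arithmetic described above (the dollar figures are hypothetical and are not drawn from the program): if the fixed price of a qualification is $8,000 and the set student fee is $2,000, the subsidy paid to the student's chosen training provider is $8,000 − $2,000 = $6,000. Because both the price and the fee are fixed for the qualification, the subsidy is the same whichever approved provider the student selects.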

Smart and Skilled commenced in January 2015, with the then Department of Education and Communities having primary responsibility for its implementation. Since July 2015, the NSW Department of Industry (the Department) has been responsible for VET in New South Wales and the implementation of Smart and Skilled. 

The NSW Skills Board, comprising nine part-time members appointed by the Minister for Skills, provides independent strategic advice on VET reform and funding. In line with most other States and Territories, the Department maintains a 'Skills List' which contains government subsidised qualifications to address identified priority skill needs in New South Wales.

This audit assessed the effectiveness of the Department in identifying, prioritising, and aligning course subsidies to the skill needs of NSW. To do this we examined whether:

  • the Department effectively identifies and prioritises present and future skill needs 
  • Smart and Skilled funding is aligned with the priority skill areas
  • skill needs and available VET courses are effectively communicated to potential participants and training providers.

Smart and Skilled is a relatively new and complex program, and is being delivered in the context of significant reform to VET nationally and in New South Wales. A large scale government funded contestable market was not present in the VET sector in New South Wales before the introduction of Smart and Skilled. This audit's findings should be considered in that context.
 

Conclusion
The Department effectively consults with industry, training providers and government departments to identify skill needs, and targets subsidies to meet those needs. However, the Department does not have a robust, data driven process to remove subsidies from qualifications which are no longer a priority. There is a risk that some qualifications are being subsidised which do not reflect the skill needs of New South Wales. 
The Department needs to better use the data it has, and collect additional data, to support its analysis of priority skill needs in New South Wales, and direct funding accordingly.
In addition to subsidising priority qualifications, the Department promotes engagement in skills training by:
  • funding scholarships and support for disadvantaged students
  • funding training in regional and remote areas
  • providing additional support to deliver some qualifications that the market is not providing.

The Department needs to evaluate these funding strategies to ensure they are achieving their goals. It should also explore why training providers are not delivering some priority qualifications through Smart and Skilled.

Training providers compete for funding allocations based on their capacity to deliver. The Department successfully manages the budget by capping funding allocated to each Smart and Skilled training provider. However, training providers have only one year of funding certainty at present. Training providers that are performing well are not rewarded with greater certainty.

The Department needs to improve its communication with prospective students to ensure they can make informed decisions in the VET market.

The Department also needs to communicate more transparently to training providers about its funding allocations and decisions about changes to the NSW Skills List. 

The NSW Skills List is unlikely to be missing high priority qualifications, but may include lower priority qualifications because the Department does not have a robust process to identify and remove these qualifications from the list. The Department needs to better use available data, and collect further data, to support decisions about which qualifications should be on the NSW Skills List.

The Department relies on stakeholder proposals to update the NSW Skills List. Stakeholders include industry, training providers and government departments. These stakeholders, particularly industry, are likely to be aware of skill needs, and have a strong incentive to propose qualifications that address these needs. The Department’s process of collecting stakeholder proposals helps to ensure that it can identify qualifications needed to address material skill needs. 

It is also important that the Department ensures the NSW Skills List only includes priority qualifications that need to be subsidised by government. The Department does not have robust processes in place to remove qualifications from the NSW Skills List. As a result, there is a risk that the list may include lower priority skill areas. Since the NSW Skills List was first created, new additions to the list have outnumbered those removed by five to one.

The Department does not always validate information gathered from stakeholder proposals, even when it has data to do so. Further, its decision making about what to include on, or delete from, the NSW Skills List is not transparent because the rationale for decisions is not adequately documented. 

The Department is undertaking projects to better use data to support its decisions about what should be on the NSW Skills List. Some of these projects should deliver useful data soon, but some can only provide useful information when sufficient trend data is available. 

Recommendation

The Department should: 

  • by June 2019, increase transparency of decisions about proposed changes to the NSW Skills List and improve record-keeping of deliberations regarding these changes
  • by December 2019, use data more effectively and consistently to ensure that the NSW Skills List only includes high priority qualifications.
The Department funds training providers that deliver qualifications on the NSW Skills List. Alignment of funding to skill needs relies on the accuracy of the NSW Skills List, which may include some lower priority qualifications.

Only qualifications on the NSW Skills List are eligible for subsidies under Smart and Skilled. As the Department does not have a robust process for removing low priority qualifications from the NSW Skills List, some low priority qualifications may be subsidised.

The Department allocates the Smart and Skilled budget through contracts with Smart and Skilled training providers. Training providers that meet contractual obligations and perform well in terms of enrolments and completion rates are rewarded with renewed contracts and more funding for increased enrolments, but these decisions are not based on student outcomes. The Department reduces or removes funding from training providers that do not meet quality standards, breach contract conditions or that are unable to spend their allocated funding effectively. Contracts are for only one year, offering training providers little funding certainty. 

Smart and Skilled provides additional funding for scholarships and for training providers in locations where the cost of delivery is high or to those that cater to students with disabilities. The Department has not yet evaluated whether this additional funding is achieving its intended outcomes. 

Eight per cent of the qualifications that have been on the NSW Skills List since 2015 are not delivered under Smart and Skilled anywhere in New South Wales. A further 14 per cent of the qualifications that are offered by training providers have had no student commencements. The Department is yet to identify the reasons that these high priority qualifications are either not offered or not taken up by students.

Recommendation

The Department should:

  • by June 2019, investigate why training providers do not offer, and prospective students do not enrol in, some Smart and Skilled subsidised qualifications 
  • by December 2019, evaluate the effectiveness of Smart and Skilled funding which supplements standard subsidies for qualifications on the NSW Skills List, to determine whether it is achieving its objectives
  • by December 2019, provide longer term funding certainty to high performing training providers, while retaining incentives for them to continue to perform well.
The Department needs to improve its communication, particularly with prospective students.

In a contestable market, it is important for consumers to have sufficient information to make informed decisions. The Department does not provide some key information to prospective VET students to support their decisions, such as measures of provider quality and examples of employment and further education outcomes of students completing particular courses. Existing information is spread across numerous channels and is not presented in a user friendly manner. This is a potential barrier to participation in VET for those less engaged with the system or less ICT literate.

The Department conveys relevant information about the program to training providers through its websites and its regional offices. However, it could better communicate some specific information directly to individual Smart and Skilled training providers, such as reasons their proposals to include new qualifications on the NSW Skills List are accepted or rejected. 

While the Department is implementing a communication strategy for VET in New South Wales, it does not have a specific communications strategy for Smart and Skilled which comprehensively identifies the needs of different stakeholders and how these can be addressed. 

Recommendation

By December 2019, the Department should develop and implement a specific communications strategy for Smart and Skilled to:

  • support prospective student engagement and informed decision making
  • meet the information needs of training providers.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #305 - released 26 July 2018


Fraud controls in local councils

Local Government
Fraud
Internal controls and governance
Management and administration
Risk

Many local councils need to improve their fraud control systems, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. The report highlights that councils often have fraud control procedures and systems in place, but are not ensuring people understand them and how they work. There is also significant variation between councils in the quality of their fraud controls.

Fraud can directly influence councils’ ability to deliver services, and undermine community confidence and trust. ICAC investigations, such as the recent Operation Ricco into the former City of Botany Bay Council, show the financial and reputational damage that major fraud can cause. Good fraud control practices are critical for councils and the community. 

The Audit Office of New South Wales 2015 Fraud Control Improvement Kit (the Kit) aligns with the Fraud and Corruption Control Standard AS8001-2008 and identifies ten attributes of an effective fraud control system. This audit used the Kit to assess how councils manage the risk of fraud. It identifies areas where fraud control can improve. 

Fraud can disrupt the delivery and quality of services and threaten the financial stability of councils.

Recent reviews of local government in Queensland and Victoria identify that councils are at risk of fraud because they purchase large quantities of goods and services using devolved decision making arrangements. The Queensland Audit Office in its 2014–15 report 'Fraud Management in Local Government' found that ‘Councils are exposed to high-risks of fraud and corruption because of the high volume of goods and services they procure, often from local suppliers; and because of the high degree of decision making vested in councils’. It also highlighted some common problems faced by councils, including the absence of fraud control plans and failure to conduct regular reviews of their internal controls. In 2008 and 2012, the Victorian Auditor-General similarly identified the importance of up-to-date fraud control planning, clearly documented fraud control policies, training staff to identify fraud risks, and controls such as third party management.

Investigations into councils by the NSW Independent Commission Against Corruption (ICAC), such as the recent Operation Ricco, show the impact that fraud can have on councils. These impacts include significant financial loss, and negative public perceptions about how well councils manage fraud. The findings of these investigations also show the importance of good fraud controls for councils.

Operation Ricco

In its report on Operation Ricco, the ICAC found that the Chief Financial Officer (CFO) of the City of Botany Bay Council and others dishonestly exercised official functions to obtain financial benefits for themselves and others by causing fraudulent payments from the Council for their benefit. It also identified the CFO received inducements for favourable treatment of contractors.

The report noted that there were overwhelming failures in the council’s procedures and governance framework that created significant opportunities for corruption, of which the CFO and others took advantage.

It found weaknesses across a wide variety of governance processes and functions, including those involving the general manager, the internal audit function, external audit, and the operation of the audit committee.

Source: Published reports of ICAC investigations July 2017.

The strength of fraud control systems varies significantly across New South Wales local councils, and many councils we surveyed need to improve significantly. 

Most surveyed councils do not have fraud control plans that direct resources to mitigating the specific fraud risks they face. Few councils reported that they conduct regular risk assessments or health checks to ensure they respond effectively to the risks they identify. 

There are sector wide weaknesses that impact on the strength of councils' fraud control practice. Less than one-third of councils that responded to the survey:

  • communicate their expectations about ethical conduct and responsibility for fraud control to staff 
  • regularly train staff to identify and respond to suspected fraud
  • inform staff or the wider community how to report suspected fraud and how reports made will be investigated.

The audit also identified a pattern of councils developing policies, procedures or systems without ensuring people understand them, or assessing that they work. This reduces the likelihood that staff will actually use them. 

In general, metropolitan and regional councils surveyed have stronger fraud control systems than rural councils. 

Newly amalgamated councils are operating with systems inherited from two or more pre-amalgamated councils. These councils are developing new systems for their changed circumstances.

Five councils surveyed reported that they did not comply with the Public Interest Disclosure Act 1994.

Observations for the sector:
Councils should improve their fraud controls by:

  • tailoring fraud control plans to their circumstances and specific risks
  • systematically and regularly reviewing their fraud risks and fraud control systems to keep their plans up to date
  • effectively communicating fraud risks, and how staff and the community can report suspected fraud 
  • ensuring that they comply with the Public Interest Disclosure Act 1994.

Recommendation:
That the Office of Local Government: 

  • work with councils to ensure they comply with the Public Interest Disclosure Act 1994.
     
Despite several New South Wales state entities collecting data on suspected fraud, the cost, extent, and nature of fraud in local councils is not clear.

There are weaknesses in data collection and categorisation. Several state entities receive complaints about councils. These entities often do not separate complaints about fraud from other complaint data, do not separate local council data from other public-sector data, and do not separate complaints about council decisions or councillors from complaints about council staff conduct. Complaints about one incidence of suspected fraud can also be reported multiple times.

Collaboration between state entities and councils to address these weaknesses in data collection could provide a clearer picture to the public and councils on the incidence of suspected fraud. Better information may also help councils decide where to focus fraud control efforts and apply resources more effectively.

Including measures for fraud control strength and maturity in the Office of Local Government (OLG) performance framework may also improve practice in councils. Further, OLG may want to consider how a revised Model Code could better drive fraud control practice in councils.
Recommendations
That the Office of Local Government:
  • work with state entities and councils to develop a common approach to how fraud complaints and incidents are defined and categorised so that they can:
    • better use data to provide a clearer picture of the level of fraud within councils
    • measure the effectiveness of, and drive improvement in, councils' fraud control systems.


Shared services in local government

Local Government
Internal controls and governance
Management and administration
Shared services and collaboration

Local councils need to properly assess the performance of their current services before considering whether to enter into arrangements with other councils to jointly manage back-office functions or services for their communities. This is one of the recommended practices for councils in a report released today by the Auditor-General for New South Wales, Margaret Crawford. ‘When councils have decided to jointly provide services, they do not always have a strong business case, which clearly identifies the expected costs, benefits and risks of shared service arrangements’, said the Auditor-General.

Councils provide a range of services to meet the needs of their communities. It is important that they consider the most effective and efficient way to deliver them. Many councils work together to share knowledge, resources and services. When done well, councils can save money and improve access to services. This audit assessed how efficiently and effectively councils engage in shared service arrangements. We define ‘shared services’ as two or more councils jointly managing activities to deliver services to communities or perform back-office functions. 

The information we gathered for this audit included a survey of all general-purpose councils in NSW. In total, 67 of the 128 councils invited to participate (52 per cent) responded to the survey. Appendix two outlines some of the results from our survey in more detail.

Conclusion
Most councils we surveyed are not efficiently and effectively engaging in shared services. This is due to three main factors. 
First, not all surveyed councils are assessing the performance of their current services before deciding on the best service delivery model. Where they have decided that sharing services is the best way to deliver services, they do not always build a business case which outlines the costs, benefits and risks of the proposed shared service arrangement before entering into it.
Second, some governance models used by councils to share services affect the scope, management and effectiveness of their shared service operations. Not all models are subject to the same checks and balances applied to councils, risking transparency and accountability. Councils must comply with legislative obligations under the Local Government Act 1993 (NSW), including principles for their day-to-day operations. When two or more councils decide to share services, they should choose the most suitable governance model in line with these obligations. 
Third, some councils we surveyed and spoke to lack the capability required to establish and manage shared service arrangements. Identifying whether sharing is the best way to deliver council services involves analysing how services are currently being delivered and building a business case. Councils also need to negotiate with partner councils and determine which governance model is fit for purpose. Planning to establish a shared service arrangement involves strong project management. Evaluating the arrangements identifies whether they are delivering to the expected outcomes. All of these tasks need a specialised skill set that councils do not always have in-house. Resources are available to support councils and to build their capability, but not all councils are seeking this out or considering their capability needs before proceeding.  
Some councils are not clearly defining the expected costs and benefits of shared service arrangements. As a result, the benefits from these arrangements cannot be effectively evaluated.

Some councils are entering into shared service arrangements without formally assessing their costs and benefits or investigating alternative service delivery models. Some councils are also not evaluating shared services against baseline data or initial expectations. Councils should base their arrangements on a clear analysis of the costs, benefits and risks involved. They should evaluate performance against clearly defined outcomes.

The decision to share a service involves an assessment of financial and non-financial costs and benefits. Non-financial benefits include being able to deliver additional services, improve service quality, and deliver regional services across councils or levels of government.

When councils need support to assess and evaluate shared service arrangements, guidance is available through organisations or by peer learning with other councils.

The governance models councils use for shared services can affect their scope and effectiveness. Some councils need to improve their project management practices to better manage issues, risks and reporting.

Shared services can operate under several possible governance models. Each governance model has different legal or administrative obligations, risks and benefits. Some arrangements can affect the scope and effectiveness of shared services. For example, some models do not allow councils to jointly manage services, requiring one council to take all risks and responsibilities. In addition, some models may reduce transparency and accountability to councils and their communities.

Regardless of these obligations and risks, councils can still improve how they manage their shared services operations by focusing on project management and better oversight. They would benefit from more guidance on shared service governance models to help them ensure they are fit for purpose.
Recommendation
The Office of Local Government should, by April 2019:

Develop guidance which outlines the risks and opportunities of governance models that councils can use to share services. This should include advice on legal requirements, transparency in decisions, and accountability for effective use of public resources.


HealthRoster benefits realisation

Health
Compliance
Information technology
Management and administration
Project management
Workforce and capability

The HealthRoster system is delivering some business benefits but Local Health Districts are yet to use all of its features, according to a report released today by the Auditor-General for New South Wales,  Margaret Crawford. HealthRoster is an IT system designed to more effectively roster staff to meet the needs of Local Health Districts and other NSW health agencies.

The NSW public health system employs over 100,000 people in clinical and non-clinical roles across the state. With increasing demand for services, it is vital that NSW Health effectively rosters staff to ensure high quality and efficient patient care, while maintaining good workplace practices to support staff in demanding roles.

NSW Health is implementing HealthRoster as its single state-wide rostering system to more effectively roster staff according to the demands of each location. Between 2013–14 and 2016–17, our financial audits of individual Local Health Districts (LHDs) reported issues with rostering and payroll processes and systems.

NSW Health grouped all LHDs, and other NSW Health organisations, into four clusters to manage the implementation of HealthRoster over four years. Refer to Exhibit 4 for a list of the NSW Health entities in each cluster.

  • Cluster 1 implementation commenced in 2014–15 and was completed in 2015–16.
  • Cluster 2 implementation commenced in 2015–16 and was completed in 2016–17.
  • Cluster 3 began implementation in 2016–17 and was underway during the conduct of the audit.
  • Cluster 4 began planning for implementation in 2017–18.

Full implementation, including capability for centralised data and reporting, is planned for completion in 2019.

This audit assessed the effectiveness of the HealthRoster system in delivering business benefits. In making this assessment, we examined whether:

  • expected business benefits of HealthRoster were well-defined
  • HealthRoster is achieving business benefits where implemented.

The HealthRoster project has a timespan from 2009 to 2019. We examined the HealthRoster implementation in LHDs, and other NSW Health organisations, focusing on the period from 2014, when eHealth assumed responsibility for project implementation, to early 2018.

Conclusion
The HealthRoster system is realising functional business benefits in the LHDs where it has been implemented. In these LHDs, financial control of payroll expenditure and rostering compliance with employment award conditions has improved. However, these LHDs are not measuring the value of broader benefits such as better management of staff leave and overtime.
NSW Health has addressed the lessons learned from earlier implementations to improve later implementations. Business benefits identified in the business case were well defined and are consistent with business needs identified by NSW Health. Three of four cluster 1 LHDs have been able to reduce the number of issues with rostering and payroll processes. LHDs in earlier implementations need to use HealthRoster more effectively to ensure they are getting all available benefits from it.
HealthRoster is taking six years longer, and costing $37.2 million more, to fully implement than originally planned. NSW Health attributes the increased cost and extended timeframe to the large scale and complexity of the full implementation of HealthRoster.

Business benefits identified for HealthRoster accurately reflect business needs.

NSW Health has a good understanding of the issues in previous rostering systems and has designed HealthRoster to adequately address these issues. Interviews with frontline staff indicate that HealthRoster facilitates rostering which complies with industrial awards. This is a key business benefit that supports the provision of quality patient care. We saw no evidence that any major business needs or issues with the previous rostering systems are not being addressed by HealthRoster.

In the period examined in this audit since 2015, NSW Health has applied appropriate project management and governance structures to ensure that risks and issues are well managed during HealthRoster implementation.

HealthRoster has had two changes to its budget and timeline. Overall, the capital cost for the project has increased from $88.6 million to $125.6 million (42 per cent) and has delayed expected project completion by four years from 2015 to 2019. NSW Health attributes the increased cost and extended time frame to the large scale and complexity of the full implementation of HealthRoster.
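As a check on the figures above, the increase in the capital budget is $125.6 million − $88.6 million = $37.0 million, and $37.0 million ÷ $88.6 million ≈ 0.42, consistent with the reported increase of around 42 per cent on the original estimate.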

NSW Health has established appropriate governance arrangements to ensure that HealthRoster is successfully implemented and that it will achieve business benefits in the long term. During implementation, local steering committees monitor risks and resolve implementation issues. Risks or issues that cannot be resolved locally are escalated to the state-wide steering committee.

NSW Health has grouped local health districts, and other NSW Health organisations, into four clusters for implementation. This has enabled NSW Health to apply lessons learnt from each implementation to improve future implementations.

NSW Health has a benefits realisation framework, but it is not fully applied to HealthRoster.

NSW Health can demonstrate that HealthRoster has delivered some functional business benefits, including rosters that comply with a wide variety of employment awards.

NSW Health is not yet measuring and tracking the value of business benefits achieved. NSW Health did not have benefits realisation plans with baseline measures defined for LHDs in cluster 1 and 2 before implementation. Without baseline measures NSW Health is unable to quantify business benefits achieved. However, analysis of post-implementation reviews and interviews with frontline staff indicate that benefits are being achieved. As a result, NSW Health now includes defining baseline measures and setting targets as part of LHD implementation planning. It has created a benefits realisation toolkit to assist this process from cluster 3 implementations onwards.

NSW Health conducted post-implementation reviews for clusters 1 and 2 and found that LHDs in these clusters were not using HealthRoster to realise all the benefits that HealthRoster could deliver.

By September 2018, NSW Health should:

  1. Ensure that Local Health Districts undertake benefits realisation planning according to the NSW Health benefits realisation framework
  2. Regularly measure benefits realised, at state and local health district levels, from the statewide implementation of HealthRoster
  3. Review the use of HealthRoster in Local Health Districts in clusters 1 and 2 and assist them to improve their HealthRoster related processes and practices.

By June 2019, NSW Health should:

  1. Ensure that all Local Health Districts are effectively using demand based rostering.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #301 - released 7 June 2018


Regional Assistance Programs

Premier and Cabinet
Planning
Transport
Compliance
Infrastructure
Management and administration
Project management

Infrastructure NSW effectively manages how grant applications for regional assistance programs are assessed and recommended for funding. Its contract management processes are also effective. However, we are unable to conclude whether the objectives of these programs have been achieved as the relevant agencies have not yet measured their benefits, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. 

In 2011, the NSW Government established Restart NSW to fund new infrastructure with the proceeds from the sale and lease of government assets. From 2011 to 2017, the NSW Government allocated $1.7 billion from the fund for infrastructure in regional areas, with an additional commitment of $1.3 billion to be allocated by 2021. The NSW Government allocates these funds through regional assistance programs such as Resources for Regions and Fixing Country Roads. NSW councils are the primary recipients of funding provided under these programs.

The NSW Government announced the Resources for Regions program in 2012 with the aim of addressing infrastructure constraints in mining affected communities. Infrastructure NSW administers the program, with support from the Department of Premier and Cabinet.

The NSW Government announced the Fixing Country Roads program in 2014 with the aim of building more efficient road freight networks. Transport for NSW and Infrastructure NSW jointly administer this program, which funds local councils to deliver projects that help connect local and regional roads to state highways and freight hubs.

This audit assessed whether these two programs (Resources for Regions and Fixing Country Roads) were being effectively managed and achieved their objectives. In making this assessment, we answered the following questions:

  • How well are the relevant agencies managing the assessment and recommendation process?
  • How do the relevant agencies ensure that funded projects are being delivered?
  • Do the funded projects meet program and project objectives?

The audit focussed on four rounds of Resources for Regions funding from 2013–14 to 2015–16, as well as the first two rounds of Fixing Country Roads funding in 2014–15 and 2015–16.

Conclusion
Infrastructure NSW effectively manages how grant applications are assessed and recommended for funding. Infrastructure NSW’s contract management processes are also effective. However, we are unable to conclude on whether program objectives are being achieved as Infrastructure NSW has not yet measured program benefits.
While Infrastructure NSW and Transport for NSW managed the assessment processes effectively overall, they have not fully maintained all required documentation, such as conflict of interest registers. Keeping accurate records is important to support transparency and accountability to the public about funding allocation. The relevant agencies have taken steps to address this in the current funding rounds for both programs.
For both programs assessed, the relevant agencies have developed good strategies over time to support councils through the application process. These strategies include workshops, briefings and feedback for unsuccessful applicants. Transport for NSW and the Department of Premier and Cabinet have implemented effective tools to assist applicants in demonstrating the economic impact of their projects.
Infrastructure NSW is effective in identifying projects that are 'at‑risk' and assists in bringing them back on track. Infrastructure NSW has a risk‑based methodology to verify payment claims, which includes elements of good practice in grants administration. For example, it requires grant recipients to provide photos and engages Public Works Advisory to review progress claims and visit project sites.
Infrastructure NSW collects project completion reports for all Resources for Regions and Fixing Country Roads funded projects. Infrastructure NSW intends to assess benefits for both programs once each project in a funding round is completed. To date, no funding round has been completed. As a result, no benefits assessment has been done for any completed project funded in either program.
 

The project selection criteria are consistent with the program objectives set by the NSW Government, and the RIAP applied the criteria consistently. Probity and record keeping practices did not fully comply with the probity plans.

The assessment methodology designed by Infrastructure NSW is consistent with the program objectives and criteria. In the rounds that we reviewed, all funded projects met the assessment criteria.

Infrastructure NSW developed probity plans for both programs which provided guidance on the record keeping required to maintain an audit trail, including the use of conflict of interest registers. Infrastructure NSW and Transport for NSW did not fully comply with these requirements. The relevant agencies have taken steps to address this in the current funding rounds for both programs.

NSW Procurement Board Directions require agencies to ensure that they do not engage a probity advisor that is engaged elsewhere in the agency. Infrastructure NSW has not fully complied with this requirement. A conflict of interest arose when Infrastructure NSW engaged the same consultancy to act as its internal auditor and probity advisor.

While these infringements of probity arrangements are unlikely to have had a major impact on the assessment process, they weaken the transparency and accountability of the process.

Some councils have identified resourcing and capability issues which impact on their ability to participate in the application process. For both programs, the relevant agencies conducted briefings and webinars with applicants to provide advice on the objectives of the programs and how to improve the quality of their applications. Additionally, Transport for NSW and the Department of Premier and Cabinet have developed tools to assist councils to demonstrate the economic impact of their applications.

The relevant agencies provided feedback on unsuccessful applications to councils. Councils reported that the quality of this feedback has improved over time.

Recommendations

  1. By June 2018, Infrastructure NSW should:
    • ensure probity reports address whether all elements of the probity plan have been effectively implemented.
  2. By June 2018, Infrastructure NSW and Transport for NSW should:
    • maintain and store all documentation regarding assessment and probity matters according to the State Records Act 1998, the NSW Standard on Records Management and the relevant probity plans.

Infrastructure NSW is responsible for overseeing and monitoring projects funded under Resources for Regions and Fixing Country Roads. Infrastructure NSW effectively manages projects to keep them on track; however, it could do more to assure itself that all recipients have complied with funding deeds. Benefits and outcomes should also start to be measured and reported as soon as practicable after projects are completed to inform assessment of future projects.

Infrastructure NSW identifies projects experiencing unreasonable delays or higher than expected expenses as 'at‑risk'. After Infrastructure NSW identifies a project as 'at‑risk', it puts in place processes to resolve issues and bring the project back on track. Infrastructure NSW, working with Public Works Advisory regional offices, employs a risk‑based approach to validate payment claims; however, this process should be strengthened. Infrastructure NSW would get better assurance by also conducting annual audits of compliance with the funding deed for a random sample of projects.

Infrastructure NSW collects project completion reports for all Resources for Regions and Fixing Country Roads funded projects. It applies the Infrastructure Investor Assurance Framework to Resources for Regions and Fixing Country Roads at a program level. This means that each round of funding (under both programs) is treated as a distinct program for the purposes of benefits realisation. It plans to assess whether benefits have been realised once each project in a funding round is completed. As a result, no benefits realisation assessment has been done for any project funded under either Resources for Regions or Fixing Country Roads. Without project‑level benefits realisation, future decisions are not informed by the lessons from previous investments.

Recommendations

  1. By December 2018, Infrastructure NSW should:
    • conduct annual audits of compliance with the funding deed for a random sample of projects funded under Resources for Regions and Fixing Country Roads
    • publish the circumstances under which unspent funds can be allocated to changes in project scope
    • measure benefits delivered by projects that were completed before December 2017
    • implement an annual process to measure benefits for projects completed after December 2017
  2. By December 2018, Transport for NSW and Infrastructure NSW should:
    • incorporate a benefits realisation framework as part of the detailed application.


Grants to non-government schools

Education
Compliance
Internal controls and governance
Management and administration

The NSW Department of Education could strengthen its management of the $1.2 billion provided to non-government schools annually. This would provide greater accountability for the use of public funds, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

Non‑government schools educate 418,000 school children each year, representing 35 per cent of all students in NSW. The NSW Department of Education administers several grant schemes to support these schools, with the aim of improving student learning outcomes and supporting parent choice. To be eligible for NSW Government funding, non‑government schools must be registered with the NSW Education Standards Authority (NESA) and not operate 'for profit' as per section 83C of the NSW Education Act 1990 (the Act). Non‑government schools can either be registered as independent or part of a System Authority.

In 2017–18, non‑government schools in NSW will receive over $1.2 billion from the NSW Government, as well as $3.4 billion from the Australian Government. Recently, the Australian Government has changed the way it funds schools. The NSW Government is assessing how these changes will impact State funding for non‑government schools.

This audit assessed how effectively and efficiently NSW Government grants to non‑government schools are allocated and managed. This audit did not assess the use of NSW Government grants by individual non‑government schools or System Authorities because the Auditor‑General of New South Wales does not have the mandate to assess how government funds are spent by non‑government entities.

Conclusion

The Department of Education effectively and efficiently allocates grants to non‑government schools. Clarifying the objectives of grants, monitoring progress towards these objectives, and improving oversight, would strengthen accountability for the use of public funds by non‑government schools.

We tested a sample of grants provided to non‑government schools under all major schemes, and found that the Department of Education consistently allocates and distributes grants in line with its methodology. The Department has clear processes and procedures to efficiently collect data from schools, calculate the level of funding each school or System should receive, obtain appropriate approvals, and make payments.

We identified three areas where the Department could strengthen its management of grants to provide greater accountability for the use of public funds. First, the Department’s objectives for providing grants to non‑government schools are covered by legislation, intergovernmental agreements and grant guidelines. The Department could consolidate these objectives to allow for more consistent monitoring. Second, the Department relies on schools or System Authorities to engage a registered auditor to certify the accuracy of information on their enrolments and usage of grants. Greater scrutiny of the registration and independence of the auditors would increase confidence in the accuracy of this information. Third, the Department does not monitor how System Authorities reallocate grant funding to their member schools. Further oversight in this area would increase accountability for the use of public funds.

The Department effectively and efficiently allocates grants to non‑government schools. Strengthening its processes would provide greater assurance that the information it collects is accurate.

The Department provides clear guidelines to assist schools to supply the census information needed to calculate per capita grants. Schools must engage an independent external auditor, registered with ASIC, to certify their enrolment figures. The Department checks a sample of these auditors to confirm that they are registered with ASIC. Some other jurisdictions perform additional procedures to increase confidence in the accuracy of the census (for example, independently checking a sample of schools' census data).

The Department accurately calculates and distributes per capita grants in accordance with its methodology. The previous methodology, used prior to 2018, was not updated frequently enough to reflect changes in schools' circumstances. Over 2014 to 2017, the Department provided additional grants to non‑government schools under the National Education Reform Agreement (NERA), to bring funding more closely in line with the Australian Department of Education and Training's Schooling Resource Standard (SRS). From 2018, the Department has changed the way it calculates per capita grants to more closely align with the Australian Department of Education and Training's approach.

The Department determines eligibility for grants by checking a school's registration status with NESA. However, NESA's approach to monitoring compliance with the registration requirements prioritises student learning and wellbeing requirements over the requirement for policies and procedures for proper governance. Given their importance to the appropriate use of government funding, NESA could increase its monitoring of policies and procedures for proper governance through its program of random inspections. Further, the Department and NESA should enter into a formal agreement to share information to more accurately determine the level of risk of non‑compliance at each school. This may help both agencies more effectively target their monitoring to higher‑risk schools.

By December 2018, the NSW Department of Education should:

  1. Strengthen its processes to provide greater assurance that the enrolment and expenditure information it collects from non‑government schools is accurate. This should build on the work the Australian Government already does in this area.
  2. Establish formal information‑sharing arrangements with the NSW Education Standards Authority to more effectively monitor schools' eligibility to receive funding.

By December 2018, the NSW Education Standards Authority should:

  1. Extend its inspection practices to increase coverage of the registration requirement for policies and procedures for the proper governance of schools.
  2. Establish formal information‑sharing arrangements with the NSW Department of Education to more effectively monitor schools' continued compliance with the registration requirements.

The Department’s current approach to managing grants to non‑government schools could be improved to provide greater confidence that funds are being spent in line with the objectives of the grant schemes.

The NSW Government provides funding to non‑government schools to improve student learning outcomes and to support schooling choices by parents, but does not monitor whether these grants are achieving this. In addition, each grant program has specific objectives. The main objectives of the per capita grant program are to increase the rate of students completing Year 12 (or equivalent) and to improve education outcomes for students. While non‑government schools publicly report on some educational measures via the MySchool website, these measures do not address all of the objectives. Strengthened monitoring and reporting of progress towards objectives, at a school level, would increase accountability for public funding. This may require the Department to formalise its access to student‑level information.

The Department has listed five broad categories of acceptable use for per capita grants; however, it provides no further guidance on what expenditure fits into these categories. Clarifying the appropriate use of grants would increase confidence that funding is being used as intended. Schools must engage an independent auditor, registered with ASIC, to certify that the funding has been spent. The Department could strengthen this approach by improving its processes to check the registration of the auditor and to verify their independence.

The Department has limited oversight of funding provided to System Authorities (Systems). The Department provides grants to Systems for all their member schools, and Systems can distribute the grants to their schools according to their own methodology. Systems are not required to report to the Department how much of their grant is retained for administrative or centralised expenses. Greater oversight of how Systems distribute these grants would improve transparency over the use of public funds.

By December 2018, the NSW Department of Education should:

  1. Establish and communicate funding conditions that require funded schools to:
    • adhere to conditions of funding, such as the acceptable use of grants, and accounting requirements to demonstrate compliance
    • report their progress towards the objectives of the scheme or wider Government initiatives
    • allow the Department to conduct investigations to verify enrolment and expenditure of funds
    • provide the Department with access to existing student level data to inform policy development and analysis.
  2. Increase its oversight of System Authorities by requiring them to:
    • re‑allocate funds across their system on a needs basis, and report to the Department on this
    • provide a yearly submission with enough detail to demonstrate that each System school has spent its State funding in line with the Department's requirements.

Published

Actions for Managing risks in the NSW public sector: risk culture and capability

Managing risks in the NSW public sector: risk culture and capability

Finance
Health
Justice
Treasury
Internal controls and governance
Management and administration
Risk
Workforce and capability

The Ministry of Health, NSW Fair Trading, NSW Police Force, and NSW Treasury Corporation are taking steps to strengthen their risk culture, according to a report released today by the Auditor-General, Margaret Crawford. 'Senior management communicates the importance of managing risk to their staff, and there are many examples of risk management being integrated into daily activities', the Auditor-General said.

However, we found that three of the agencies we examined could strengthen their culture so that all employees feel comfortable speaking openly about risks. To support innovation, senior management could also do better at communicating to their staff the levels of risk they are willing to accept.

Effective risk management is essential to good governance, and supports staff at all levels to make informed judgements and decisions. At a time when government is encouraging innovation and exploring new service delivery models, effective risk management is about seizing opportunities as well as managing threats.

Over the past decade, governments and regulators around the world have increasingly turned their attention to risk culture. It is now widely accepted that organisational culture is a key element of risk management because it influences how people recognise and engage with risk. Neglecting this ‘soft’ side of risk management can prevent institutions from managing risks that threaten their success and lead to missed opportunities for change, improvement or innovation.

This audit assessed how effectively NSW Government agencies are building risk management capabilities and embedding a sound risk culture throughout their organisations. To do this we examined whether:

  • agencies can demonstrate that senior management is committed to risk management
  • information about risk is communicated effectively throughout agencies
  • agencies are building risk management capabilities.

The audit examined four agencies: the Ministry of Health, the NSW Fair Trading function within the Department of Finance, Services and Innovation, NSW Police Force and NSW Treasury Corporation (TCorp). NSW Treasury was also included as the agency responsible for the NSW Government's risk management framework.

Conclusion
All four agencies examined in the audit are taking steps to strengthen their risk culture. In these agencies, senior management communicates the importance of managing risk to their staff. They have risk management policies and funded central functions to oversee risk management. We also found many examples of risk management being integrated into daily activities.
That said, three of the four case study agencies could do more to understand their existing risk culture. As good practice, agencies should monitor their employees' attitudes to risk. Without a clear understanding of how employees identify and engage with risk, it is difficult to tell whether the 'tone' set by the executive and management is aligned with employee behaviours.
Our survey of risk culture found that three agencies could strengthen a culture of open communication, so that all employees feel comfortable speaking openly about risks. To support innovation, senior management could also do better at communicating to their staff the levels of risk they are willing to accept.
Some agencies are performing better than others in building their risk capabilities. Three case study agencies have reviewed the risk-related skills and knowledge of their workforce, but only one agency has addressed the gaps the review identified. In three agencies, staff also need more practical guidance on how to manage risks that are relevant to their day-to-day responsibilities.
NSW Treasury provides agencies with direction and guidance on risk management through policy and guidelines. Its principles-based approach to risk management is consistent with better practice. Nevertheless, there is scope for NSW Treasury to develop additional practical guidance and tools to support a better risk culture in the NSW public sector. NSW Treasury should encourage agency heads to form a view on the current risk culture in their agencies, identify desirable changes to that risk culture, and take steps to address those changes. 

In assessing an agency’s risk culture, we focused on four key areas:

Executive sponsorship (tone at the top)

In the four agencies we reviewed, senior management is communicating the importance of managing risk. They have endorsed risk management frameworks and funded central functions tasked with overseeing risk management within their agencies.

That said, we found that three case study agencies do not measure their existing risk culture. Without clear measures of how employees identify and engage with risk, it is difficult for agencies to tell whether employees' behaviours are aligned with the 'tone' set by the executive and management.

For example, in some agencies we examined we found a disconnect between risk tolerances espoused by senior management and how these concepts were understood by staff.

Employee perceptions of risk management

Our survey of staff indicated that while senior leaders have communicated the importance of managing risk, more could be done to strengthen a culture of open communication so that all employees feel comfortable speaking openly about risks. We found that senior management could better communicate to their staff the levels of risk they should be willing to accept.

Integration of risk management into daily activities and links to decision-making

We found examples of risk management being integrated into daily activities. On the other hand, we also identified areas where risk management deviated from good practice. For example, we found that corporate risk registers are not consistently used as a tool to support decision-making.

Support and guidance to help staff manage risks

Most case study agencies are monitoring risk-related skills and knowledge of their workforce, but only one agency has addressed the gaps it identified. While agencies are providing risk management training, surveyed staff in three case study agencies reported that risk management training is not adequate.

NSW Treasury provides agencies with direction and guidance on risk management through policy and guidelines. In line with better practice, NSW Treasury's principles-based policy acknowledges that individual agencies are in a better position to understand their own risks and design risk management frameworks that address those risks. Nevertheless, there is scope for NSW Treasury to refine its guidance material to support a better risk culture in the NSW public sector.

Recommendation

By May 2019, NSW Treasury should:

  • Review the scope of its risk management guidance, and identify additional guidance, training or activities to improve risk culture across the NSW public sector. This should focus on encouraging agency heads to form a view on the current risk culture in their agencies, identify desirable changes to that risk culture, and take steps to address those changes.

Published

Actions for Council reporting on service delivery

Council reporting on service delivery

Local Government
Compliance
Internal controls and governance
Management and administration
Service delivery

New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report on activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also do not report on the cost of services, making it difficult for communities to see how efficiently services are being delivered. And councils are not consistently publishing targets to demonstrate what they are striving for.

I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.

My new mandate supports the Parliament’s objectives to:

  • strengthen governance and financial oversight in the local government sector
  • improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.

Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.

For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.

In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.

Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.

Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in decisions about how and to whom they provide services, so it is important that local communities have access to information about how well services are being delivered and how well they are meeting community needs. Ultimately, councils should aim to ensure that their performance reporting is subject to quality controls designed to provide independent assurance.

Conclusion
While councils report on outputs, their reporting on outcomes and on performance over time can be improved. Better reporting would include objectives with targets that demonstrate progress over time. This would help communities understand what services are being delivered, how efficiently and effectively they are being delivered, and what improvements are being made.
To ensure greater transparency on service effectiveness and efficiency, the Office of Local Government (OLG) should work with councils to develop guidance principles to improve reporting on service delivery to local communities. This audit identified an interest amongst councils in improving their reporting and broad agreement with the good practice principles developed as part of the audit.
The Integrated Planning and Reporting Framework (the Framework), which councils are required to use to report on service delivery, is intended to promote better practice. However, the Framework is silent on efficiency reporting and provides limited guidance on how long-term strategic documents link with annual reports produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.
OLG should also work with state agencies to reduce the overall reporting burden on councils by consolidating state agency reporting requirements. 

Councils report extensively on what they have done, but minimally on the outcomes of that effort, their efficiency, and their performance over time.

Councils could improve reporting on service delivery by more clearly relating the resources needed with the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.

Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.

The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.

Recommendation

By mid-2018, OLG should:

  • assist rural councils to develop their reporting capability.

The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery, and provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

Recommendation

By mid-2018, OLG should:

  • issue additional guidance on good practice in council reporting, with specific information on:
    • reporting on performance against targets
    • reporting on performance against outcome
    • assessing and reporting on efficiency and cost effectiveness
    • reporting performance over time
    • clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
    • defining reporting terms to encourage consistency.

The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports

The guidelines produced by OLG in 2013 to assist councils to implement the Framework requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This was identified as a weakness in both the 2012 performance audit report and the Local Government Reform Panel's review of reporting by councils on service delivery.

The Framework and supporting documents provide limited guidance on reporting

Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:

  • good practice in reporting to the community
  • how the annual report links with other plans and reports required by the Framework.

Further, the Framework includes both Annual Reports and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the Annual Reports produced during a council's term. The relationship between Annual Reports and End of Term Reports is not clear.

OLG is reviewing the Framework and guidance

OLG commenced a review of the Framework in 2013, but this was deferred, with work restarting in 2017. The revised guidelines and manual were expected to be released in late 2017.

OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports

The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.

Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.

Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.

Recommendations

By mid-2018, OLG should:

  • commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
  • progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.

Streamlining the reporting burden would help councils improve reporting

The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 draft IPART report, ‘Review of reporting and compliance burdens on Local Government’, noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.

Enabling rural councils to perform tailored surveys of their communities may be more beneficial than a state-wide survey in defining outcome indicators

Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results from these are used by councils to set service delivery targets and report on outcomes. This helps to drive service delivery in line with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils which generally have the resources needed to run them.

OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn on a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state-level survey is not sufficiently detailed or specific to be used as a tool for setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in those areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. Such surveys could potentially be achieved through cooperation between groups of similar councils or regional groupings.

Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities

The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.

OLG commenced work in 2012 to build a new performance measurement framework for councils, which aimed to move away from compliance reporting. This work was strongly influenced by the approach used in Victoria, which requires councils to report on a set of 79 indicators published on the Victorian 'Know your council' website. OLG's work did not progress fully at the time, and several other local government representative bodies have since commenced work to establish performance measurement frameworks. OLG advised us it has recently recommenced its work on this project.

Our consultation identified some desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishment of a set of indicators that councils can use to measure performance and drive improvement in service delivery.

Several councils we spoke with were concerned that current approaches to comparative reporting did not adequately acknowledge that councils need to tailor the type, level and mix of their services to the needs of their community. Comparative reporting approaches tend to focus on output measures, such as the number of applications processed, annual library loans and opening hours for sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where councils have made a decision based on community consultation, local priorities and available resources. To mitigate this, it is important to:

  • adopt a partnership approach to the development of indicators
  • ensure indicators measure performance, not just level of activity
  • compare performance between councils that are similar in terms of size and location.

It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.

Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:

  • consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
  • adopting a partnership approach to development of common indicators with groups of like councils.

Published

Actions for The Impact of the Raised School Leaving Age

The Impact of the Raised School Leaving Age

Education
Management and administration
Service delivery

The Department monitors the attendance of all students who remain enrolled at government schools, and responds when these students fail to attend. Of the young people who have been granted an exemption from attending school, the Department monitors apprentices, trainees and those completing the equivalent of Year 10 of secondary education at TAFE. However, the Department does not monitor young people who, after Year 10, are in full-time work or vocational education programs until they turn 17 years of age. Under the law, it is a parent's responsibility to make sure that a child attends school or participates in an approved alternative activity until they turn 17 years of age.


Parliamentary reference - Report number #226 - released 1 November 2012