Transport 2018
The Auditor-General for New South Wales, Margaret Crawford, released her report today on key observations and findings from the 30 June 2018 financial statement audits of agencies in the Transport cluster. Unqualified audit opinions were issued for all agencies' financial statements. However, assessing the fair value of the broad range of transport-related assets creates challenges.
This report analyses the results of our audits of financial statements of the Transport cluster for the year ended 30 June 2018. The table below summarises our key observations.
This report provides Parliament and other users of the Transport cluster’s financial statements with the results of our audits, our observations, analysis, conclusions and recommendations in the following areas:
- financial reporting
- audit observations.
Financial reporting is an important element of good governance. Confidence and transparency in public sector decision making are enhanced when financial reporting is accurate and timely.
This chapter outlines our audit observations related to the financial reporting of agencies in the Transport cluster for 2018.
Observation | Conclusions and recommendations |
---|---|
2.1 Quality of financial reporting | |
Unqualified audit opinions were issued for all agencies' financial statements. | Sufficient audit evidence was obtained to conclude the financial statements were free of material misstatement. |
2.2 Key accounting issues | |
Valuation of assets continues to create challenges. Although agencies complied with the requirements of the accounting standards and Treasury policies on valuations, we identified some opportunities for improvement at RMS. | RMS incorporated data from its asset condition assessments into the valuation methodology for the first time, which improved the valuation outcome. Overall, we were satisfied with the valuation methodology and key assumptions, but we noted some deficiencies in the asset data relating to asset component unit rates and old condition data for some asset components. In addition, a bypass and a tunnel had been incorrectly excluded from RMS records and the valuation process since 2013; correcting this increased the value of these assets by $133 million. The valuation inputs for Wetlands and Moorings were revised this year to better reflect the assets' characteristics, resulting in a $98.0 million increase. |
2.3 Timeliness of financial reporting | |
Residual Transport Corporation did not submit its financial statements by the statutory reporting deadline. | Residual Transport Corporation remained a dormant entity with no transactions for the year ended 30 June 2018. |
With the exception of Residual Transport Corporation, all agencies completed early close procedures and submitted financial statements within statutory timeframes. | Early close procedures allow financial reporting issues and risks to be addressed early in the reporting and audit process. |
2.4 Financial sustainability | |
NSW Trains and the Chief Investigator of the Office of Transport Safety Investigations reported negative net assets of $75.7 million and $89,000 respectively at 30 June 2018. | NSW Trains and the Chief Investigator of the Office of Transport Safety Investigations continue to require letters of financial support to confirm their ability to pay liabilities as they fall due. |
2.5 Passenger revenue and patronage | |
Transport agencies' revenue increased at a higher rate than patronage. | Public transport passenger revenue increased by $114 million (8.3 per cent) in 2017–18, and patronage increased by 37.1 million (5.1 per cent) across all modes of transport, based on data provided by TfNSW. |
Negative balance Opal cards resulted in $3.8 million in revenue not collected in 2017–18, and $7.8 million since the introduction of Opal. A total of 1.1 million Opal cards issued since the card's introduction have negative balances. | Transport for NSW advised it is liaising with the ticketing vendor to implement system changes and is investigating other ways to reduce these occurrences. |
2.6 Cost recovery from public transport users | |
Overall cost recovery from users has decreased. | Overall cost recovery from public transport users (on rail services and on STA bus services) decreased from 23.2 per cent to 22.4 per cent between 2016–17 and 2017–18, mainly because expenditure increased at a faster rate than revenue in 2017–18. |
Appropriate financial controls help ensure the efficient and effective use of resources and administration of agency policies. They are essential for quality and timely decision making.
This chapter outlines our observations and insights from:
- our financial statement audits of agencies in the Transport cluster for 2018
- the areas of focus identified in the Audit Office annual work program.
The Audit Office Annual Work Program provides a summary of all audits to be conducted within the proposed time period as well as detailed information on the areas of focus for each of the NSW Government clusters.
Observation | Conclusions and recommendations |
---|---|
3.1 Internal controls | |
There was an increase in findings on internal controls across the Transport cluster. | Key themes related to information technology, employee leave entitlements and asset management. Eighteen per cent of all issues were repeat issues. |
3.2 Audit Office Annual Work Program | |
The Transport cluster wrote off over $200 million of assets which were replaced by new assets or technology. | The majority of this write-off was recognised by RMS, with $199 million relating to the write-off of existing assets which were replaced during the year. |
RailCorp is expected to convert to TAHE from 1 July 2019. | Several working groups are considering different aspects of the TAHE transition, including its status as a for-profit Public Trading Enterprise and which assets to transfer to TAHE. We will continue to monitor developments on TAHE for any impact on the financial statements. |
RMS' estimated maintenance backlog of $3.4 billion at 30 June 2018 is lower than last year's. Sydney Trains' estimated maintenance backlog at 30 June 2018 increased by 20.6 per cent to $434 million. TfNSW does not quantify its maintenance backlog. | TfNSW advised it is liaising with Infrastructure NSW to develop a consistent definition of maintenance backlog across all transport service providers. |
Not all agencies monitor unplanned maintenance across the Transport cluster. | Unplanned maintenance can be more expensive than planned maintenance. TfNSW should develop a consistent approach to define, monitor and track unplanned maintenance across the cluster. |
This chapter outlines certain service delivery outcomes for 2017–18. The data on activity levels and performance is provided by cluster agencies. The Audit Office does not have a specific mandate to audit performance information. Accordingly, the information in this chapter is unaudited.
We report this information on service delivery to provide additional context to understand the operations of the Transport cluster and to collate and present service information for different modes of transport in one report.
In our recent performance audit, Progress and measurement of Premier's Priorities, we identified 12 limitations of performance measurement and performance data. We recommended that the Department of Premier and Cabinet ensure that processes to check and verify data are in place for all agency data sources.
Mobile speed cameras
The primary goal of speed cameras is to reduce speeding and make the roads safer. Our 2011 performance audit on speed cameras found that, in general, speed cameras change driver behaviour and have a positive impact on road safety.
Transport for NSW published the NSW Speed Camera Strategy in June 2012 in response to our audit. According to the Strategy, the main purpose of mobile speed cameras is to reduce speeding across the road network by providing a general deterrence through anywhere, anytime enforcement and by creating a perceived risk of detection across the road network. Fixed and red-light speed cameras aim to reduce speeding at specific locations.
Roads and Maritime Services and Transport for NSW deploy mobile speed cameras (MSCs) in consultation with NSW Police. The cameras are operated by contractors authorised by Roads and Maritime Services. MSC locations are stretches of road that can be more than 20 kilometres long. MSC sites are specific places within these locations that meet the requirements for an MSC vehicle to operate.
This audit assessed whether the mobile speed camera program is effectively managed to maximise road safety benefits across the NSW road network.
The mobile speed camera program requires improvements to key aspects of its management to maximise road safety benefits. While camera locations have been selected based on crash history, the limited number of locations restricts network coverage. It also makes enforcement more predictable, reducing the ability to provide a general deterrence. Implementation of the program has been consistent with government decisions to limit its hours of operation and use multiple warning signs. These factors limit the ability of the mobile speed camera program to effectively deliver a broad general network deterrence from speeding.
Many locations are needed to enable network-wide coverage and ensure MSC sessions are randomised and not predictable. However, there are insufficient locations available to operate MSCs that meet strict criteria for crash history, operator safety, signage and technical requirements. MSC performance would be improved if there were more locations.
A scheduling system is meant to randomise MSC location visits to ensure they are not predictable. However, a relatively small number of locations have been visited many times making their deployment more predictable in these places. The allocation of MSCs across the time of day, day of week and across regions is prioritised based on crash history but the frequency of location visits does not correspond with the crash risk for each location.
There is evidence of a reduction in fatal and serious crashes at the 30 best-performing MSC locations. However, there is limited evidence that the current MSC program in NSW has led to a behavioural change in drivers by creating a general network deterrence. While the overall reduction in serious injuries on roads has continued, fatalities have started to climb again. Compliance with speed limits has improved at the sites and locations where MSCs operate, but the results of overall network speed surveys vary, with recent improvements in some speed zones but not others.
There is no supporting justification for the number of hours of operation for the program. The rate of MSC enforcement (hours per capita) in NSW is lower than in Queensland and Victoria. The government decision to use multiple warning signs has made it harder to identify and maintain suitable MSC locations, and has impeded their use for enforcement in both traffic directions and in school zones.
Appendix one - Response from agency
Appendix two - About the audit
Appendix three - Performance auditing
Parliamentary reference - Report number #308 - released 18 October 2018
Members' Additional Entitlements 2017
Progress and measurement of the Premier's Priorities
The Premier’s Implementation Unit uses a systematic approach to measuring and reporting progress towards the Premier’s Priorities performance targets, but public reporting needed to improve, according to a report released today by the Auditor-General of NSW, Margaret Crawford.
The Premier of New South Wales has established 12 Premier’s Priorities. These are key performance targets for government.
The 12 Premier's Priorities
Source: Department of Premier and Cabinet, Premier’s Priorities website.
Each Premier’s Priority has a lead agency and minister responsible for achieving the performance target.
The Premier’s Implementation Unit (PIU) was established within the Department of Premier and Cabinet (DPC) in 2015. The PIU is a delivery unit that supports agencies to measure and monitor performance, make progress toward the Premier’s Priorities targets, and report progress to the Premier, key ministers and the public.
This audit assessed how effectively the NSW Government is progressing and reporting on the Premier's Priorities.
The Premier’s Implementation Unit (PIU) is effective in assisting agencies to make progress against the Premier’s Priorities targets. Progress reporting is regular, but transparency to the public is weakened by a lack of information about specific measurement limitations and a lack of clarity about the relationship of the targets to broader government objectives.
The PIU promotes a systematic approach to measuring performance and reporting progress towards the Premier's Priorities performance targets. Public reporting would be improved with additional information about the rationale for choosing specific targets to report on broader government objectives. The data used to measure the Premier’s Priorities comes from a variety of government and external datasets, some of which have known limitations. These limitations are not revealed in public reporting, and only some are revealed in progress reported to the Premier and ministers. This limits the transparency of reporting.
The PIU assists agencies to avoid unintended outcomes that can arise from prioritising particular performance measures over other areas of activity. The PIU has adopted a collaborative approach to assisting agencies to analyse performance using data, and helping them work across organisational silos to achieve the Premier’s Priorities targets.
Data used to measure progress for some of the Premier’s Priorities has limitations which are not made clear when progress is reported. This reduces transparency about the reported progress. Public reporting would also be improved with additional information about the relationship between specific performance measures and broader government objectives.
The PIU is responsible for reporting progress to the Premier, key ministers and the public. Agencies provide performance data and some play a role in preparing progress reports for the Premier and ministers. For 11 of the Premier's Priorities, progress is reported against measurable and time-related performance targets. For the infrastructure priority, progress is reported against project milestones.
Progress of some Priorities is measured using data that has known limitations, which should be noted wherever progress is reported. For example, the data used to report on housing completions does not take housing demolitions into account, and therefore overstates the contribution of this performance measure to housing supply. This known limitation is not explained in progress reports or on the public website.
Data used to measure progress is sourced from a mix of government and external datasets. Updated progress data for most Premier’s Priorities is published on the Premier’s Priorities website annually, although reported to the Premier and key ministers more frequently. The PIU reviews the data and validates it through fieldwork with front line agencies. The PIU also assists agencies to avoid unintended outcomes that can arise from prioritising single performance measures. Most, but not all, agencies use additional indicators to check for misuse of data or perverse outcomes.
We examined the reporting processes and controls for five of the Premier’s Priorities. We found that there is insufficient assurance over the accuracy of the data on housing approvals.
The relationship between performance measures and broader government objectives is not always clearly explained on the Premier’s Priorities website, which is the key source of public information about the Premier’s Priorities. For example, the Premier’s Priority to reduce litter volumes is communicated as “Keeping our Environment Clean.” While the website explains why reducing litter is important, it does not clearly explain why that particular target has been chosen to measure progress in keeping the environment clean.
By December 2018, the Department of Premier and Cabinet should:
- improve transparency of public reporting by:
  - providing information about limitations of reported data and associated performance
  - clarifying the relationship between the Premier’s Priorities performance targets and broader government objectives
- ensure that processes to check and verify data are in place for all agency data sources
- encourage agencies to develop and implement additional supporting indicators for all Premier’s Priority performance measures to prevent and detect unintended consequences or misuse of data.
The Premier's Implementation Unit is effective in supporting agencies to deliver progress towards the Premier’s Priority targets.
The PIU promotes a systematic approach to monitoring and reporting progress against a target, based on a methodology used in delivery units elsewhere in the world. The PIU undertakes internal self-evaluation, and commissions regular reviews of methodology implementation from the consultancy that owns the methodology and helped to establish the PIU. However, the unit lacks periodic independent reviews of its overall effectiveness. The PIU has adopted a collaborative approach and assists agencies to analyse performance using data, and to work across organisational silos to achieve the Premier’s Priorities targets.
Agency representatives recognise the benefits of being responsible for a Premier's Priority and speak of the value of being held to account and having the attention of the Premier and senior ministers.
By June 2019, the Department of Premier and Cabinet should:
- establish routine collection of feedback about PIU performance, including:
  - independent assurance of PIU performance
  - opportunity for agencies to provide confidential feedback.
Appendix one - Response from agency
Appendix three - About the audit
Appendix four - Performance auditing
Parliamentary reference - Report number #307 - released 13 September 2018
Managing Antisocial behaviour in public housing
The Department of Family and Community Services (FACS) has not adequately supported or resourced its staff to manage antisocial behaviour in public housing, according to a report released today by the Deputy Auditor-General for New South Wales, Ian Goodwin.
In recent decades, policy makers and legislators in Australian states and territories have developed and implemented initiatives to manage antisocial behaviour in public housing environments. All jurisdictions now have some form of legislation or policy to encourage public housing tenants to comply with rules and obligations of ‘good neighbourliness’. In November 2015, the NSW Parliament changed legislation to introduce a new approach to manage antisocial behaviour in public housing. This approach is commonly described as the ‘strikes’ approach.
When introduced in the NSW Parliament, the ‘strikes’ approach was described as a means to:
- improve the behaviour of a minority of tenants engaging in antisocial behaviour
- create better, safer communities for law abiding tenants, including those who are ageing and vulnerable.
FACS has a number of tasks as a landlord, including a responsibility to collect rent and organise housing maintenance. FACS also has a role to support tenants with complex needs and to manage antisocial behaviour. These roles have some inherent tensions. The aim of the FACS antisocial behaviour management policy is:
to balance the responsibilities of tenants, the rights of their neighbours in social housing, private residents and the broader community with the need to support tenants to sustain their public housing tenancies.
This audit assessed the efficiency and effectiveness of the ‘strikes’ approach to managing antisocial behaviour in public housing environments.
We examined whether:
- the approach is being implemented as intended and leading to improved safety and security in social housing environments
- FACS and its partner agencies have the capability and capacity to implement the approach
- there are effective mechanisms to monitor, report and progressively improve the approach.
Conclusion
FACS has not adequately supported or resourced its staff to implement the antisocial behaviour policy. FACS antisocial behaviour data is incomplete and unreliable. Accordingly, there is insufficient data to determine the nature and extent of the problem, and whether the implementation of the policy is leading to improved safety and security. FACS management of minor and moderate incidents of antisocial behaviour is poor. FACS has not dedicated sufficient training to equip frontline housing staff with the relevant skills to apply the antisocial behaviour management policy. At more than half of the housing offices we visited, staff had not received this training.
When frontline housing staff are informed about serious and severe illegal antisocial behaviour incidents, they generally refer them to the FACS Legal Division. Staff in the Legal Division are trained and proficient in managing antisocial behaviour in compliance with the policy; therefore, the more serious incidents are managed effectively using HOMES ASB.
Parliamentary reference - Report number #306 - released 10 August 2018
Matching skills training with market needs
In 2012, governments across Australia entered into the National Partnership Agreement on Skills Reform. Under the National Partnership Agreement, the Australian Government provided incentive payments to States and Territories to move towards a more contestable Vocational Education and Training (VET) market. The aim of the National Partnership Agreement was to foster a more accessible, transparent, efficient and high quality training sector that is responsive to the needs of students and industry.
The New South Wales Government introduced the Smart and Skilled program in response to the National Partnership Agreement. Through Smart and Skilled, students can choose a vocational course from a list of approved qualifications and training providers. Students pay the same fee for their chosen qualification regardless of the selected training provider and the government covers the gap between the student fee and the fixed price of the qualification through a subsidy paid to their training provider.
Smart and Skilled commenced in January 2015, with the then Department of Education and Communities having primary responsibility for its implementation. Since July 2015, the NSW Department of Industry (the Department) has been responsible for VET in New South Wales and the implementation of Smart and Skilled.
The NSW Skills Board, comprising nine part-time members appointed by the Minister for Skills, provides independent strategic advice on VET reform and funding. In line with most other States and Territories, the Department maintains a 'Skills List' which contains government subsidised qualifications to address identified priority skill needs in New South Wales.
This audit assessed the effectiveness of the Department in identifying, prioritising, and aligning course subsidies to the skill needs of NSW. To do this we examined whether:
- the Department effectively identifies and prioritises present and future skill needs
- Smart and Skilled funding is aligned with the priority skill areas
- skill needs and available VET courses are effectively communicated to potential participants and training providers.
Smart and Skilled is a relatively new and complex program, and is being delivered in the context of significant reform to VET nationally and in New South Wales. A large scale government funded contestable market was not present in the VET sector in New South Wales before the introduction of Smart and Skilled. This audit's findings should be considered in that context.
The Department needs to better use the data it has, and collect additional data, to support its analysis of priority skill needs in New South Wales, and direct funding accordingly.

The Department supplements standard subsidies by:
- funding scholarships and support for disadvantaged students
- funding training in regional and remote areas
- providing additional support to deliver some qualifications that the market is not providing.
The Department needs to evaluate these funding strategies to ensure they are achieving their goals. It should also explore why training providers are not delivering some priority qualifications through Smart and Skilled.
Training providers compete for funding allocations based on their capacity to deliver. The Department successfully manages the budget by capping funding allocated to each Smart and Skilled training provider. However, training providers have only one year of funding certainty at present. Training providers that are performing well are not rewarded with greater certainty.
The Department needs to improve its communication with prospective students to ensure they can make informed decisions in the VET market.
The Department also needs to communicate more transparently to training providers about its funding allocations and decisions about changes to the NSW Skills List.
The Department relies on stakeholder proposals to update the NSW Skills List. Stakeholders include industry, training providers and government departments. These stakeholders, particularly industry, are likely to be aware of skill needs, and have a strong incentive to propose qualifications that address these needs. The Department’s process of collecting stakeholder proposals helps to ensure that it can identify qualifications needed to address material skill needs.
It is also important that the Department ensures the NSW Skills List only includes priority qualifications that need to be subsidised by government. The Department does not have robust processes in place to remove qualifications from the NSW Skills List. As a result, there is a risk that the list may include lower priority skill areas. Since the NSW Skills List was first created, new additions to the list have outnumbered those removed by five to one.
The Department does not always validate information gathered from stakeholder proposals, even when it has data to do so. Further, its decision making about what to include on, or delete from, the NSW Skills List is not transparent because the rationale for decisions is not adequately documented.
The Department is undertaking projects to better use data to support its decisions about what should be on the NSW Skills List. Some of these projects should deliver useful data soon, but some can only provide useful information when sufficient trend data is available.
Recommendation
The Department should:
- by June 2019, increase transparency of decisions about proposed changes to the NSW Skills List and improve record-keeping of deliberations regarding these changes
- by December 2019, use data more effectively and consistently to ensure that the NSW Skills List only includes high priority qualifications
Only qualifications on the NSW Skills List are eligible for subsidies under Smart and Skilled. As the Department does not have a robust process for removing low priority qualifications from the NSW Skills List, some low priority qualifications may be subsidised.
The Department allocates the Smart and Skilled budget through contracts with Smart and Skilled training providers. Training providers that meet contractual obligations and perform well in terms of enrolments and completion rates are rewarded with renewed contracts and more funding for increased enrolments, but these decisions are not based on student outcomes. The Department reduces or removes funding from training providers that do not meet quality standards, breach contract conditions or that are unable to spend their allocated funding effectively. Contracts are for only one year, offering training providers little funding certainty.
Smart and Skilled provides additional funding for scholarships and for training providers in locations where the cost of delivery is high or to those that cater to students with disabilities. The Department has not yet evaluated whether this additional funding is achieving its intended outcomes.
Eight per cent of the qualifications that have been on the NSW Skills List since 2015 are not delivered under Smart and Skilled anywhere in New South Wales. A further 14 per cent of the qualifications that are offered by training providers have had no student commencements. The Department is yet to identify the reasons that these high priority qualifications are either not offered or not taken up by students.
Recommendation
The Department should:
- by June 2019, investigate why training providers do not offer, and prospective students do not enrol in, some Smart and Skilled subsidised qualifications
- by December 2019, evaluate the effectiveness of Smart and Skilled funding which supplements standard subsidies for qualifications on the NSW Skills List, to determine whether it is achieving its objectives
- by December 2019, provide longer term funding certainty to high performing training providers, while retaining incentives for them to continue to perform well.
In a contestable market, it is important for consumers to have sufficient information to make informed decisions. The Department does not provide some key information to prospective VET students to support their decisions, such as measures of provider quality and examples of employment and further education outcomes of students completing particular courses. Existing information is spread across numerous channels and is not presented in a user friendly manner. This is a potential barrier to participation in VET for those less engaged with the system or less ICT literate.
The Department conveys relevant information about the program to training providers through its websites and its regional offices. However, it could better communicate some specific information directly to individual Smart and Skilled training providers, such as reasons their proposals to include new qualifications on the NSW Skills List are accepted or rejected.
While the Department is implementing a communication strategy for VET in New South Wales, it does not have a specific communications strategy for Smart and Skilled which comprehensively identifies the needs of different stakeholders and how these can be addressed.
Recommendation
By December 2019, the Department should develop and implement a specific communications strategy for Smart and Skilled to:
- support prospective student engagement and informed decision making
- meet the information needs of training providers
Appendix one - Response from agency
Appendix two - About the audit
Appendix three - Performance auditing
Parliamentary reference - Report number #305 - released 26 July 2018
Council reporting on service delivery
New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also did not report on the cost of services, making it difficult for communities to see how efficiently services are being delivered. Councils also do not consistently publish targets to demonstrate what they are striving for.
I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.
My new mandate supports the Parliament’s objectives to:
- strengthen governance and financial oversight in the local government sector
- improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.
Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.
For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.
In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.
Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.
Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in decisions about how and to whom they provide services, so it is important that local communities have access to information about how well services are being delivered and meeting community needs. Ultimately, councils should aim to ensure that performance reporting is subject to quality controls designed to provide independent assurance.
Councils report extensively on what they have done, but minimally on the outcomes of that effort, their efficiency, and their performance over time.
Councils could improve reporting on service delivery by more clearly relating the resources used to the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.
Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.
The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.
Recommendation
By mid-2018, OLG should:
- assist rural councils to develop their reporting capability.
The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery, and provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.
Recommendation
By mid-2018, OLG should:
- issue additional guidance on good practice in council reporting, with specific information on:
- reporting on performance against targets
- reporting on performance against outcomes
- assessing and reporting on efficiency and cost effectiveness
- reporting performance over time
- clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
- defining reporting terms to encourage consistency.
The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports
The guidelines produced by OLG in 2013 to assist councils to implement their Framework requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This was identified as a weakness in both the 2012 performance audit report and the Local Government Reform Panel review of reporting by councils on service delivery.
The Framework and supporting documents provide limited guidance on reporting
Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:
- good practice in reporting to the community
- how the annual report links with other plans and reports required by the Framework.
Further, the Framework includes both Annual Reports and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the Annual Reports produced during a council’s term. The relationship between Annual Reports and End of Term Reports is not clear.
OLG is reviewing the Framework and guidance
OLG commenced a review of the Framework in 2013, but this was deferred, with work restarting in 2017. The revised guidelines and manual were expected to be released late in 2017.
OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports
The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.
Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.
Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.
Recommendations
By mid-2018, OLG should:
- commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
- progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.
Streamlining the reporting burden would help councils improve reporting
The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 IPART draft report, ‘Review of reporting and compliance burdens on Local Government’, noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.
Enabling rural councils to perform tailored surveys of their communities may be more beneficial than a state-wide survey in defining outcome indicators
Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results from these are used by councils to set service delivery targets and report on outcomes. This helps to drive service delivery in line with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils which generally have the resources needed to run them.
OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn from a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state-level survey is not sufficiently detailed or specific to be used as a tool in setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in these areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. These could potentially be achieved through regional cooperation between groups of similar councils or regional groups.
Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities
The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.
OLG commenced work in 2012 to build a new performance measurement framework for councils which aimed to move away from compliance reporting. This work was also strongly influenced by the approach used in Victoria, which requires councils to report on a set of 79 indicators published on the Victorian 'Know your council' website. OLG’s work did not fully progress at the time, and several other local government representative bodies have since commenced work to establish performance measurement frameworks. OLG advised us it has recently recommenced its work on this project.
Our consultation identified some desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishment of a set of indicators that councils can use to measure performance and drive improvement in service delivery.
Several councils we spoke with were concerned that the current approaches to comparative reporting did not adequately acknowledge that councils need to tailor their service types, level and mix to the needs of their community. Comparative reporting approaches tend to focus on output measures, such as the number of applications processed, annual library loans and opening hours for sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where councils have made a decision based on community consultation, local priorities and available resources. To mitigate this, it is important to:
- adopt a partnership approach to the development of indicators
- ensure indicators measure performance, not just level of activity
- compare performance between councils that are similar in terms of size and location.
It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.
Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:
- consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
- adopting a partnership approach to development of common indicators with groups of like councils.
Appendix one - Response from agency
Appendix two - Service delivery categorisation
Appendix three - Reporting targets and performance over time
Appendix four - Performance auditing
Appendix five - About the audit
Parliamentary reference - Report number #296 - released 1 February 2018
Actions for Managing demand for ambulance services 2017
Managing demand for ambulance services 2017
NSW Ambulance has introduced several initiatives over the past decade to better manage the number of unnecessary ambulance responses and transports to hospital emergency departments. However, there is no overall strategy to guide the development of these initiatives nor do NSW Ambulance's data systems properly monitor their impact. As a result, the Audit Office was unable to assess whether NSW Ambulance's approach to managing demand is improving the efficiency of ambulance services.
NSW Ambulance uses a telephone referral system to manage triple zero calls from people with medical issues that do not require an ambulance. This has the potential to achieve efficiency improvements, but there are weaknesses in NSW Ambulance's use and monitoring of this system. Paramedics are now able to make decisions about whether patients need transport to a hospital emergency department. NSW Ambulance does not routinely measure or monitor the decisions paramedics make, so it does not know whether these decisions are improving efficiency. Extended Care Paramedics, who have additional skills in diagnosing and treating patients with less urgent medical issues, were introduced in 2007. NSW Ambulance analysis indicates that these paramedics have the potential to improve efficiency, but they have not been used as effectively as possible.
Our 2013 audit of NSW Ambulance found that accurate monitoring of activity and performance was not being conducted. More than four years later, this remains the case.
NSW Ambulance has recognised the need to change the way it manages demand and has developed initiatives that have the potential to improve efficiency. However, there are significant weaknesses in the strategy for and implementation of its demand management initiatives.
NSW Ambulance has identified the goal of moving from an emergency transport provider to a mobile health service and developed several initiatives to support this. Its demand management initiatives have the potential to contribute to the broader policy directions for the health system in New South Wales. However, there is no clear overall strategy guiding these initiatives and their implementation has been poor.
NSW Ambulance's reasons for changing its approach to demand management have not been communicated proactively to the community. Demand management initiatives that have been operating for over a decade still do not have clear performance measures or targets. Project management of new initiatives has been inadequate, with insufficient organisational resources to oversee them and inadequate engagement with other healthcare providers.
NSW Ambulance uses an in-house Vocational Education and Training course to recruit some paramedics, as well as recruiting paramedics who have completed a university degree. No other Australian ambulance services continue to provide their own Vocational Education and Training qualifications. Paramedics will need more support in several key areas to be able to fulfil their expanded roles in providing a mobile health service. Performance and development systems for paramedics are not used effectively. Up to date technology would help paramedics make better decisions and improve NSW Ambulance's ability to monitor demand management activity.
There are gaps in NSW Ambulance's oversight of the risks of some of the initiatives it has introduced, particularly its lack of information on the outcomes for patients who are not transported to hospital. Weaknesses in the way NSW Ambulance uses its data limit its ability to properly assess the risks of the demand management initiatives it has introduced.
Parliamentary reference - Report number #295 - released 13 December 2017
Actions for Planning and evaluating palliative care services in NSW
Planning and evaluating palliative care services in NSW
NSW Health’s approach to planning and evaluating palliative care is not effectively coordinated. There is no overall policy framework for palliative and end-of-life care, nor is there comprehensive monitoring and reporting on services and outcomes.
Parliamentary reference - Report number #291 - released 17 August 2017
Actions for Information and Communication Technologies in schools for teaching and learning
Information and Communication Technologies in schools for teaching and learning
Several factors are reducing effective use of information and communication technology (ICT) in the classroom.
These are primarily:
Parliamentary reference - Report number #289 - released 6 July 2017