
Published


Planning and Environment 2018

Planning
Environment
Asset valuation
Financial reporting
Information technology
Infrastructure
Internal controls and governance
Service delivery

The Auditor-General for New South Wales, Margaret Crawford, released her report today on the NSW Planning and Environment cluster. The report focuses on key observations and findings from the most recent financial audits of these agencies. Unqualified audit opinions were issued for all agencies' financial statements. However, some cultural institutions had challenges valuing collection assets in 2017–18. These issues were resolved before the financial statements were finalised.

This report analyses the results of our audits of financial statements of the Planning and Environment cluster for the year ended 30 June 2018. The table below summarises our key observations.

This report provides parliament and other users of the Planning and Environment cluster agencies' financial statements with the results of our audits, our observations, analysis, conclusions and recommendations in the following areas:

  • financial reporting
  • audit observations
  • service delivery.

Financial reporting is an important element of good governance. Confidence and transparency in public sector decision making is enhanced when financial reporting is accurate and timely.

This chapter outlines our audit observations related to the financial reporting of agencies in the Planning and Environment cluster for 2018.

Observation | Conclusions and recommendations
2.1 Quality of financial reporting
Unqualified audit opinions were issued for all agencies' financial statements. The quality of financial reporting remains high across the cluster.
2.2 Key accounting issues
There were errors in some cultural institutions' collection asset valuations. Recommendation: Collection asset valuations could be improved by:
  • early engagement with key stakeholders regarding the valuation method and approach
  • completing revaluations, including quality review processes, earlier
  • improving the quality of asset data by registering all items in an electronic database. 
2.3 Timeliness of financial reporting
Except for two agencies, the audits of cluster agencies’ financial statements were completed within the statutory timeframe.  Issues with asset revaluations delayed the finalisation of two environment and heritage agencies' financial statement audits. 

Appropriate financial controls help ensure the efficient and effective use of resources and administration of agency policies. They are essential for quality and timely decision making.

This chapter outlines our observations and insights from:

  • our financial statement audits of agencies in the Planning and Environment cluster for 2018
  • the areas of focus identified in the Audit Office work program.

The Audit Office annual work program provides a summary of all audits to be conducted within the proposed time period as well as detailed information on the areas of focus for each of the NSW Government clusters.

Observation | Conclusions and recommendations
3.1 Internal controls
One in five internal control weaknesses reported in 2017–18 was a repeat issue. Delays in implementing audit recommendations can prolong exposure to the risk of fraud and error.
Recommendation (repeat issue): Management letter recommendations to address internal control weaknesses should be actioned promptly, with a focus on addressing repeat issues.
One extreme risk was identified relating to the National Art School. The School does not have an occupancy agreement for the Darlinghurst campus. The absence of a formal agreement creates uncertainty over the School's continued occupancy of the Darlinghurst site.

The School should continue to liaise with stakeholders to formalise the occupancy arrangement. 
 
3.2 Information technology controls
The controls and governance arrangements for migrating payroll data from the Aurion system to the SAP HR system were effective. The data migration had no significant issues.
The Department can improve controls over user access to the SAP system. The Department needs to ensure SAP user access controls are appropriate, including investigating excess access rights and resolving segregation of duties issues.
3.3 Annual work program
Agencies used different benchmarks to monitor their maintenance expenditure. The cluster agencies under review operate in different industries. As a result, they do not use the same benchmarks to assess the adequacy of their maintenance spend. 

This chapter outlines certain service delivery outcomes for 2017–18. The data on activity levels and performance is provided by cluster agencies. The Audit Office does not have a specific mandate to audit performance information. Accordingly, the information in this chapter is unaudited. 

We report this information on service delivery to provide additional context to understand the operations of the Planning and Environment cluster, and to collate and present service information for different segments of the cluster in one report. 

In our recent performance audit, ‘Progress and measurement of Premier's Priorities’, we identified 12 limitations of performance measurement and performance data. We recommended the Department of Premier and Cabinet ensure that processes to check and verify data are in place for all relevant agency data sources.



Regulation of water pollution in drinking water catchments and illegal disposal of solid waste

Environment
Compliance
Internal controls and governance
Management and administration
Regulation
Risk

There are important gaps in how the Environmental Protection Authority (EPA) implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal. This limits the effectiveness of its regulatory responses, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

The NSW Environment Protection Authority (the EPA) is the State’s primary environmental regulator. The EPA regulates waste and water pollution under the Protection of the Environment Operations Act 1997 (the Act) through its licensing, monitoring, regulation and enforcement activities. The community should be able to rely on the effectiveness of this regulation to protect the environment and human health. The EPA has regulatory responsibility for the more significant and specific activities that can potentially harm the environment.

Activities regulated by the EPA include manufacturing, chemical production, electricity generation, mining, waste management, livestock processing, mineral processing, sewage treatment, and road construction. For these activities, the operator must have an EPA-issued environment protection licence (licence). Licences have conditions attached which may limit the amount and concentrations of substances the activity may produce and discharge into the environment. Conditions also require the licensee to report on its licensed activities.

This audit assessed the effectiveness of the EPA’s regulatory response to water pollution in drinking water catchments and illegal solid waste disposal. The findings and recommendations of this review can be reasonably applied to the EPA’s other regulatory functions, as the areas we examined were indicative of how the EPA regulates all pollution types and incidents.

 
Conclusion
There are important gaps in how the EPA implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal which limit the effectiveness of its regulatory response. The EPA uses a risk-based regulatory framework that has elements consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. However, the EPA did not demonstrate that it has established reliable practices to accurately and consistently detect the risk of non-compliance by licensees, and apply consistent regulatory actions. This may expose the environment and human health to a greater risk of harm.
The EPA also could not demonstrate that it has effective governance and oversight of its regulatory operations. The EPA operates in a complex regulatory environment where its regional offices have broad discretion in how they operate. The EPA has not balanced this devolved structure with an effective governance approach that includes appropriate internal controls to monitor the consistency or quality of its regulatory activities. It also does not have an effective performance framework that sets relevant performance expectations and outcome-based key performance indicators (KPIs) for its regional offices.
These deficiencies mean that the EPA cannot be confident that it conducts compliance and enforcement activities consistently across the State and that licensees are complying with their licence conditions or the Act.
The EPA's reporting on environmental and regulatory outcomes is limited and most of the data it uses is self-reported by industry. It has not set outcome-based key result areas to assess performance and trends over time.
The EPA uses a risk-based regulatory framework for water pollution and illegal solid waste disposal but there are important gaps in implementation that reduce its effectiveness.
Elements of the EPA’s risk-based regulatory framework for water pollution and illegal solid waste disposal are consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. There are important gaps in how the EPA implements its risk-based approach that limit the effectiveness of its regulatory response. The EPA could not demonstrate that it effectively regulates licensees because it has not established reliable practices that accurately and consistently detect licence non-compliances or breaches of the Act and enforce regulatory actions.
The EPA lacks effective governance arrangements to support its devolved regional structure. The EPA's performance framework has limited and inconclusive reporting on regional performance to the EPA’s Chief Executive Officer or to the EPA Board. The EPA cannot assure that it is conducting its regulatory responsibilities effectively and efficiently. 
The EPA does not consistently evaluate its regulatory approach to ensure it is effective and efficient. For example, there are no set requirements for how EPA officers conduct mandatory site inspections, which means there is a risk that officers are not detecting all breaches or non-compliances. The inconsistent approach also means that the EPA cannot rely on the data it collects from these site inspections to understand whether its regulatory response is effective and efficient. In addition, where the EPA identifies instances of non-compliance or breaches, it does not apply all available regulatory actions to encourage compliance.
The EPA also does not have a systematic approach to validate self-reported information in licensees’ annual returns, despite the data being used to assess administrative fees payable to the EPA and its regulatory response to non-compliances. 
The EPA does not use performance frameworks to monitor the consistency or quality of work conducted across the State. The EPA has also failed to provide effective guidance for its staff. Many of its policies and procedures are outdated, inconsistent, hard to access, or not mandated.
Recommendations
By 31 December 2018, to improve governance and oversight, the EPA should:
1. implement a more effective performance framework with regular reports to the Chief Executive Officer and to the EPA Board on outcomes-based key result areas that assess its environmental and regulatory performance and trends over time
By 30 June 2019, to improve consistency in its practices, the EPA should:
2. progressively update and make accessible its policies and procedures for regulatory operations, and mandate procedures where necessary to ensure consistent application
3. implement internal controls to monitor the consistency and quality of its regulatory operations. 
The EPA does not apply a consistent approach to setting licence conditions for discharges to water.
The requirements for setting licence conditions for water pollution are complex and require technical and scientific expertise. In August 2016, the EPA approved guidance developed by its technical experts in the Water Technical Advisory Unit to assist its regional staff. However, the EPA did not mandate the use of the guidance until mid-April 2018. Until then, the EPA left it to regional offices to decide what guidance their staff used, so practices differed across the organisation. The EPA is yet to conduct training for staff to ensure they consistently apply the 2016 guidance.
The EPA has not implemented appropriate internal controls or quality assurance processes to monitor the consistency or quality of licence conditions set by its officers across the State. This is not consistent with good regulatory practice.
The 2016 triennial audit report on the Sydney drinking water catchment highlighted that Lake Burragorang has experienced worsening water quality over the past 20 years due to increased salinity. Salinity levels were nearly twice as high as in other storages in the Sydney drinking water catchment. The report recommended that the source and implications of the increased salinity be investigated, but did not propose which public authority should carry out the investigation.
To date, no NSW Government agency has addressed the report's recommendation. Three public authorities, the EPA, the Department of Planning and Environment (DPE) and WaterNSW, are responsible for regulating activities that affect water quality in the Sydney drinking water catchment, which includes Lake Burragorang.
Recommendation
By 30 June 2019, to address worsening water quality in Lake Burragorang, the EPA should:
4. (a) review the impact of its licensed activities on water quality in Lake Burragorang, and
  (b) develop strategies relating to its licensed activities (in consultation with other relevant NSW Government agencies) to improve and maintain the lake's water quality.
The EPA’s risk-based approach to monitoring compliance of licensees has limited effectiveness. 
The EPA tailors its compliance monitoring approach based on the performance of licensees. This means that licensees that perform better have a lower administrative fee and fewer mandatory site inspections. 
However, this approach relies on information that is not complete or accurate. Sources of information include licensees’ annual returns, EPA site inspections and compliance audits, and pollution reports from the public. 
Licensees report annually to the EPA on their performance, including compliance against their licence conditions. The Act contains significant financial penalties if licensees provide false or misleading information in their annual returns. However, the EPA does not systematically or consistently validate information self-reported by licensees, or consistently apply regulatory actions if it discovers non-compliance.
Self-reported compliance data is used in part to assess a licensed premises’ overall environmental risk level, which underpins the calculation of the administrative fee, the EPA’s site inspection frequency, and the licensee’s exposure to regulatory actions. It is also used to assess the load-based licence fee that the licensee pays.
The EPA has set minimum mandatory site inspection frequencies for licensed premises based on its assessed overall risk level. This is a key tool to detect non-compliance or breaches of the Act. However, the EPA has not issued a policy or procedures that define what these mandatory inspections should cover and how they are to be conducted. We found variations in how EPA officers in the offices we visited conducted these inspections. This inconsistency means that the EPA does not have complete and accurate information on licensees’ compliance, and that it is not identifying all non-compliances for which it could consider applying appropriate regulatory actions.
The EPA also receives reports of pollution incidents from the public that may indicate non-compliance. However, the EPA has not set timeframes within which it expects its officers to investigate pollution incidents. Regional offices decide what to investigate and within what timeframes, and the EPA does not measure regional performance against any timeframes.
The few compliance audits the EPA conducts annually are effective in identifying licence non-compliances and breaches of the Act. However, the EPA does not have a policy or required procedures for its regulatory officers to consistently apply appropriate regulatory actions in response to compliance audit findings. 
The EPA has not implemented effective internal controls or quality assurance processes to check the consistency or quality of how its regulatory officers monitor compliance across the State. This is not consistent with good regulatory practice.
Recommendations
To improve compliance monitoring, the EPA should implement procedures to:
5. by 30 June 2019, validate self-reported information, eliminate hardcopy submissions and require licensees to report on their breaches of the Act and associated regulations in their annual returns
6. by 31 December 2018, conduct mandatory site inspections under the risk-based licensing scheme to assess compliance with all regulatory requirements and licence conditions.
 
The EPA cannot assure that its regulatory enforcement approach is fully effective.
The EPA’s compliance policy and prosecution guidelines set out a large number of available regulatory actions, and factors to be taken into account when selecting an appropriate regulatory response. The extensive legislation governing the EPA’s regulatory activities, and the devolved regional structure the EPA has adopted for its compliance and regulatory functions, increase the risk of inconsistent compliance decisions and regulatory responses. A good regulatory framework needs a consistent approach to enforcement to incentivise compliance.
The EPA has not balanced this devolved regional structure with appropriate governance arrangements to give it assurance that its regulatory officers apply a consistent approach to enforcement.
The EPA has not issued standard procedures to ensure consistent non-court enforcement action for breaches of the Act or non-compliance with licence conditions. Given our finding that the EPA does not effectively detect breaches and non-compliances, there is a risk that it is not applying appropriate regulatory actions for many breaches and non-compliances.
A recent EPA compliance audit identified significant non-compliances with incident management plan requirements. However, the EPA has not applied regulatory actions for making false statements on annual returns for those licensees that certified their plans complied with such requirements. The EPA also has not applied available regulatory actions for the non-compliances which led to the false or misleading statements.
Recommendation
By 31 December 2018 to improve enforcement, the EPA should:
7. implement procedures to systematically assess non-compliances with licence conditions and breaches of the Act, and to apply appropriate and consistent regulatory actions.
The EPA has implemented the actions listed in the NSW Illegal Dumping Strategy 2014–16. To date, the EPA has also implemented four of the six recommendations made by the ICAC on the EPA's oversight of Regional Illegal Dumping Squads.
The EPA did not achieve the NSW Illegal Dumping Strategy 2014–16 target of a 30 per cent reduction in instances of large-scale illegal dumping in Sydney, the Illawarra, the Hunter and the Central Coast from 2011 levels.
In the reporting period, the incidence of large-scale illegal dumping more than doubled. The EPA advised that this increase may be the result of greater public awareness and reporting rather than increased illegal dumping activity.
The EPA is due to implement one outstanding ICAC recommendation by June 2018 but has not set a timeframe for the other outstanding recommendation.



Council reporting on service delivery

Local Government
Compliance
Internal controls and governance
Management and administration
Service delivery

New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report on activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also did not report on the cost of services, making it difficult for communities to see how efficiently services are being delivered. And councils are not consistently publishing targets to demonstrate what they are striving for.

I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.

My new mandate supports the Parliament’s objectives to:

  • strengthen governance and financial oversight in the local government sector
  • improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.

Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.

For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.

In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.

Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.

Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in decisions about how and to whom they provide services, so it is important that local communities have access to information about how well services are being delivered and are meeting community needs. Ultimately, councils should aim to ensure that performance reporting is subject to quality controls designed to provide independent assurance.

Conclusion
While councils report on outputs, reporting on outcomes and performance over time can be improved. Improved reporting would include objectives with targets that better demonstrate performance over time. This would help communities understand what services are being delivered, how efficiently and effectively they are being delivered, and what improvements are being made.
To ensure greater transparency on service effectiveness and efficiency, the Office of Local Government (OLG) should work with councils to develop guidance principles to improve reporting on service delivery to local communities. This audit identified an interest amongst councils in improving their reporting and broad agreement with the good practice principles developed as part of the audit.
The Integrated Planning and Reporting Framework (the Framework), which councils are required to use to report on service delivery, is intended to promote better practice. However, the Framework is silent on efficiency reporting and provides limited guidance on how long-term strategic documents link with annual reports produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.
OLG should also work with state agencies to reduce the overall reporting burden on councils by consolidating state agency reporting requirements. 

Councils report extensively on the things they have done, but minimally on the outcomes achieved from that effort, their efficiency, and their performance over time.

Councils could improve reporting on service delivery by more clearly relating the resources needed with the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.

Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.

The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.

Recommendation

By mid-2018, OLG should:

  • assist rural councils to develop their reporting capability.

The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery and provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

Recommendation

By mid-2018, OLG should:

  • issue additional guidance on good practice in council reporting, with specific information on:
    • reporting on performance against targets
    • reporting on performance against outcome
    • assessing and reporting on efficiency and cost effectiveness
    • reporting performance over time
    • clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
    • defining reporting terms to encourage consistency.

The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports

The guidelines produced by OLG in 2013 to assist councils to implement their Framework requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This was identified as a weakness in both the 2012 performance audit report and the Local Government Reform Panel review of reporting by councils on service delivery.

The Framework and supporting documents provide limited guidance on reporting

Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:

  • good practice in reporting to the community
  • how the annual report links with other plans and reports required by the Framework.

Further, the Framework includes both Annual Reports and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the Annual Reports produced during a council’s term. The relationship between Annual Reports and End of Term Reports is not clear.

OLG is reviewing the Framework and guidance

OLG commenced a review of the Framework in 2013, but this was deferred, with work re‑starting in 2017. The revised guidelines and manual were expected to be released late in 2017.

OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports

The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.

Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.

Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.

Recommendations

By mid-2018, OLG should:

  • commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
  • progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.

Streamlining the reporting burden would help councils improve reporting

The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 draft IPART ‘Review of reporting and compliance burdens on Local Government’ noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.

Enabling rural councils to perform tailored surveys of their communities may be more beneficial than a state-wide survey in defining outcome indicators

Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results from these are used by councils to set service delivery targets and report on outcomes. This helps to drive service delivery in line with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils which generally have the resources needed to run them.

OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn from a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state level survey is not sufficiently detailed or specific enough to be used as a tool in setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in these areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. These could potentially be achieved through regional cooperation between groups of similar councils or regional groups.

Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities

The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.

OLG commenced work in 2012 to build a new performance measurement framework for councils which aimed to move away from compliance reporting. This work was also strongly influenced by the approach used in Victoria, which requires councils to report on a set of 79 indicators published on the Victorian 'Know your council' website. OLG’s work did not fully progress at the time and several other local government representative bodies have since commenced work to establish performance measurement frameworks. OLG advised us it has recently recommenced its work on this project.

Our consultation identified some desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishment of a set of indicators that councils can use to measure performance and drive improvement in service delivery.

Several councils we spoke with were concerned that the current approaches to comparative reporting did not adequately acknowledge that councils need to tailor their service types, level and mix to the needs of their community. Comparative reporting approaches tend to focus on output measures such as the number of applications processed, annual library loans and opening hours for sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where councils have made a decision based on community consultation, local priorities and available resources. To mitigate this, it is important to:

  • adopt a partnership approach to the development of indicators
  • ensure indicators measure performance, not just level of activity
  • compare performance between councils that are similar in terms of size and location.

It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.

Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:

  • consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
  • adopting a partnership approach to development of common indicators with groups of like councils.