Mobile speed cameras

Transport
Compliance
Financial reporting
Information technology
Internal controls and governance
Management and administration
Regulation
Service delivery

Key aspects of the state’s mobile speed camera program need to be improved to maximise road safety benefits, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Mobile speed cameras are deployed in a limited number of locations, with a small number of these used frequently. This, along with decisions to limit the hours that mobile speed cameras operate and to use multiple warning signs, has reduced the broad deterrence of speeding across the general network, which is the main policy objective of the mobile speed camera program.

The primary goal of speed cameras is to reduce speeding and make the roads safer. Our 2011 performance audit on speed cameras found that, in general, speed cameras change driver behaviour and have a positive impact on road safety.

Transport for NSW published the NSW Speed Camera Strategy in June 2012 in response to our audit. According to the Strategy, the main purpose of mobile speed cameras is to reduce speeding across the road network by providing a general deterrence through anywhere, anytime enforcement and by creating a perceived risk of detection across the road network. Fixed and red-light speed cameras aim to reduce speeding at specific locations.

Roads and Maritime Services and Transport for NSW deploy mobile speed cameras (MSCs) in consultation with NSW Police. The cameras are operated by contractors authorised by Roads and Maritime Services. MSC locations are stretches of road that can be more than 20 kilometres long. MSC sites are specific places within these locations that meet the requirements for an MSC vehicle to operate there.

This audit assessed whether the mobile speed camera program is effectively managed to maximise road safety benefits across the NSW road network.

Conclusion

The mobile speed camera program requires improvements to key aspects of its management to maximise road safety benefits. While camera locations have been selected based on crash history, the limited number of locations restricts network coverage. It also makes enforcement more predictable, reducing the ability to provide a general deterrence. Implementation of the program has been consistent with government decisions to limit its hours of operation and use multiple warning signs. These factors limit the ability of the mobile speed camera program to effectively deliver a broad general network deterrence from speeding.

Many locations are needed to enable network-wide coverage and to ensure MSC sessions are randomised and not predictable. However, there are insufficient available locations that meet the strict criteria for crash history, operator safety, signage and technical requirements. MSC performance would improve if more locations were available.

A scheduling system is meant to randomise MSC location visits so that they are not predictable. However, a relatively small number of locations have been visited many times, making deployment more predictable in those places. The allocation of MSCs across times of day, days of the week and regions is prioritised based on crash history, but the frequency of visits to each location does not correspond with its crash risk.

There is evidence of a reduction in fatal and serious crashes at the 30 best-performing MSC locations. However, there is limited evidence that the current MSC program in NSW has changed driver behaviour by creating a general network deterrence. While the overall reduction in serious injuries on roads has continued, fatalities have started to climb again. Compliance with speed limits has improved at the sites and locations where MSCs operate, but the results of overall network speed surveys vary, with recent improvements in some speed zones but not others.

There is no supporting justification for the program's hours of operation. The rate of MSC enforcement (hours per capita) in NSW is lower than in Queensland and Victoria. The government decision to use multiple warning signs has made it harder to identify and maintain suitable MSC locations, and has impeded their use for enforcement in both traffic directions and in school zones.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #308 - released 18 October 2018

Matching skills training with market needs

Industry
Compliance
Internal controls and governance
Management and administration
Risk
Service delivery
Workforce and capability

The NSW Department of Industry targets subsidies towards training programs delivering skills most needed in New South Wales. However, the Department still provides subsidies to qualifications that the market may no longer need, according to a report released by Margaret Crawford, Auditor-General for New South Wales. 

In 2012, governments across Australia entered into the National Partnership Agreement on Skills Reform. Under the National Partnership Agreement, the Australian Government provided incentive payments to States and Territories to move towards a more contestable Vocational Education and Training (VET) market. The aim of the National Partnership Agreement was to foster a more accessible, transparent, efficient and high quality training sector that is responsive to the needs of students and industry. 

The New South Wales Government introduced the Smart and Skilled program in response to the National Partnership Agreement. Through Smart and Skilled, students can choose a vocational course from a list of approved qualifications and training providers. Students pay the same fee for their chosen qualification regardless of the selected training provider and the government covers the gap between the student fee and the fixed price of the qualification through a subsidy paid to their training provider. 

Smart and Skilled commenced in January 2015, with the then Department of Education and Communities having primary responsibility for its implementation. Since July 2015, the NSW Department of Industry (the Department) has been responsible for VET in New South Wales and the implementation of Smart and Skilled. 

The NSW Skills Board, comprising nine part-time members appointed by the Minister for Skills, provides independent strategic advice on VET reform and funding. In line with most other States and Territories, the Department maintains a 'Skills List' which contains government subsidised qualifications to address identified priority skill needs in New South Wales.

This audit assessed the effectiveness of the Department in identifying, prioritising, and aligning course subsidies to the skill needs of NSW. To do this we examined whether:

  • the Department effectively identifies and prioritises present and future skill needs 
  • Smart and Skilled funding is aligned with the priority skill areas
  • skill needs and available VET courses are effectively communicated to potential participants and training providers.

Smart and Skilled is a relatively new and complex program, and is being delivered in the context of significant reform to VET nationally and in New South Wales. A large scale government funded contestable market was not present in the VET sector in New South Wales before the introduction of Smart and Skilled. This audit's findings should be considered in that context.
 

Conclusion

The Department effectively consults with industry, training providers and government departments to identify skill needs, and targets subsidies to meet those needs. However, the Department does not have a robust, data-driven process to remove subsidies from qualifications which are no longer a priority. There is a risk that some qualifications are being subsidised which do not reflect the skill needs of New South Wales.

The Department needs to better use the data it has, and collect additional data, to support its analysis of priority skill needs in New South Wales, and direct funding accordingly.

In addition to subsidising priority qualifications, the Department promotes engagement in skills training by:

  • funding scholarships and support for disadvantaged students
  • funding training in regional and remote areas
  • providing additional support to deliver some qualifications that the market is not providing.

The Department needs to evaluate these funding strategies to ensure they are achieving their goals. It should also explore why training providers are not delivering some priority qualifications through Smart and Skilled.

Training providers compete for funding allocations based on their capacity to deliver. The Department successfully manages the budget by capping funding allocated to each Smart and Skilled training provider. However, training providers have only one year of funding certainty at present. Training providers that are performing well are not rewarded with greater certainty.

The Department needs to improve its communication with prospective students to ensure they can make informed decisions in the VET market.

The Department also needs to communicate more transparently to training providers about its funding allocations and decisions about changes to the NSW Skills List. 

The NSW Skills List is unlikely to be missing high priority qualifications, but may include lower priority qualifications because the Department does not have a robust process to identify and remove these qualifications from the list. The Department needs to better use available data, and collect further data, to support decisions about which qualifications should be on the NSW Skills List.

The Department relies on stakeholder proposals to update the NSW Skills List. Stakeholders include industry, training providers and government departments. These stakeholders, particularly industry, are likely to be aware of skill needs, and have a strong incentive to propose qualifications that address these needs. The Department’s process of collecting stakeholder proposals helps to ensure that it can identify qualifications needed to address material skill needs. 

It is also important that the Department ensures the NSW Skills List only includes priority qualifications that need to be subsidised by government. The Department does not have robust processes in place to remove qualifications from the NSW Skills List. As a result, there is a risk that the list may include lower priority skill areas. Since the NSW Skills List was first created, new additions to the list have outnumbered those removed by five to one.

The Department does not always validate information gathered from stakeholder proposals, even when it has data to do so. Further, its decision making about what to include on, or delete from, the NSW Skills List is not transparent because the rationale for decisions is not adequately documented. 

The Department is undertaking projects to better use data to support its decisions about what should be on the NSW Skills List. Some of these projects should deliver useful data soon, but some can only provide useful information when sufficient trend data is available. 

Recommendation

The Department should: 

  • by June 2019, increase transparency of decisions about proposed changes to the NSW Skills List and improve record-keeping of deliberations regarding these changes
  • by December 2019, use data more effectively and consistently to ensure that the NSW Skills List only includes high priority qualifications.

The Department funds training providers that deliver qualifications on the NSW Skills List. Alignment of funding to skill needs relies on the accuracy of the NSW Skills List, which may include some lower priority qualifications.

Only qualifications on the NSW Skills List are eligible for subsidies under Smart and Skilled. As the Department does not have a robust process for removing low priority qualifications from the NSW Skills list, some low priority qualifications may be subsidised. 

The Department allocates the Smart and Skilled budget through contracts with Smart and Skilled training providers. Training providers that meet contractual obligations and perform well in terms of enrolments and completion rates are rewarded with renewed contracts and more funding for increased enrolments, but these decisions are not based on student outcomes. The Department reduces or removes funding from training providers that do not meet quality standards, breach contract conditions or that are unable to spend their allocated funding effectively. Contracts are for only one year, offering training providers little funding certainty. 

Smart and Skilled provides additional funding for scholarships and for training providers in locations where the cost of delivery is high or to those that cater to students with disabilities. The Department has not yet evaluated whether this additional funding is achieving its intended outcomes. 

Eight per cent of the qualifications that have been on the NSW Skills List since 2015 are not delivered under Smart and Skilled anywhere in New South Wales. A further 14 per cent of the qualifications that are offered by training providers have had no student commencements. The Department is yet to identify the reasons that these high priority qualifications are either not offered or not taken up by students.

Recommendation

The Department should:

  • by June 2019, investigate why training providers do not offer, and prospective students do not enrol in, some Smart and Skilled subsidised qualifications 
  • by December 2019, evaluate the effectiveness of Smart and Skilled funding which supplements standard subsidies for qualifications on the NSW Skills List, to determine whether it is achieving its objectives
  • by December 2019, provide longer term funding certainty to high performing training providers, while retaining incentives for them to continue to perform well.

The Department needs to improve its communication, particularly with prospective students.

In a contestable market, it is important for consumers to have sufficient information to make informed decisions. The Department does not provide some key information to prospective VET students to support their decisions, such as measures of provider quality and examples of employment and further education outcomes of students completing particular courses. Existing information is spread across numerous channels and is not presented in a user friendly manner. This is a potential barrier to participation in VET for those less engaged with the system or less ICT literate.

The Department conveys relevant information about the program to training providers through its websites and its regional offices. However, it could better communicate some specific information directly to individual Smart and Skilled training providers, such as reasons their proposals to include new qualifications on the NSW Skills List are accepted or rejected. 

While the Department is implementing a communication strategy for VET in New South Wales, it does not have a specific communications strategy for Smart and Skilled which comprehensively identifies the needs of different stakeholders and how these can be addressed. 

Recommendation

By December 2019, the Department should develop and implement a specific communications strategy for Smart and Skilled to:

  • support prospective student engagement and informed decision making
  • meet the information needs of training providers.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #305 - released 26 July 2018

Regulation of water pollution in drinking water catchments and illegal disposal of solid waste

Environment
Compliance
Internal controls and governance
Management and administration
Regulation
Risk

There are important gaps in how the Environmental Protection Authority (EPA) implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal. This limits the effectiveness of its regulatory responses, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

The NSW Environment Protection Authority (the EPA) is the State’s primary environmental regulator. The EPA regulates waste and water pollution under the Protection of the Environment Operations Act 1997 (the Act) through its licensing, monitoring, regulation and enforcement activities. The community should be able to rely on the effectiveness of this regulation to protect the environment and human health. The EPA has regulatory responsibility for the more significant and specific activities which can potentially harm the environment.

Activities regulated by the EPA include manufacturing, chemical production, electricity generation, mining, waste management, livestock processing, mineral processing, sewerage treatment, and road construction. For these activities, the operator must have an EPA issued environment protection licence (licence). Licences have conditions attached which may limit the amount and concentrations of substances the activity may produce and discharge into the environment. Conditions also require the licensee to report on its licensed activities.

This audit assessed the effectiveness of the EPA’s regulatory response to water pollution in drinking water catchments and illegal solid waste disposal. The findings and recommendations of this review can be reasonably applied to the EPA’s other regulatory functions, as the areas we examined were indicative of how the EPA regulates all pollution types and incidents.

 
Conclusion

There are important gaps in how the EPA implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal which limit the effectiveness of its regulatory response. The EPA uses a risk-based regulatory framework that has elements consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. However, the EPA did not demonstrate that it has established reliable practices to accurately and consistently detect the risk of non-compliance by licensees, and apply consistent regulatory actions. This may increase the risk of harm to the environment and human health.

The EPA also could not demonstrate that it has effective governance and oversight of its regulatory operations. The EPA operates in a complex regulatory environment where its regional offices have broad discretion in how they operate. The EPA has not balanced this devolved structure with an effective governance approach that includes appropriate internal controls to monitor the consistency or quality of its regulatory activities. It also does not have an effective performance framework that sets relevant performance expectations and outcome-based key performance indicators (KPIs) for its regional offices.

These deficiencies mean that the EPA cannot be confident that it conducts compliance and enforcement activities consistently across the State and that licensees are complying with their licence conditions or the Act.

The EPA's reporting on environmental and regulatory outcomes is limited and most of the data it uses is self-reported by industry. It has not set outcome-based key result areas to assess performance and trends over time.

The EPA uses a risk-based regulatory framework for water pollution and illegal solid waste disposal but there are important gaps in implementation that reduce its effectiveness.

Elements of the EPA’s risk-based regulatory framework for water pollution and illegal solid waste disposal are consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. There are important gaps in how the EPA implements its risk-based approach that limit the effectiveness of its regulatory response. The EPA could not demonstrate that it effectively regulates licensees because it has not established reliable practices that accurately and consistently detect licence non-compliances or breaches of the Act and enforce regulatory actions.

The EPA lacks effective governance arrangements to support its devolved regional structure. The EPA's performance framework provides limited and inconclusive reporting on regional performance to the EPA’s Chief Executive Officer or to the EPA Board. The EPA cannot assure that it is conducting its regulatory responsibilities effectively and efficiently.

The EPA does not consistently evaluate its regulatory approach to ensure it is effective and efficient. For example, there are no set requirements for how EPA officers conduct mandatory site inspections, which means there is a risk that officers are not detecting all breaches or non-compliances. The inconsistent approach also means that the EPA cannot rely on the data it collects from these site inspections to understand whether its regulatory response is effective and efficient. In addition, where the EPA identifies instances of non-compliance or breaches, it does not apply all available regulatory actions to encourage compliance.

The EPA also does not have a systematic approach to validating self-reported information in licensees’ annual returns, despite the data being used to assess administrative fees payable to the EPA and its regulatory response to non-compliances.

The EPA does not use performance frameworks to monitor the consistency or quality of work conducted across the State. The EPA has also failed to provide effective guidance for its staff. Many of its policies and procedures are out-dated, inconsistent, hard to access, or not mandated.
Recommendations

By 31 December 2018, to improve governance and oversight, the EPA should:

1. implement a more effective performance framework with regular reports to the Chief Executive Officer and to the EPA Board on outcomes-based key result areas that assess its environmental and regulatory performance and trends over time.

By 30 June 2019, to improve consistency in its practices, the EPA should:

2. progressively update and make accessible its policies and procedures for regulatory operations, and mandate procedures where necessary to ensure consistent application
3. implement internal controls to monitor the consistency and quality of its regulatory operations.
The EPA does not apply a consistent approach to setting licence conditions for discharges to water.

The requirements for setting licence conditions for water pollution are complex and require technical and scientific expertise. In August 2016, the EPA approved guidance developed by its technical experts in the Water Technical Advisory Unit to assist its regional staff. However, the EPA did not mandate the use of the guidance until mid-April 2018. Until then, the EPA had left it to regional offices to decide what guidance their staff used, so practices differed across the organisation. The EPA is yet to conduct training for staff to ensure they consistently apply the 2016 guidance.

The EPA has not implemented appropriate internal controls or quality assurance processes to monitor the consistency or quality of licence conditions set by its officers across the State. This is not consistent with good regulatory practice.
The 2016 triennial audit report on the Sydney drinking water catchment highlighted that Lake Burragorang has experienced worsening water quality over the past 20 years due to increased salinity levels. The salinity levels were nearly twice as high as in other storages in the Sydney drinking water catchment. The report recommended that the source and implications of the increased salinity levels be investigated, but did not propose which public authority should carry out such an investigation.

To date, no NSW Government agency has addressed the report's recommendation. Three public authorities, the EPA, DPE and WaterNSW, are responsible for regulating activities that impact on water quality in the Sydney drinking water catchment, which includes Lake Burragorang.
Recommendation

By 30 June 2019, to address worsening water quality in Lake Burragorang, the EPA should:

4. (a) review the impact of its licensed activities on water quality in Lake Burragorang, and
   (b) develop strategies relating to its licensed activities (in consultation with other relevant NSW Government agencies) to improve and maintain the lake's water quality.
The EPA’s risk-based approach to monitoring compliance of licensees has limited effectiveness.

The EPA tailors its compliance monitoring approach based on the performance of licensees. This means that licensees that perform better have a lower administrative fee and fewer mandatory site inspections.

However, this approach relies on information that is not complete or accurate. Sources of information include licensees’ annual returns, EPA site inspections and compliance audits, and pollution reports from the public.

Licensees report annually to the EPA on their performance, including compliance against their licence conditions. The Act contains significant financial penalties if licensees provide false or misleading information in their annual returns. However, the EPA does not systematically or consistently validate information self-reported by licensees, or consistently apply regulatory actions if it discovers non-compliance.

Self-reported compliance data is used in part to assess a licensed premises’ overall environmental risk level, which underpins the calculation of the administrative fee, the EPA’s site inspection frequency, and the licensee’s exposure to regulatory actions. It is also used to assess the load-based licence fee that the licensee pays.

The EPA has set minimum mandatory site inspection frequencies for licensed premises based on their assessed overall risk level. This is a key tool for detecting non-compliance or breaches of the Act. However, the EPA has not issued a policy or procedures that define what these mandatory inspections should cover and how they are to be conducted. We found variations in how EPA officers in the offices we visited conducted these inspections. This inconsistency means that the EPA does not have complete and accurate information about licensees’ compliance, and that it is not identifying all non-compliances so that it can consider applying appropriate regulatory actions.

The EPA also receives reports of pollution incidents from the public that may indicate non-compliance. However, the EPA has not set timeframes within which it expects its officers to investigate pollution incidents. Regional offices decide which incidents to investigate and set their own timeframes, and the EPA does not measure regional performance against these timeframes.

The few compliance audits the EPA conducts annually are effective in identifying licence non-compliances and breaches of the Act. However, the EPA does not have a policy or required procedures for its regulatory officers to consistently apply appropriate regulatory actions in response to compliance audit findings.

The EPA has not implemented effective internal controls or quality assurance processes to check the consistency or quality of how its regulatory officers monitor compliance across the State. This is not consistent with good regulatory practice.
Recommendations

To improve compliance monitoring, the EPA should implement procedures to:

5. by 30 June 2019, validate self-reported information, eliminate hardcopy submissions and require licensees to report on their breaches of the Act and associated regulations in their annual returns
6. by 31 December 2018, conduct mandatory site inspections under the risk-based licensing scheme to assess compliance with all regulatory requirements and licence conditions.
 
The EPA cannot assure that its regulatory enforcement approach is fully effective.

The EPA’s compliance policy and prosecution guidelines set out a large number of available regulatory actions and the factors which should be taken into account when selecting an appropriate regulatory response. The extensive legislation governing the EPA’s regulatory activities, and the devolved regional structure the EPA has adopted in delivering its compliance and regulatory functions, increase the risk of inconsistent compliance decisions and regulatory responses. A good regulatory framework needs a consistent approach to enforcement to incentivise compliance.

The EPA has not balanced this devolved regional structure with appropriate governance arrangements to give it assurance that its regulatory officers apply a consistent approach to enforcement.

The EPA has not issued standard procedures to ensure consistent non-court enforcement action for breaches of the Act or non-compliance with licence conditions. Given our finding that the EPA does not effectively detect breaches and non-compliances, there is a risk that it is not applying appropriate regulatory actions for many breaches and non-compliances.

A recent EPA compliance audit identified significant non-compliances with incident management plan requirements. However, the EPA has not applied regulatory actions for making false statements on annual returns to those licensees that certified their plans complied with such requirements. The EPA also has not applied available regulatory actions for the non-compliances which led to the false or misleading statements.
Recommendation

By 31 December 2018, to improve enforcement, the EPA should:

7. implement procedures to systematically assess non-compliances with licence conditions and breaches of the Act, and to apply appropriate and consistent regulatory actions.
The EPA has implemented the actions listed in the NSW Illegal Dumping Strategy 2014–16. To date, the EPA has also implemented four of the six recommendations made by the ICAC on the EPA's oversight of Regional Illegal Dumping Squads.

The EPA did not achieve the NSW Illegal Dumping Strategy 2014–16 target of a 30 per cent reduction in instances of large scale illegal dumping in Sydney, the Illawarra, Hunter and Central Coast from 2011 levels.

In the reporting period, the incidence of large scale illegal dumping more than doubled. The EPA advised that this increase may be the result of greater public awareness and reporting rather than increased illegal dumping activity.

The EPA is due to implement one outstanding ICAC recommendation by June 2018, but has not set a time for the other outstanding recommendation.

Fraud controls in local councils

Local Government
Fraud
Internal controls and governance
Management and administration
Risk

Many local councils need to improve their fraud control systems, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. The report highlights that councils often have fraud control procedures and systems in place, but are not ensuring people understand them and how they work. There is also significant variation between councils in the quality of their fraud controls.

Fraud can directly influence councils’ ability to deliver services, and undermine community confidence and trust. ICAC investigations, such as the recent Operation Ricco into the former City of Botany Bay Council, show the financial and reputational damage that major fraud can cause. Good fraud control practices are critical for councils and the community. 

The Audit Office of New South Wales 2015 Fraud Control Improvement Kit (the Kit) aligns with the Fraud and Corruption Control Standard AS8001-2008 and identifies ten attributes of an effective fraud control system. This audit used the Kit to assess how councils manage the risk of fraud, and identifies areas where fraud control can be improved.

Fraud can disrupt the delivery and quality of services and threaten the financial stability of councils.

Recent reviews of local government in Queensland and Victoria identified that councils are at risk of fraud because they purchase large quantities of goods and services under devolved decision-making arrangements. The Queensland Audit Office, in its 2014–15 report 'Fraud Management in Local Government', found that ‘Councils are exposed to high-risks of fraud and corruption because of the high volume of goods and services they procure, often from local suppliers; and because of the high degree of decision making vested in councils'. It also highlighted common problems faced by councils, including the absence of fraud control plans and failure to conduct regular reviews of internal controls. Similarly, in 2008 and 2012 the Victorian Auditor-General identified the importance of up-to-date fraud control planning, clearly documented policies, training staff to identify fraud risks, and controls such as third-party management.

Investigations into councils by the NSW Independent Commission Against Corruption (ICAC), such as the recent Operation Ricco, show the impact that fraud can have on councils. These impacts include significant financial loss, and negative public perceptions about how well councils manage fraud. The findings of these investigations also show the importance of good fraud controls for councils.

Operation Ricco

In its report on Operation Ricco, the ICAC found that the Chief Financial Officer (CFO) of the City of Botany Bay Council and others dishonestly exercised official functions to obtain financial benefits for themselves and others by causing fraudulent payments from the Council for their benefit. It also identified the CFO received inducements for favourable treatment of contractors.

The report noted that there were overwhelming failures in the council’s procedures and governance framework that created significant opportunities for corruption, of which the CFO and others took advantage.

It found weaknesses across a wide variety of governance processes and functions, including those involving the general manager, the internal audit function, external audit, and the operation of the audit committee.

Source: Published reports of ICAC investigations, July 2017.

The strength of fraud control systems varies significantly across New South Wales local councils, and many councils we surveyed need to improve significantly. 

Most surveyed councils do not have fraud control plans that direct resources to mitigating the specific fraud risks they face. Few councils reported that they conduct regular risk assessments or health checks to ensure they respond effectively to the risks they identify. 

There are sector-wide weaknesses that affect the strength of councils' fraud control practices. Less than one-third of councils that responded to the survey:

  • communicate their expectations about ethical conduct and responsibility for fraud control to staff 
  • regularly train staff to identify and respond to suspected fraud
  • inform staff or the wider community how to report suspected fraud and how reports made will be investigated.

The audit also identified a pattern of councils developing policies, procedures or systems without ensuring people understand them, or assessing that they work. This reduces the likelihood that staff will actually use them. 

In general, metropolitan and regional councils surveyed have stronger fraud control systems than rural councils. 

Newly amalgamated councils are operating with systems inherited from two or more pre-amalgamated councils. These councils are developing new systems for their changed circumstances.

Five councils surveyed reported that they did not comply with the Public Interest Disclosure Act 1994.

Observations for the sector:
Councils should improve their fraud controls by:

  • tailoring fraud control plans to their circumstances and specific risks
  • systematically and regularly reviewing their fraud risks and fraud control systems to keep their plans up to date
  • effectively communicating fraud risks, and how staff and the community can report suspected fraud 
  • ensuring that they comply with the Public Interest Disclosure Act 1994.

Recommendation:
That the Office of Local Government: 

  • work with councils to ensure they comply with the Public Interest Disclosure Act 1994.
     
Despite several New South Wales state entities collecting data on suspected fraud, the cost, extent, and nature of fraud in local councils is not clear.

There are weaknesses in data collection and categorisation. Several state entities receive complaints about councils. These entities often do not separate complaints about fraud from other complaint data, do not separate local council data from other public-sector data, and do not separate complaints about council decisions or councillors from complaints about council staff conduct. A single incident of suspected fraud can also be reported multiple times.

Collaboration between state entities and councils to address these weaknesses in data collection could give the public and councils a clearer picture of the incidence of suspected fraud. Better information may also help councils decide where to focus fraud control efforts and apply resources more effectively.

Including measures of fraud control strength and maturity in the Office of Local Government (OLG) performance framework may also improve practice in councils. Further, OLG may want to consider how a revised Model Code could better drive fraud control practice in councils.

Recommendations
That the Office of Local Government:

  •  work with state entities and councils to develop a common approach to how fraud complaints and incidents are defined and categorised so that they can:
    • better use data to provide a clearer picture of the level of fraud within councils
    • measure the effectiveness of, and drive improvement in, councils' fraud control systems

Published

Actions for Shared services in local government

Shared services in local government

Local Government
Internal controls and governance
Management and administration
Shared services and collaboration

Local councils need to properly assess the performance of their current services before considering whether to enter into arrangements with other councils to jointly manage back-office functions or services for their communities. This is one of the recommended practices for councils in a report released today by the Auditor-General for New South Wales, Margaret Crawford. ‘When councils have decided to jointly provide services, they do not always have a strong business case, which clearly identifies the expected costs, benefits and risks of shared service arrangements’, said the Auditor-General.

Councils provide a range of services to meet the needs of their communities. It is important that they consider the most effective and efficient way to deliver them. Many councils work together to share knowledge, resources and services. When done well, councils can save money and improve access to services. This audit assessed how efficiently and effectively councils engage in shared service arrangements. We define ‘shared services’ as two or more councils jointly managing activities to deliver services to communities or perform back-office functions. 

The information we gathered for this audit included a survey of all general-purpose councils in NSW. Of the 128 councils invited to participate, 67 (52 per cent) responded. Appendix two outlines some of the survey results in more detail.

Conclusion

Most councils we surveyed are not efficiently and effectively engaging in shared services. This is due to three main factors.

First, not all surveyed councils assess the performance of their current services before deciding on the best service delivery model. Where they have decided that sharing services is the best way to deliver services, they do not always build a business case which outlines the costs, benefits and risks of the proposed shared service arrangement before entering into it.

Second, some governance models used by councils to share services affect the scope, management and effectiveness of their shared service operations. Not all models are subject to the same checks and balances applied to councils, putting transparency and accountability at risk. Councils must comply with legislative obligations under the Local Government Act 1993 (NSW), including principles for their day-to-day operations. When two or more councils decide to share services, they should choose the most suitable governance model in line with these obligations.

Third, some councils we surveyed and spoke to lack the capability required to establish and manage shared service arrangements. Identifying whether sharing is the best way to deliver council services involves analysing how services are currently being delivered and building a business case. Councils also need to negotiate with partner councils and determine which governance model is fit for purpose. Planning to establish a shared service arrangement involves strong project management, and evaluating the arrangements identifies whether they are delivering the expected outcomes. All of these tasks need a specialised skill set that councils do not always have in-house. Resources are available to support councils and to build their capability, but not all councils are seeking these out or considering their capability needs before proceeding.

Some councils are not clearly defining the expected costs and benefits of shared service arrangements. As a result, the benefits from these arrangements cannot be effectively evaluated.

Some councils are entering into shared service arrangements without formally assessing their costs and benefits or investigating alternative service delivery models. Some councils are also not evaluating shared services against baseline data or initial expectations. Councils should base their arrangements on a clear analysis of the costs, benefits and risks involved, and evaluate performance against clearly defined outcomes.

The decision to share a service involves an assessment of financial and non-financial costs and benefits. Non-financial benefits include being able to deliver additional services, improve service quality, and deliver regional services across councils or levels of government.

When councils need support to assess and evaluate shared service arrangements, guidance is available through organisations or by peer learning with other councils.

The governance models councils use for shared services can affect their scope and effectiveness. Some councils need to improve their project management practices to better manage issues, risks and reporting.

Shared services can operate under several possible governance models. Each governance model has different legal or administrative obligations, risks and benefits. Some arrangements can affect the scope and effectiveness of shared services. For example, some models do not allow councils to jointly manage services, requiring one council to take on all risks and responsibilities. In addition, some models may reduce transparency and accountability to councils and their communities.

Regardless of these obligations and risks, councils can still improve how they manage their shared service operations by focusing on project management and better oversight. They would benefit from more guidance on shared service governance models to help them ensure these models are fit for purpose.
Recommendation
The Office of Local Government should, by April 2019:

Develop guidance which outlines the risks and opportunities of governance models that councils can use to share services. This should include advice on legal requirements, transparency in decisions, and accountability for effective use of public resources.

Published

Actions for Grants to non-government schools

Grants to non-government schools

Education
Compliance
Internal controls and governance
Management and administration

The NSW Department of Education could strengthen its management of the $1.2 billion provided to non-government schools annually. This would provide greater accountability for the use of public funds, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

Non‑government schools educate 418,000 school children each year, representing 35 per cent of all students in NSW. The NSW Department of Education administers several grant schemes to support these schools, with the aim of improving student learning outcomes and supporting parent choice. To be eligible for NSW Government funding, non‑government schools must be registered with the NSW Education Standards Authority (NESA) and not operate 'for profit' as per section 83C of the NSW Education Act 1990 (the Act). Non‑government schools can either be registered as independent or part of a System Authority.

In 2017–18, non‑government schools in NSW will receive over $1.2 billion from the NSW Government, as well as $3.4 billion from the Australian Government. Recently, the Australian Government has changed the way it funds schools. The NSW Government is assessing how these changes will impact State funding for non‑government schools.

This audit assessed how effectively and efficiently NSW Government grants to non‑government schools are allocated and managed. This audit did not assess the use of NSW Government grants by individual non‑government schools or System Authorities because the Auditor‑General of New South Wales does not have the mandate to assess how government funds are spent by non‑government entities.

Conclusion

The Department of Education effectively and efficiently allocates grants to non‑government schools. Clarifying the objectives of grants, monitoring progress towards these objectives, and improving oversight, would strengthen accountability for the use of public funds by non‑government schools.

We tested a sample of grants provided to non‑government schools under all major schemes, and found that the Department of Education consistently allocates and distributes grants in line with its methodology. The Department has clear processes and procedures to efficiently collect data from schools, calculate the level of funding each school or System should receive, obtain appropriate approvals, and make payments.

We identified three areas where the Department could strengthen its management of grants to provide greater accountability for the use of public funds. First, the Department’s objectives for providing grants to non‑government schools are covered by legislation, intergovernmental agreements and grant guidelines. The Department could consolidate these objectives to allow for more consistent monitoring. Second, the Department relies on schools or System Authorities to engage a registered auditor to certify the accuracy of information on their enrolments and usage of grants. Greater scrutiny of the registration and independence of the auditors would increase confidence in the accuracy of this information. Third, the Department does not monitor how System Authorities reallocate grant funding to their member schools. Further oversight in this area would increase accountability for the use of public funds.

The Department effectively and efficiently allocates grants to non‑government schools. Strengthening its processes would provide greater assurance that the information it collects is accurate.

The Department provides clear guidelines to assist schools to provide the necessary census information to calculate per capita grants. Schools must get an independent external auditor, registered with ASIC, to certify their enrolment figures. The Department checks a sample of the auditors to ensure that they are registered with ASIC. Some other jurisdictions perform additional procedures to increase confidence in the accuracy of the census (for example, independently checking a sample of schools’ census data).

The Department accurately calculates and distributes per capita grants in accordance with its methodology. The previous methodology, used prior to 2018, was not updated frequently enough to reflect changes in schools' circumstances. Over 2014 to 2017, the Department provided additional grants to non‑government schools under the National Education Reform Agreement (NERA), to bring funding more closely in line with the Australian Department of Education and Training's Schooling Resource Standard (SRS). From 2018, the Department has changed the way it calculates per capita grants to more closely align with the Australian Department of Education and Training's approach.

The Department determines eligibility for grants by checking a school's registration status with NESA. However, NESA's approach to monitoring compliance with the registration requirements prioritises student learning and wellbeing requirements over the requirement for policies and procedures for proper governance. Given their importance to the appropriate use of government funding, NESA could increase its monitoring of policies and procedures for proper governance through its program of random inspections. Further, the Department and NESA should enter into a formal agreement to share information to more accurately determine the level of risk of non‑compliance at each school. This may help both agencies more effectively target their monitoring to higher‑risk schools.

By December 2018, the NSW Department of Education should:

  1. Strengthen its processes to provide greater assurance that the enrolment and expenditure information it collects from non‑government schools is accurate. This should build on the work the Australian Government already does in this area.
  2. Establish formal information‑sharing arrangements with the NSW Education Standards Authority to more effectively monitor schools' eligibility to receive funding.
     

By December 2018, the NSW Education Standards Authority should:

  1. Extend its inspection practices to increase coverage of the registration requirement for policies and procedures for the proper governance of schools.
  2. Establish formal information‑sharing arrangements with the NSW Department of Education to more effectively monitor schools' continued compliance with the registration requirements.

The Department’s current approach to managing grants to non‑government schools could be improved to provide greater confidence that funds are being spent in line with the objectives of the grant schemes.

The NSW Government provides funding to non‑government schools to improve student learning outcomes and to support parents' schooling choices, but does not monitor whether these grants are achieving these aims. In addition, each grant program has specific objectives. The main objectives of the per capita grant program are to increase the rate of students completing Year 12 (or equivalent) and to improve education outcomes for students. While non‑government schools publicly report on some educational measures via the MySchool website, these measures do not address all the objectives. Strengthened monitoring and reporting of progress towards objectives, at a school level, would increase accountability for public funding. This may require the Department to formalise its access to student-level information.

The Department has listed five broad categories of acceptable use for per capita grants, but provides no further guidance on what expenditure fits into these categories. Clarifying the appropriate use of grants would increase confidence that funding is being used as intended. Schools must engage an independent auditor, registered with ASIC, to certify that the funding has been spent. The Department could strengthen this approach by improving its processes for checking the registration of the auditor and verifying their independence.

The Department has limited oversight of funding provided to System Authorities (Systems). The Department provides grants to Systems for all their member schools. The Systems can distribute the grants to their schools according to their own methodology. Systems are not required to report to the Department how much of their grant was retained for administrative or centralised expenses. Greater oversight of how Systems distribute these grants would provide more transparency for the use of public funds by Systems.

By December 2018, the NSW Department of Education should:

  1. Establish and communicate funding conditions that require funded schools to:
    • adhere to conditions of funding, such as the acceptable use of grants, and accounting requirements to demonstrate compliance
    • report their progress towards the objectives of the scheme or wider Government initiatives
    • allow the Department to conduct investigations to verify enrolment and expenditure of funds
    • provide the Department with access to existing student level data to inform policy development and analysis.
  2. Increase its oversight of System Authorities by requiring them to:
    • re‑allocate funds across their system on a needs basis, and report to the Department on this
    • provide a yearly submission with enough detail to demonstrate that each System school has spent its State funding in line with the Department's requirements.

Published

Actions for Council reporting on service delivery

Council reporting on service delivery

Local Government
Compliance
Internal controls and governance
Management and administration
Service delivery

New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also did not report on the cost of services, making it difficult for communities to see how efficiently they are being delivered. And councils are not consistently publishing targets to demonstrate what they are striving for.

I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.

My new mandate supports the Parliament’s objectives to:

  • strengthen governance and financial oversight in the local government sector
  • improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.

Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.

For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.

In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.

Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.

Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in decisions about how and to whom they provide services, so it is important that local communities have access to information about how well services are being delivered and how well they meet community needs. Ultimately, councils should aim to ensure that performance reporting is subject to quality controls designed to provide independent assurance.

Conclusion

While councils report on outputs, reporting on outcomes and on performance over time can be improved. Improved reporting would include objectives with targets that better demonstrate performance over time. This would help communities understand what services are being delivered, how efficiently and effectively they are being delivered, and what improvements are being made.

To ensure greater transparency on service effectiveness and efficiency, the Office of Local Government (OLG) should work with councils to develop guidance principles to improve reporting on service delivery to local communities. This audit identified an interest amongst councils in improving their reporting and broad agreement with the good practice principles developed as part of the audit.

The Integrated Planning and Reporting Framework (the Framework), which councils are required to use to report on service delivery, is intended to promote better practice. However, the Framework is silent on efficiency reporting and provides limited guidance on how long-term strategic documents link with annual reports produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

OLG should also work with state agencies to reduce the overall reporting burden on councils by consolidating state agency reporting requirements.

Councils report extensively on what they have done, but minimally on the outcomes of that effort, on efficiency, and on performance over time.

Councils could improve reporting on service delivery by more clearly relating the resources used to the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.

Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.

The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.

Recommendation

By mid-2018, OLG should:

  • assist rural councils to develop their reporting capability.

The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery, and provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

Recommendation

By mid-2018, OLG should:

  • issue additional guidance on good practice in council reporting, with specific information on:
    • reporting on performance against targets
    • reporting on performance against outcomes
    • assessing and reporting on efficiency and cost effectiveness
    • reporting performance over time
    • clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
    • defining reporting terms to encourage consistency.

The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports

The guidelines produced by OLG in 2013 to assist councils to implement their Framework requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This was identified as a weakness in both the 2012 performance audit report and the Local Government Reform Panel review of council reporting on service delivery.

The Framework and supporting documents provide limited guidance on reporting

Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:

  • good practice in reporting to the community
  • how the annual report links with other plans and reports required by the Framework.

Further, the Framework includes both Annual and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the annual reports produced during a council’s term. The relationship between Annual Reports and End of Term Reports is not clear.

OLG is reviewing the Framework and guidance

OLG commenced a review of the Framework in 2013, but this was deferred, with work re‑starting in 2017. The revised guidelines and manual were expected to be released in late 2017.

OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports

The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.

Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.

Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.

Recommendations

By mid-2018, OLG should:

  • commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
  • progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.

Streamlining the reporting burden would help councils improve reporting

The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 draft IPART ‘Review of reporting and compliance burdens on Local Government’ noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.

Enabling rural councils to perform tailored surveys of their communities may be more beneficial than a state-wide survey in defining outcome indicators

Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results from these are used by councils to set service delivery targets and report on outcomes. This helps to drive service delivery in line with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils which generally have the resources needed to run them.

OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn on a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state-level survey is not detailed or specific enough to be used as a tool for setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in these areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. Such surveys could be run cooperatively by groups of similar councils or by regional bodies.

Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities

The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.

OLG commenced work in 2012 to build a new performance measurement framework for councils, aiming to move away from compliance reporting. This work was strongly influenced by the Victorian approach, which requires councils to report on a set of 79 indicators published on the Victorian 'Know your council' website. OLG’s work stalled at the time, and several other local government representative bodies have since commenced work to establish performance measurement frameworks of their own. OLG advised us that it has recently recommenced work on this project.

Our consultation identified a desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishing a set of indicators that councils can use to measure performance and drive improvement in service delivery.

Several councils we spoke with were concerned that current approaches to comparative reporting do not adequately acknowledge that councils need to tailor the type, level and mix of their services to the needs of their communities. Comparative reporting approaches tend to focus on output measures, such as the number of applications processed, annual library loans and opening hours for sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where a council has made a decision based on community consultation, local priorities and available resources. To mitigate this, it is important to:

  • adopt a partnership approach to the development of indicators
  • ensure indicators measure performance, not just level of activity
  • compare performance between councils that are similar in terms of size and location.

It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.

Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:

  • consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
  • adopting a partnership approach to development of common indicators with groups of like councils.

Published

CBD and South East Light Rail Project

Transport
Compliance
Financial reporting
Infrastructure
Internal controls and governance
Management and administration
Procurement
Project management
Risk

Transport for NSW did not effectively plan and procure the CBD and South East Light Rail (CSELR) project to achieve best value for money according to a report released today by NSW Auditor-General, Margaret Crawford.

Transport for NSW is on track to deliver the project, but it will come at a higher cost and with lower benefits than estimated in the approved business case.

 

Parliamentary reference - Report number #278 - released 30 November 2016

Published

Implementation of the NSW Government’s program evaluation initiative

Industry
Justice
Planning
Premier and Cabinet
Treasury
Environment
Financial reporting
Internal controls and governance
Management and administration
Risk
Service delivery
Shared services and collaboration
Workforce and capability

The NSW Government’s ‘program evaluation initiative’, introduced to assess whether service delivery programs achieve expected outcomes and value for money, is largely ineffective according to a report released today by NSW Auditor-General, Margaret Crawford.

Government services, in areas such as public order and safety, health and education, are delivered by agencies through a variety of programs. In 2016–17, the NSW Government estimates that it will spend over $73 billion on programs to deliver services.

 

Parliamentary reference - Report number #277 - released 3 November 2016

Published

Monitoring food safety practices in retail food businesses

Health
Local Government
Compliance
Internal controls and governance
Management and administration
Risk
Shared services and collaboration

New South Wales has a lower rate of foodborne illness than the national average. This reflects some good practices in the NSW Food Authority’s approach to monitoring food safety standards. It also reflects the long-standing commitment of local councils to ensuring retail food businesses meet these standards.

To ensure rates of foodborne illness remain low, the Authority needs to better monitor its arrangements with the councils that inspect retail food businesses on its behalf, and obtain more complete and timely information from councils on compliance with food safety standards.

 

Parliamentary reference - Report number #274 - released 15 September 2016