Ensuring teaching quality in NSW public schools

Education
Management and administration
Regulation
Service delivery
Workforce and capability

The Auditor-General for New South Wales, Margaret Crawford, has released a report on how the NSW Education Standards Authority (NESA) and the Department of Education (the Department) ensure teaching quality in NSW public schools.

Around 2,200 NSW public school principals are responsible for accrediting their teachers in line with the Australian Professional Standards for Teachers. The report found that NESA does not oversee principals’ decisions to ensure that minimum standards for teaching quality are consistently met.

The Department does not effectively monitor teaching quality across the state. With limited data, it is difficult for the Department to ensure its strategies to improve teaching quality are appropriately targeted.

The Department’s Performance and Development Framework does not adequately support principals and supervisors to effectively manage teacher performance or actively improve teaching quality. The Department manages teachers formally identified as underperforming through teacher improvement programs. Only 53 of over 66,000 teachers employed by the Department were involved in these programs in 2018.

The report makes three recommendations to NESA to improve accreditation processes, and four recommendations to the Department to improve its systems and processes for ensuring teaching quality across the state.

Australian research has shown that quality teaching is the greatest in-school influence on student engagement and outcomes, accounting for 30 per cent of the variance in student performance. An international comparative study of 15-year-old students showed that the performance of New South Wales students in reading, mathematics and science declined between 2006 and 2015.

The Australian Professional Standards for Teachers (the Standards) describe the knowledge, skills and understanding expected of effective teachers at different career stages. Teachers must be accredited against the Standards to be employed in NSW schools. The NSW Education Standards Authority (NESA) is responsible for ensuring all teachers in NSW schools are accredited. As part of the accreditation process, the NSW Department of Education (the Department) assesses whether public school teachers meet proficient accreditation standards and advises NESA of its decisions.

The School Excellence Framework provides a method for the Department to monitor teaching quality at a school level across four elements of effective teaching practice. The Performance and Development Framework provides a method for teachers and their supervisors to monitor and improve teaching quality through setting professional goals to guide their performance and development.

The Department has a strategic goal that every student, every teacher, every leader and every school improves every year. In line with this goal, the Department has a range of strategies targeted to improving teaching quality at different career stages. These include additional resources to support new teachers, a program to support teachers to gain higher-level accreditation, support for principals to manage underperforming teachers, and a professional learning program where teachers observe and discuss each other's practice.

The objective of this audit was to assess the effectiveness of the NSW Department of Education's and the NSW Education Standards Authority's arrangements to ensure teaching quality in NSW public schools. To address this objective, the audit examined whether:

  • agencies effectively monitor the quality of teaching in NSW public schools
  • strategies to improve the quality of teaching are planned, communicated, implemented and monitored well.
The NSW Education Standards Authority does not oversee principals’ decisions to accredit teachers as proficient. This means it is not ensuring minimum standards for teaching quality are consistently met.
NESA does not have a process to ensure principals’ decisions to accredit teachers are in line with the Standards. The decision to accredit teachers is one of the main ways to ensure teaching quality. In New South Wales public schools, around 2,200 principals are tasked with making decisions to accredit their teachers as proficient. NESA provides training and guidelines for principals to encourage consistent accreditation decisions, but regular turnover of principals makes it difficult to ensure that all principals are adequately supported. NESA has more oversight of provisional and conditional accreditation for beginning teachers, as well as higher-level accreditation for highly effective teachers. That said, only a limited number of teachers across the state hold higher-level accreditation.
The Department of Education does not effectively monitor teaching quality at a system level. This makes it difficult to ensure strategies to improve teaching quality are appropriately targeted.
The Department is not collecting sufficient information to monitor teaching quality across the state. No information on teacher assessment against the Performance and Development Framework is collected centrally. Schools self-assess their performance against the School Excellence Framework, but this does not assess teaching quality for all teachers. The Department also surveys students about their experiences of teaching quality, but schools opt in to this survey, with 65 per cent of public schools participating in 2018. These factors limit the ability of the Department to target efforts to areas of concern.
We examined five key strategies that support the critical parts of a teacher’s career. Most strategies were based on research and consultation, and were planned, trialled, reviewed and adjusted before wider rollout. Guidance and training are provided to communicate requirements and help schools implement strategies at a local level. Monitoring of strategies implemented at a local level is variable. We identified several instances where Quality Teaching, Successful Students funding was used outside guidelines. Two strategies have not yet been evaluated, which prevents the Department from determining whether they are having the desired impact.
The Performance and Development Framework is not structured in a way that supports principals and supervisors to actively improve teacher performance and teaching quality.
Under this framework, there is limited opportunity for supervisors to set goals, conduct observations of teaching practice, or provide constructive written feedback on a teacher’s progress towards achieving their goals. Guidance on how to use the Standards to construct quality goals, observe teaching practice and provide valuable feedback is also insufficient. The framework focuses on teachers’ self-identified development goals, but there is no requirement to align these with the Standards. These limitations reduce the ability of supervisors to use the framework to effectively manage teacher performance and improve teaching quality.
The Department manages those teachers formally identified as underperforming through teacher improvement programs. Only 53 of over 66,000 teachers employed by the Department were involved in these programs in 2018. By comparison, a report on inspections conducted in the United Kingdom assessed the quality of teaching as ‘inadequate’ in three per cent of schools.

Appendix one – Response from agencies

Appendix two – About the audit

Appendix three – Performance auditing

© Copyright reserved by the Audit Office of New South Wales. All rights reserved. No part of this publication may be reproduced without prior consent of the Audit Office of New South Wales. The Audit Office does not accept responsibility for loss or damage suffered by any person acting on or refraining from action as a result of any of this material.

Parliamentary Reference: Report number #327 - released 26 September 2019


Ensuring contract management capability in government - Department of Education

Education
Compliance
Internal controls and governance
Management and administration
Procurement
Workforce and capability

This report examines whether the Department of Education has the required contract management capability to effectively manage high-value goods and services contracts (over $250,000). In 2017–18, the department managed high-value goods and services contracts worth $3.08 billion, with most of the contracts running over multiple years.

NSW government agencies are increasingly delivering services and projects through contracts with third parties. These contracts can be complex and governments face challenges in negotiating and implementing them effectively.

Contract management capability is a broad term, which can include aspects of individual staff capability as well as organisational capability (such as policies, frameworks and processes).

In 2017–18, the Department of Education (the Department) managed high-value (over $250,000) goods and services contracts worth $3.08 billion, with most of the contracts running over multiple years. The Department delivers, funds and regulates education services for NSW students from early childhood to secondary school.

This audit examined whether the Department has the required capability to effectively manage high-value goods and services contracts.

We did not examine infrastructure, construction or information communication and technology contracts. We assessed the Department against the following criteria:

  1. The Department’s policies and procedures support effective contract management and are consistent with relevant frameworks, policies and guidelines.
  2. The Department has capable personnel to effectively conduct the monitoring activities throughout the life of the contract.

The NSW Public Service Commission and the Department of Finance, Services and Innovation are included as auditees as they administer policies which directly affect contract management capability, including:

  • NSW Procurement Board Directions and policies
  • NSW Procurement Agency Accreditation Scheme
  • NSW Public Sector Capability Framework.

The Department of Finance, Services and Innovation's responsibility for NSW Procurement will transfer to NSW Treasury on 1 July 2019 as part of changes to government administrative arrangements announced on 2 April 2019 and amended on 1 May 2019.

Conclusion

The Department of Education's policies and procedures for goods and services contract management are consistent with relevant guidance. It also has a systematic approach to defining the capability required for contract management roles. That said, there are gaps in how well the Department uses this capability to ensure its contracts are performing. We also found one program (comprising 645 contracts) that was not compliant with the Department's policies.

The Department has up-to-date policies and procedures that are consistent with relevant guidance. The Department also communicates changes to procurement related policies, monitors compliance with policies and conducts regular reviews aiming to identify non-compliance.

The Department uses the NSW Public Service Commission's capability framework to support its workforce management and development. The capability framework includes general contract management capability for all staff and occupation specific capabilities for contract managers. The Department also provides learning and development for staff who manage contracts to improve their capability.

The Department provides some guidance on different ways that contract managers can validate performance information provided by suppliers. However, the Department does not provide guidance to assist contract managers to choose the best validation strategy according to contract risk. This could lead to inconsistent practice and contracts not delivering as intended.

We found that none of the 645 contracts associated with the Assisted Schools Travel Program (estimated value of $182 million in 2018–19) have contract management plans. This is contrary to the Department's policies and increases the risk that contract managers are not effectively reviewing performance and resolving disputes.

Appendix one - Response from agencies

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary Reference: Report number #325 - released 28 June 2019


Contracting non-government organisations

Community Services
Compliance
Fraud
Management and administration
Procurement
Regulation
Service delivery

This report found the Department of Family and Community Services (FACS) needs to do more to demonstrate it is effectively and efficiently contracting NGOs to deliver community services in the Permanency Support Program (a component of out-of-home care services) and Specialist Homelessness Services. It notes that FACS is moving to an outcomes-based commissioning model and recommends this move be accelerated, consistent with government policy.

Government agencies, such as the Department of Family and Community Services (FACS), are increasingly contracting non-government organisations (NGOs) to deliver human services in New South Wales. In doing so, agencies are responsible for ensuring these services are achieving expected outcomes. Since the introduction of the Commissioning and Contestability Policy in 2016, all NSW Government agencies are expected to include plans for customer and community outcomes and look for ways to use contestability to raise standards.

Two of the areas receiving the greatest funding from FACS are the Permanency Support Program and Specialist Homelessness Services. In the financial year 2017–18, nearly 500 organisations received $784 million for out-of-home care programs, including the Permanency Support Program. Across New South Wales, specialist homelessness providers assist more than 54,000 people each year and in the financial year 2017–18, 145 organisations received $243 million for providing short term accommodation and homelessness support, including Specialist Homelessness Services.

In the financial year 2017–18, FACS entered into 230 contracts for out-of-home care, of which 49 were for the Permanency Support Program, representing $322 million. FACS also entered into 157 contracts for the provision of Specialist Homelessness Services which totalled $170 million. We reviewed the Permanency Support Program and Specialist Homelessness Services for this audit.

This audit assessed how effectively and efficiently FACS contracts NGOs to deliver community services. The audit could not assess how NGOs used the funds they received from FACS, as the Audit Office does not have a mandate to provide direct assurance that NGOs are using government funds effectively.

Conclusion
FACS cannot demonstrate it is effectively and efficiently contracting NGOs to deliver community services because it does not always use open tenders to test the market when contracting NGOs, and does not collect adequate performance data to ensure safe and quality services are being provided. While there are some valid reasons for using restricted tenders, their use excludes new service providers from consideration, limiting contestability. In the service delivery areas we assessed, FACS does not measure client outcomes as it has not yet moved to outcomes-based contracts.
FACS' procurement approach sometimes restricts the selection of NGOs for the Permanency Support Program and Specialist Homelessness Services
FACS has a procurement policy and plan which it follows when contracting NGOs for the provision of human services. This includes the option to use restricted tenders, which FACS sometimes uses rather than opening the process to the market. The use of restricted tenders is consistent with its procurement plan where there is a limited number of possible providers and the services are highly specialised. However, this approach perpetuates existing arrangements and makes it very difficult for new service providers to enter the market. The recontracting of existing providers means FACS may miss the opportunity to benchmark existing providers against the whole market. 
FACS does not effectively use client data to monitor the performance of NGOs funded under the Permanency Support Program and Specialist Homelessness Services
FACS' contract management staff monitor individual NGO performance, including safety, quality of services and compliance with contract requirements. Although FACS provides training materials on its intranet, it does not give these staff sufficient training, support or guidance to monitor NGO performance efficiently or effectively. FACS also requires NGOs to self-report their financial performance and contract compliance annually. FACS verifies the accuracy of the financial data but conducts only limited validation of the client data reported by NGOs. Instead, it relies on contract management staff to identify errors or inaccurate reporting by NGOs.
FACS' ongoing monitoring of the performance of providers under the Permanency Support Program is particularly limited due to problems with timely data collection at the program level. Without access to ongoing performance data, FACS cannot adequately monitor and analyse NGO performance or service quality at the program level.
In the Specialist Homelessness Services program, FACS and NGOs both supply the data required for the National Minimum Data Set on homelessness to the Australian Institute of Health and Welfare, as they are required to do. However, this data is not used for NGO performance monitoring or management.
FACS does not yet track outcomes for clients of NGOs
FACS began to develop an approach to outcomes-based contracting in 2015. Despite this, none of the contracts we reviewed are using outcomes as a measure of success. Currently, NGOs are required to demonstrate their performance is consistent with the measures stipulated in their contracts as part of an annual check of their contract compliance and financial accounts. NGOs report against activity-based measures (Key Performance Indicators) and not outcomes.
FACS advises that the transition to outcomes-based contracting will be made with the new rounds of funding which will take place in 2020–2021 for Specialist Homelessness Services and 2023 for the Permanency Support Program. Once these contracts are in place, FACS can transition NGOs to outcomes based reporting.
Incomplete data limits FACS' effectiveness in continuous improvement for the Permanency Support Program and Specialist Homelessness Services
FACS has policies and procedures in place to learn from past experiences and use this to inform future contracting decisions. However, FACS has limited client data for the Permanency Support Program, which restricts the amount of continuous improvement it can undertake. In the Specialist Homelessness Services program, data is collected to inform routine contract management discussions with service providers, but FACS is not using this data for continuous improvement.

Appendix one – Response from agency

Appendix two – About the audit

Appendix three – Performance auditing

 

Parliamentary Reference: Report number #323 - released 26 June 2019


Newcastle Urban Transformation and Transport Program

Transport
Planning
Compliance
Infrastructure
Management and administration
Procurement
Project management

The urban renewal projects on former railway land in the Newcastle city centre are well targeted to support the objectives of the Newcastle Urban Transformation and Transport Program (the Program), according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the government. However, the evidence that the cost of the light rail will be justified by its contribution to the Program is not convincing.

The Newcastle Urban Transformation and Transport Program (the Program) is an urban renewal and transport program in the Newcastle city centre. The Hunter and Central Coast Development Corporation (HCCDC) has led the Program since 2017. UrbanGrowth NSW led the Program from 2014 until 2017. Transport for NSW has been responsible for delivering the transport parts of the Program since the Program commenced. All references to HCCDC in this report relate to both HCCDC and its predecessor, the Hunter Development Corporation. All references to UrbanGrowth NSW in this report relate only to its Newcastle office from 2014 to 2017.

This audit had two objectives:

  1. To assess the economy of the approach chosen to achieve the objectives of the Program.
  2. To assess the effectiveness of the consultation and oversight of the Program.

We addressed the audit objectives by answering the following questions:

a) Was the decision to build light rail an economical option for achieving Program objectives?
b) Has the best value been obtained for the use of the former railway land?
c) Was good practice used in consultation on key Program decisions?
d) Did governance arrangements support delivery of the program?

Conclusion
1. The urban renewal projects on the former railway land are well targeted to support the objectives of the Program. However, there is insufficient evidence that the cost of the light rail will be justified by its contribution to Program objectives.

The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the Government. HCCDC, and previously UrbanGrowth NSW, identified and considered options for land use that would best meet Program objectives. Required probity processes were followed for developments that involved financial transactions. Our audit did not assess the achievement of these objectives because none of the projects have been completed yet.

Analysis presented in the Program business case and other planning documents showed that the light rail would have small transport benefits and was expected to make a modest contribution to broader Program objectives. Analysis in the Program business case argued that despite this, the light rail was justified because it would attract investment and promote economic development around the route. The Program business case referred to several international examples to support this argument, but did not make a convincing case that these examples were comparable to the proposed light rail in Newcastle.

The audited agencies argue that the contribution of light rail cannot be assessed separately because it is a part of a broader Program. The cost of the light rail makes up around 53 per cent of the total Program funding. Given the cost of the light rail, agencies need to be able to demonstrate that this investment provides value for money by making a measurable contribution to the Program objectives.

2. Consultation and oversight were mostly effective during the implementation stages of the Program. There were weaknesses in both areas in the planning stages.

Consultations about the urban renewal activities from around 2015 onward followed good practice standards. These consultations were based on an internationally accepted framework and met their stated objectives. Community consultations on the decision to close the train line were held in 2006 and 2009. However, the final decision in 2012 was made without a specific community consultation. There was no community consultation on the decision to build a light rail.

The governance arrangements that were in place during the planning stages of the Program did not provide effective oversight. This meant there was not a single agreed set of Program objectives until 2016 and roles and responsibilities for the Program were not clear. Leadership and oversight improved during the implementation phase of the Program. Roles and responsibilities were clarified and a multi-agency steering committee was established to resolve issues that needed multi-agency coordination.
The light rail is not justified by conventional cost-benefit analysis and there is insufficient evidence that the indirect contribution of light rail to achieving the economic development objectives of the Program will justify the cost.
Analysis presented in Program business cases and other planning documents showed that the light rail would have small transport benefits and was expected to make a modest contribution to broader Program objectives. Analysis in the Program business case argued that despite this, the light rail was justified because it would attract investment and promote economic development around the route. The Program business case referred to several international examples to support this argument, but did not make a convincing case that these examples were comparable to the proposed light rail in Newcastle.
The business case analysis of the benefits and costs of light rail was prepared after the decision to build light rail had been made and announced. Our previous reports, and recent reports by others, have emphasised the importance of completing thorough analysis before announcing infrastructure projects. Some advice provided after the initial light rail decision was announced was overly optimistic. It included benefits that cannot reasonably be attributed to light rail and underestimated the scope and cost of the project.
The audited agencies argue that the contribution of light rail cannot be assessed separately because it is part of a broader Program. The cost of the light rail makes up around 53 per cent of the total Program funding. Given the high cost of the light rail, we believe agencies need to be able to demonstrate that this investment provides value for money by making a measurable contribution to the Program objectives.

Recommendations
For future infrastructure programs, NSW Government agencies should support economical decision-making on infrastructure projects by:
  • providing balanced advice to decision makers on the benefits and risks of large infrastructure investments at all stages of the decision-making process
  • providing scope and cost estimates that are as accurate and complete as possible when initial funding decisions are being made
  • making business cases available to the public.
The planned uses of the former railway land achieve a balance between the economic and social objectives of the Program at a reasonable cost to the government.

The planned uses of the former railway land align with the objectives of encouraging people to visit and live in the city centre, creating attractive public spaces, and supporting growth in employment in the city. The transport benefits of the activities are less clear, because the light rail is the major transport project and this will not make significant improvements to transport in Newcastle.

The processes used for selling and leasing parts of the former railway land followed industry standards. Options for the former railway land were identified and assessed systematically. Competitive processes were used for most transactions and the required assessment and approval processes were followed. The sale of land to the University of Newcastle did not use a competitive process, but required processes for direct negotiations were followed.

Recommendation
By March 2019, the Hunter and Central Coast Development Corporation should:
  • work with relevant stakeholders to explore options for increasing the focus on the heritage objective of the Program in projects on the former railway land. This could include projects that recognise the cultural and industrial heritage of Newcastle.
Consultations about the urban renewal activities followed good practice standards, but consultation on transport decisions for the Program did not.

Consultations focusing on urban renewal options for the Program included a range of stakeholders and provided opportunities for input into decisions about the use of the former railway land. These consultations received mostly positive feedback from participants. Changes and additions were made to the objectives of the Program and specific projects in response to feedback received. 

There had been several decades of debate about the potential closure of the train line, including community consultations in 2006 and 2009. However, the final decision to close the train line was made and announced in 2012 without a specific community consultation. HCCDC states that consultation with industry and business representatives constitutes community consultation because industry representatives are also members of the community. This does not meet good practice standards because it is not a representative sample of the community.

There was no community consultation on the decision to build a light rail. There were subsequent opportunities for members of the community to comment on the implementation options, but the decision to build it had already been made. A community and industry consultation was held on which route the light rail should use, but the results of this were not made public. 

Recommendation
For future infrastructure programs, NSW Government agencies should consult with a wide range of stakeholders before major decisions are made and announced, and report publicly on the results and outcomes of consultations. 

The governance arrangements that were in place during the planning stages of the Program did not provide effective oversight. Project leadership and oversight improved during the implementation phase of the Program.

Multi-agency coordination and oversight were ineffective during the planning stages of the Program. Examples include: multiple versions of Program objectives being in circulation; unclear reporting lines for project management groups; and poor role definition for the initial advisory board. Program ownership was clarified in mid-2016 with the appointment of a new Program Director with clear accountability for the delivery of the Program. This was supported by the creation of a multi-agency steering committee that was more effective than previous oversight bodies.

The limitations that existed in multi-agency coordination and oversight had some negative consequences in important aspects of project management for the Program. This included whole-of-government benefits management and the coordination of work to mitigate impacts of the Program on small businesses.

Recommendations
For future infrastructure programs, NSW Government agencies should: 

  • develop and implement a benefits management approach from the beginning of a program to ensure responsibility for defining benefits and measuring their achievement is clear
  • establish whole-of-government oversight early in the program to guide major decisions. This should include:
    • agreeing on objectives and ensuring all agencies understand these
    • clearly defining roles and responsibilities for all agencies
    • establishing whole-of-government coordination for the assessment and mitigation of the impact of major construction projects on businesses and the community.

By March 2019, the Hunter and Central Coast Development Corporation should update and implement the Program Benefits Realisation Plan. This should include:

  • setting measurable targets for the desired benefits
  • clearly allocating ownership for achieving the desired benefits
  • monitoring progress toward achieving the desired benefits and reporting publicly on the results.

Appendix one - Response from agencies    

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #310 - released 12 December 2018


Managing Antisocial behaviour in public housing

Community Services
Asset valuation
Infrastructure
Regulation
Service delivery
Workforce and capability

The Department of Family and Community Services (FACS) has not adequately supported or resourced its staff to manage antisocial behaviour in public housing according to a report released today by the Deputy Auditor-General for New South Wales, Ian Goodwin. 

In recent decades, policy makers and legislators in Australian states and territories have developed and implemented initiatives to manage antisocial behaviour in public housing environments. All jurisdictions now have some form of legislation or policy to encourage public housing tenants to comply with rules and obligations of ‘good neighbourliness’. In November 2015, the NSW Parliament changed legislation to introduce a new approach to manage antisocial behaviour in public housing. This approach is commonly described as the ‘strikes’ approach. 

When introduced in the NSW Parliament, the ‘strikes’ approach was described as a means to:

  • improve the behaviour of a minority of tenants engaging in antisocial behaviour 
  • create better, safer communities for law abiding tenants, including those who are ageing and vulnerable.

FACS has a number of tasks as a landlord, including a responsibility to collect rent and organise housing maintenance. FACS also has a role to support tenants with complex needs and manage antisocial behaviour. These roles have some inherent tensions. The aims of the FACS antisocial behaviour management policy are:

to balance the responsibilities of tenants, the rights of their neighbours in social housing, private residents and the broader community with the need to support tenants to sustain their public housing tenancies.

This audit assessed the efficiency and effectiveness of the ‘strikes’ approach to managing antisocial behaviour in public housing environments.

We examined whether:

  • the approach is being implemented as intended and leading to improved safety and security in social housing environments
  • FACS and its partner agencies have the capability and capacity to implement the approach
  • there are effective mechanisms to monitor, report and progressively improve the approach.
Conclusion

FACS has not adequately supported or resourced its staff to implement the antisocial behaviour policy. FACS antisocial behaviour data is incomplete and unreliable. Accordingly, there is insufficient data to determine the nature and extent of the problem and whether the implementation of the policy is leading to improved safety and security.

FACS' management of minor and moderate incidents of antisocial behaviour is poor. FACS has not provided sufficient training to equip frontline housing staff with the relevant skills to apply the antisocial behaviour management policy. At more than half of the housing offices we visited, staff had not been trained to:

  • conduct effective interviews to determine whether an antisocial behaviour complaint can be substantiated

  • de-escalate conflict and manage complex behaviours when required

  • properly manage the safety of staff and tenants

  • establish information sharing arrangements with police

  • collect evidence that meets requirements at the NSW Civil and Administrative Tribunal

  • record and manage antisocial behaviour incidents using the information management system HOMES ASB.

When frontline housing staff are informed about serious and severe illegal antisocial behaviour incidents, they generally refer them to the FACS Legal Division. Staff in the Legal Division are trained and proficient in managing antisocial behaviour in compliance with the policy, so the more serious incidents are managed effectively using HOMES ASB.


FACS provides housing services to most remote townships via outreach visits from the Dubbo office. In remote townships, the policy is not being fully implemented due to insufficient frontline housing staff. There is very limited knowledge of the policy in these areas and FACS data shows few recorded antisocial behaviour incidents in remote regions. 


The FACS information management system (HOMES ASB) is poorly designed and has significant functional limitations that impede the ability of staff to record and manage antisocial behaviour. Staff at most of the housing offices we visited were unable to accurately record antisocial behaviour matters in HOMES ASB, making the data incorrect and unreliable.


Regional Assistance Programs

Premier and Cabinet
Planning
Transport
Compliance
Infrastructure
Management and administration
Project management

Infrastructure NSW effectively manages how grant applications for regional assistance programs are assessed and recommended for funding. Its contract management processes are also effective. However, we are unable to conclude whether the objectives of these programs have been achieved as the relevant agencies have not yet measured their benefits, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. 

In 2011, the NSW Government established Restart NSW to fund new infrastructure with the proceeds from the sale and lease of government assets. From 2011 to 2017, the NSW Government allocated $1.7 billion from the fund for infrastructure in regional areas, with an additional commitment of $1.3 billion to be allocated by 2021. The NSW Government allocates these funds through regional assistance programs such as Resources for Regions and Fixing Country Roads. NSW councils are the primary recipients of funding provided under these programs.

The NSW Government announced the Resources for Regions program in 2012 with the aim of addressing infrastructure constraints in mining affected communities. Infrastructure NSW administers the program, with support from the Department of Premier and Cabinet.

The NSW Government announced the Fixing Country Roads program in 2014 with the aim of building more efficient road freight networks. Transport for NSW and Infrastructure NSW jointly administer this program, which funds local councils to deliver projects that help connect local and regional roads to state highways and freight hubs.

This audit assessed whether these two programs (Resources for Regions and Fixing Country Roads) were being effectively managed and achieved their objectives. In making this assessment, we answered the following questions:

  • How well are the relevant agencies managing the assessment and recommendation process?
  • How do the relevant agencies ensure that funded projects are being delivered?
  • Do the funded projects meet program and project objectives?

The audit focussed on four rounds of Resources for Regions funding from 2013–14 to 2015–16, as well as the first two rounds of Fixing Country Roads funding in 2014–15 and 2015–16.

Conclusion
Infrastructure NSW effectively manages how grant applications are assessed and recommended for funding. Infrastructure NSW’s contract management processes are also effective. However, we are unable to conclude whether program objectives are being achieved as Infrastructure NSW has not yet measured program benefits.
While Infrastructure NSW and Transport for NSW managed the assessment processes effectively overall, they have not fully maintained all required documentation, such as conflict of interest registers. Keeping accurate records is important to support transparency and accountability to the public about funding allocation. The relevant agencies have taken steps to address this in the current funding rounds for both programs.
For both programs assessed, the relevant agencies have developed good strategies over time to support councils through the application process. These strategies include workshops, briefings and feedback for unsuccessful applicants. Transport for NSW and the Department of Premier and Cabinet have implemented effective tools to assist applicants in demonstrating the economic impact of their projects.
Infrastructure NSW is effective in identifying projects that are 'at‑risk' and assists in bringing them back on track. Infrastructure NSW has a risk‑based methodology to verify payment claims, which includes elements of good practice in grants administration. For example, it requires grant recipients to provide photos and engages Public Works Advisory to review progress claims and visit project sites.
Infrastructure NSW collects project completion reports for all Resources for Regions and Fixing Country Roads funded projects. Infrastructure NSW intends to assess benefits for both programs once each project in a funding round is completed. To date, no funding round has been completed. As a result, no benefits assessment has been done for any completed project funded in either program.
 

The project selection criteria are consistent with the program objectives set by the NSW Government, and the RIAP applied the criteria consistently. Probity and record keeping practices did not fully comply with the probity plans.

The assessment methodology designed by Infrastructure NSW is consistent with the program objectives and criteria. In the rounds that we reviewed, all funded projects met the assessment criteria.

Infrastructure NSW developed probity plans for both programs which provided guidance on the record keeping required to maintain an audit trail, including the use of conflict of interest registers. Infrastructure NSW and Transport for NSW did not fully comply with these requirements. The relevant agencies have taken steps to address this in the current funding rounds for both programs.

NSW Procurement Board Directions require agencies to ensure that they do not engage a probity advisor that is engaged elsewhere in the agency. Infrastructure NSW has not fully complied with this requirement. A conflict of interest arose when Infrastructure NSW engaged the same consultancy to act as its internal auditor and probity advisor.

While these infringements of probity arrangements are unlikely to have had a major impact on the assessment process, they weaken the transparency and accountability of the process.

Some councils have identified resourcing and capability issues which impact on their ability to participate in the application process. For both programs, the relevant agencies conducted briefings and webinars with applicants to provide advice on the objectives of the programs and how to improve the quality of their applications. Additionally, Transport for NSW and the Department of Premier and Cabinet have developed tools to assist councils to demonstrate the economic impact of their applications.

The relevant agencies provided feedback on unsuccessful applications to councils. Councils reported that the quality of this feedback has improved over time.

Recommendations

  1. By June 2018, Infrastructure NSW should:
    • ensure probity reports address whether all elements of the probity plan have been effectively implemented.
  2. By June 2018, Infrastructure NSW and Transport for NSW should:
    • maintain and store all documentation regarding assessment and probity matters according to the State Records Act 1998, the NSW Standard on Records Management and the relevant probity plans.

Infrastructure NSW is responsible for overseeing and monitoring projects funded under Resources for Regions and Fixing Country Roads. Infrastructure NSW effectively manages projects to keep them on track; however, it could do more to assure itself that all recipients have complied with funding deeds. Benefits and outcomes should also start to be measured and reported as soon as practicable after projects are completed, to inform assessment of future projects.

Infrastructure NSW identifies projects experiencing unreasonable delays or higher than expected expenses as 'at‑risk'. After Infrastructure NSW identifies a project as 'at‑risk', it puts processes in place to resolve issues and bring the project back on track. Infrastructure NSW, working with Public Works Advisory regional offices, employs a risk‑based approach to validate payment claims; however, this process should be strengthened. Infrastructure NSW would get better assurance by also conducting annual audits of compliance with the funding deed for a random sample of projects.

Infrastructure NSW collects project completion reports for all Resources for Regions and Fixing Country Roads funded projects. It applies the Infrastructure Investor Assurance Framework to Resources for Regions and Fixing Country Roads at a program level. This means that each round of funding (under both programs) is treated as a distinct program for the purposes of benefits realisation. It plans to assess whether benefits have been realised once each project in a funding round is completed. As a result, no benefits realisation assessment has been done for any project funded under either Resources for Regions or Fixing Country Roads. Without project‑level benefits realisation, future decisions are not informed by the lessons from previous investments.

Recommendations

  1. By December 2018, Infrastructure NSW should:
    • conduct annual audits of compliance with the funding deed for a random sample of projects funded under Resources for Regions and Fixing Country Roads
    • publish the circumstances under which unspent funds can be allocated to changes in project scope
    • measure benefits delivered by projects that were completed before December 2017
    • implement an annual process to measure benefits for projects completed after December 2017.
  2. By December 2018, Transport for NSW and Infrastructure NSW should:
    • incorporate a benefits realisation framework as part of the detailed application.


Grants to non-government schools

Education
Compliance
Internal controls and governance
Management and administration

The NSW Department of Education could strengthen its management of the $1.2 billion provided to non-government schools annually. This would provide greater accountability for the use of public funds, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

Non‑government schools educate 418,000 school children each year, representing 35 per cent of all students in NSW. The NSW Department of Education administers several grant schemes to support these schools, with the aim of improving student learning outcomes and supporting parent choice. To be eligible for NSW Government funding, non‑government schools must be registered with the NSW Education Standards Authority (NESA) and not operate 'for profit' as per section 83C of the NSW Education Act 1990 (the Act). Non‑government schools can either be registered as independent or part of a System Authority.

In 2017–18, non‑government schools in NSW will receive over $1.2 billion from the NSW Government, as well as $3.4 billion from the Australian Government. Recently, the Australian Government has changed the way it funds schools. The NSW Government is assessing how these changes will impact State funding for non‑government schools.

This audit assessed how effectively and efficiently NSW Government grants to non‑government schools are allocated and managed. This audit did not assess the use of NSW Government grants by individual non‑government schools or System Authorities because the Auditor‑General of New South Wales does not have the mandate to assess how government funds are spent by non‑government entities.

Conclusion

The Department of Education effectively and efficiently allocates grants to non‑government schools. Clarifying the objectives of grants, monitoring progress towards these objectives, and improving oversight, would strengthen accountability for the use of public funds by non‑government schools.

We tested a sample of grants provided to non‑government schools under all major schemes, and found that the Department of Education consistently allocates and distributes grants in line with its methodology. The Department has clear processes and procedures to efficiently collect data from schools, calculate the level of funding each school or System should receive, obtain appropriate approvals, and make payments.

We identified three areas where the Department could strengthen its management of grants to provide greater accountability for the use of public funds. First, the Department’s objectives for providing grants to non‑government schools are covered by legislation, intergovernmental agreements and grant guidelines. The Department could consolidate these objectives to allow for more consistent monitoring. Second, the Department relies on schools or System Authorities to engage a registered auditor to certify the accuracy of information on their enrolments and usage of grants. Greater scrutiny of the registration and independence of the auditors would increase confidence in the accuracy of this information. Third, the Department does not monitor how System Authorities reallocate grant funding to their member schools. Further oversight in this area would increase accountability for the use of public funds.

The Department effectively and efficiently allocates grants to non‑government schools. Strengthening its processes would provide greater assurance that the information it collects is accurate.

The Department provides clear guidelines to assist schools to provide the necessary census information to calculate per capita grants. Schools must get an independent external auditor, registered with ASIC, to certify their enrolment figures. The Department checks a sample of the auditors to ensure that they are registered with ASIC. Some other jurisdictions perform additional procedures to increase confidence in the accuracy of the census (for example, independently checking a sample of schools’ census data).

The Department accurately calculates and distributes per capita grants in accordance with its methodology. The previous methodology, used prior to 2018, was not updated frequently enough to reflect changes in schools' circumstances. Over 2014 to 2017, the Department provided additional grants to non‑government schools under the National Education Reform Agreement (NERA), to bring funding more closely in line with the Australian Department of Education and Training's Schooling Resource Standard (SRS). From 2018, the Department has changed the way it calculates per capita grants to more closely align with the Australian Department of Education and Training's approach.

The Department determines eligibility for grants by checking a school's registration status with NESA. However, NESA's approach to monitoring compliance with the registration requirements prioritises student learning and wellbeing requirements over the requirement for policies and procedures for proper governance. Given their importance to the appropriate use of government funding, NESA could increase its monitoring of policies and procedures for proper governance through its program of random inspections. Further, the Department and NESA should enter into a formal agreement to share information to more accurately determine the level of risk of non‑compliance at each school. This may help both agencies more effectively target their monitoring to higher‑risk schools.

By December 2018, the NSW Department of Education should:

  1. Strengthen its processes to provide greater assurance that the enrolment and expenditure information it collects from non‑government schools is accurate. This should build on the work the Australian Government already does in this area.
  2. Establish formal information‑sharing arrangements with the NSW Education Standards Authority to more effectively monitor schools' eligibility to receive funding.
     

By December 2018, the NSW Education Standards Authority should:

  1. Extend its inspection practices to increase coverage of the registration requirement for policies and procedures for the proper governance of schools.
  2. Establish formal information‑sharing arrangements with the NSW Department of Education to more effectively monitor schools' continued compliance with the registration requirements.

The Department’s current approach to managing grants to non‑government schools could be improved to provide greater confidence that funds are being spent in line with the objectives of the grant schemes.

The NSW Government provides funding to non‑government schools to improve student learning outcomes and to support schooling choices by parents, but does not monitor whether these grants are achieving this. In addition, each grant program has specific objectives. The main objectives of the per capita grant program are to increase the rate of students completing Year 12 (or equivalent), and to improve education outcomes for students. While non‑government schools publicly report on some educational measures via the MySchool website, these measures do not address all the objectives. Strengthened monitoring and reporting of progress towards objectives, at a school level, would increase accountability for public funding. This may require the Department to formalise its access to student level information.

The Department has listed five broad categories of acceptable use for per capita grants; however, it provides no further guidance on what expenditure would fit into these categories. Clarifying the appropriate use of grants would increase confidence that funding is being used as intended. Schools must engage an independent auditor, registered with ASIC, to certify that the funding has been spent. The Department could strengthen this approach by improving its processes to check the registration of the auditor and to verify their independence.

The Department has limited oversight of funding provided to System Authorities (Systems). The Department provides grants to Systems for all their member schools. The Systems can distribute the grants to their schools according to their own methodology. Systems are not required to report to the Department how much of their grant was retained for administrative or centralised expenses. Greater oversight of how Systems distribute these grants would increase transparency over their use of public funds.

By December 2018, the NSW Department of Education should:

  1. Establish and communicate funding conditions that require funded schools to:
    • adhere to conditions of funding, such as the acceptable use of grants, and accounting requirements to demonstrate compliance
    • report their progress towards the objectives of the scheme or wider Government initiatives
    • allow the Department to conduct investigations to verify enrolment and expenditure of funds
    • provide the Department with access to existing student level data to inform policy development and analysis.
  2. Increase its oversight of System Authorities by requiring them to:
    • re‑allocate funds across their system on a needs basis, and report to the Department on this
    • provide a yearly submission with enough detail to demonstrate that each System school has spent their State funding in line with the Department's requirements.


Council reporting on service delivery

Local Government
Compliance
Internal controls and governance
Management and administration
Service delivery

New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also did not report on the cost of services, making it difficult for communities to see how efficiently services are being delivered. Councils are also not consistently publishing targets to demonstrate what they are striving for.

I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.

My new mandate supports the Parliament’s objectives to:

  • strengthen governance and financial oversight in the local government sector
  • improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.

Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.

For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.

In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.

Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.

Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in decisions about how and to whom they provide services, so it is important that local communities have access to information about how well services are being delivered and whether they are meeting community needs. Ultimately, councils should aim to ensure that their performance reporting is subject to quality controls designed to provide independent assurance.

Conclusion
While councils report on outputs, reporting on outcomes and performance over time can be improved. Improved reporting would include objectives with targets that better demonstrate performance over time. This would help communities understand what services are being delivered, how efficiently and effectively they are being delivered, and what improvements are being made.
To ensure greater transparency on service effectiveness and efficiency, the Office of Local Government (OLG) should work with councils to develop guidance principles to improve reporting on service delivery to local communities. This audit identified an interest amongst councils in improving their reporting and broad agreement with the good practice principles developed as part of the audit.
The Integrated Planning and Reporting Framework (the Framework), which councils are required to use to report on service delivery, is intended to promote better practice. However, the Framework is silent on efficiency reporting and provides limited guidance on how long-term strategic documents link with annual reports produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.
OLG should also work with state agencies to reduce the overall reporting burden on councils by consolidating state agency reporting requirements. 

Councils report extensively on the things they have done, but minimally on the outcomes of that effort, the efficiency of delivery, or their performance over time.

Councils could improve reporting on service delivery by more clearly relating the resources used to the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.

Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.

The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.

Recommendation

By mid-2018, OLG should:

  • assist rural councils to develop their reporting capability.

The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery, and it provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

Recommendation

By mid-2018, OLG should:

  • issue additional guidance on good practice in council reporting, with specific information on:
    • reporting on performance against targets
    • reporting on performance against outcomes
    • assessing and reporting on efficiency and cost effectiveness
    • reporting performance over time
    • clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
    • defining reporting terms to encourage consistency.

The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports

The guidelines produced by OLG in 2013 to assist councils to implement their Framework requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This was identified as a weakness in both the 2012 performance audit report and the Local Government Reform Panel's review of reporting by councils on service delivery.

The Framework and supporting documents provide limited guidance on reporting

Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:

  • good practice in reporting to the community
  • how the annual report links with other plans and reports required by the Framework.

Further, the Framework includes both Annual Reports and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the Annual Reports produced during a council’s term. The relationship between Annual Reports and End of Term Reports is not clear.

OLG is reviewing the Framework and guidance

OLG commenced a review of the Framework in 2013, but this was deferred, with work restarting in 2017. The revised guidelines and manual were expected to be released in late 2017.

OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports

The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.

Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.

Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.

Recommendations

By mid-2018, OLG should:

  • commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
  • progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.

Streamlining the reporting burden would help councils improve reporting

The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 draft IPART ‘Review of reporting and compliance burdens on Local Government’ noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.

Enabling rural councils to perform tailored surveys of their communities may be more beneficial than a state-wide survey in defining outcome indicators

Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results from these are used by councils to set service delivery targets and report on outcomes. This helps to drive service delivery in line with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils which generally have the resources needed to run them.

OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn on a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state-level survey is not detailed or specific enough to be used as a tool for setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in these areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. These could potentially be achieved through regional cooperation between groups of similar councils.

Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities

The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.

OLG commenced work in 2012 to build a new performance measurement framework for councils, which aimed to move away from compliance reporting. This work was strongly influenced by the approach used in Victoria, which requires councils to report against a set of 79 indicators published on the Victorian 'Know your council' website. OLG’s work did not fully progress at the time, and several other local government representative bodies have since commenced work to establish performance measurement frameworks. OLG advised us it has recently recommenced its work on this project.

Our consultation identified some desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishment of a set of indicators that councils can use to measure performance and drive improvement in service delivery.

Several councils we spoke with were concerned that the current approaches to comparative reporting did not adequately acknowledge that councils need to tailor the type, level and mix of their services to the needs of their community. Comparative reporting approaches tend to focus on output measures, such as the number of applications processed, annual library loans and opening hours for sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where councils have made decisions based on community consultation, local priorities and available resources. To mitigate this, it is important to:

  • adopt a partnership approach to the development of indicators
  • ensure indicators measure performance, not just level of activity
  • compare performance between councils that are similar in terms of size and location.

It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.

Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:

  • consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
  • adopting a partnership approach to development of common indicators with groups of like councils.

Published

Actions for Helping older people access a residential aged care facility

Helping older people access a residential aged care facility

Health
Community Services
Compliance
Internal controls and governance
Management and administration
Risk
Service delivery
Shared services and collaboration
Workforce and capability

Assessment processes for older people needing to enter a Residential Aged Care Facility (RACF) vary depending on the processes of the Aged Care Assessment Team (ACAT) they see and whether or not they are in hospital. The data collected on ACAT performance was significantly revised during 2004, making comparisons with subsequent years problematic. ACATs have more responsibilities than assessing older people for residential care. It is not clear whether they have sufficient resources for this additional workload.


Parliamentary reference - Report number #160 - released 5 December 2006

Published

Actions for Controlling and reducing pollution from industry

Controlling and reducing pollution from industry

Planning
Environment
Compliance
Management and administration
Regulation

The regulatory framework introduced under the Protection of the Environment Operations Act 1997, along with other initiatives progressively being implemented by the Environment Protection Authority (EPA), should enhance the overall effectiveness of environment protection in NSW. The Audit Office is of the opinion that the framework is consistent with best practice and, once fully implemented, should contribute to further improvements in the environmental performance of industry.

However, while the legislative framework supports best practice in regulation and enforcement, a number of issues limit the effectiveness of the reforms. Some of these problems, such as the quality of licences and the effectiveness of compliance activities, have been identified by the EPA and may be addressed through recent initiatives.


Parliamentary reference - Report number #82 - released 18 April 2001