
Reports

Published


Local Schools, Local Decisions: needs-based equity funding

Education
Internal controls and governance
Management and administration
Service delivery

The Auditor-General for New South Wales, Margaret Crawford, released a report today examining the Department of Education’s (the department’s) support and oversight of school planning and use of needs-based funding under the Local Schools, Local Decisions reform.

The report found the department has not had adequate oversight of how schools are using needs-based funding to improve student outcomes since it was introduced in 2014.

The department had not set measures or targets for needs-based equity funding. It had also not been clear enough about how it expected schools to report on the outcomes of additional funding. This means it has not been able to effectively demonstrate the impact of funding at a school or state-wide level.

To assist with the transition to greater local decision-making, the department provided schools with guidance materials, additional resources and systems support. However, guidance material was not clear enough on the purpose of funding, school budgeting systems were not fit-for-purpose when initially introduced, and support for schools was spread across different areas of the department.

The department has recently increased executive oversight of progress to improve educational outcomes for Aboriginal students and students from a low socio-economic background. It has also developed a consistent set of school-level targets to be implemented from 2020. This may help the department more reliably monitor progress in lifting outcomes for students with additional learning needs.

The report makes eight recommendations aimed at clarifying requirements of schools, better coordinating support and strengthening oversight of the use of needs-based equity funding.


The Local Schools, Local Decisions reform was launched in 2012 to give public schools more authority to make local decisions about how best to meet the needs of their students. A major element of the reform was the introduction of a new needs-based school funding model. Core elements of the model address staffing and operational requirements, while needs-based elements reflect the characteristics of schools and students within them. This includes equity funding designed to support students with additional needs. The four categories of equity funding are:

  • socio-economic background
  • Aboriginal background
  • English language proficiency
  • low-level adjustment for disability. 

Around $900 million in equity funding was allocated in 2019. School principals decide how to use these funds and account for them through their school annual reports. The Department of Education (the department) supports schools in making these choices with tools and systems, guidelines, and good practice examples.

The objective of this audit was to assess the department's support and oversight of school planning and use of needs-based funding under the Local Schools, Local Decisions reform. To address this objective, the audit examined whether:

  • effective accountability arrangements have been established
  • effective support is provided to schools.  

Conclusion

The department has not had adequate oversight of how schools are using needs-based equity funding to improve student outcomes since it was introduced in 2014. While it provides guidance and resources, it has not set measures or targets to describe the outcomes expected of this funding, or explicit requirements for schools to report on how these funds were used. Consequently, there is no effective mechanism to capture the impact of funding at a school or state-wide level. The department has recently developed a consistent set of school-level targets to be implemented from 2020. This may help it to better hold schools accountable for progress towards its strategic goal of reducing the impact of disadvantage.

A significant amount of extra funding has been provided to schools over recent years in recognition of the additional learning needs of certain groups of students facing disadvantage. Under the Local Schools, Local Decisions reform, schools were given the ability to make decisions about how best to use the equity funding in combination with their overall school resources to meet their students’ needs. However, multiple guidelines provided to schools contain inconsistent advice on how the community should be consulted, how funding could be used, and how impact should be reported. Because of this, it is not clear how schools have used equity funding for the benefit of identified groups. School annual reports we reviewed did not fully account for the equity funding received, nor adequately describe the impact of funding on student outcomes.

To help in the transition to greater local decision-making, the department provided extra support by establishing peer support for new principals, increasing the number of directors, developing data analysis and financial planning systems, providing targeted training, and showcasing good practice. Multiple roles and areas of the department provide advice to schools in similar areas, and this support could be better coordinated.

Financial planning systems designed to help schools budget for equity and other funding sources were not fit-for-purpose when originally introduced. Schools reported a lack of trust in their budget figures and so were not fully spending their allocated funding. Since then, the department developed and improved a budgeting tool in consultation with stakeholder and user groups. It provided extra funding for administrative support and one-to-one training to help schools develop their capabilities. Despite this, schools we spoke to reported they were not yet fully confident in using the system and needed ongoing training and support. 

Appendix one – Response from agency

Appendix two – About the audit

Appendix three – Performance auditing

 

Copyright notice

© Copyright reserved by the Audit Office of New South Wales. All rights reserved. No part of this publication may be reproduced without prior consent of the Audit Office of New South Wales. The Audit Office does not accept responsibility for loss or damage suffered by any person acting on or refraining from action as a result of any of this material.

Parliamentary reference - Report number #331 - released 8 April 2020.

Published


Integrity of data in the Births, Deaths and Marriages Register

Justice
Premier and Cabinet
Whole of Government
Cyber security
Fraud
Information technology
Internal controls and governance
Management and administration

This report outlines whether the Department of Customer Service (the department) has effective controls in place to ensure the integrity of data in the Births, Deaths and Marriages Register (the register), and to prevent unauthorised access and misuse.

The audit found that the department has processes in place to ensure that the information entered in the register is accurate and that any changes to it are validated. Although there are controls in place to prevent and detect unauthorised access to, and activity in the register, there were significant gaps in these controls. Addressing these gaps is necessary to ensure the integrity of information in the register.

The Auditor-General made nine recommendations to the department, aimed at strengthening controls to prevent and detect unauthorised access to, and activity in the register. These included increased monitoring of individuals who have access to the register and strengthening security controls around the databases that contain the information in the register.

The NSW Registry of Births Deaths and Marriages is responsible for maintaining registers of births, deaths and marriages in New South Wales as well as registering adoptions, changes of names, changes of sex and relationships. Maintaining the integrity of this information is important as it is used to confirm people’s identity and unauthorised access to it can lead to fraud or identity theft.


The NSW Registry of Births Deaths and Marriages (BD&M) is responsible for maintaining registers of births, deaths and marriages in New South Wales. BD&M is also responsible for registering adoptions, changes of name, changes of sex and relationships. These records are collectively referred to as 'the Register'. The Births, Deaths and Marriages Registration Act 1995 (the BD&M Act) makes the Registrar (the head of BD&M) responsible for maintaining the integrity of the Register and preventing fraud associated with the Register. Maintaining the integrity of the information held in the Register is important as it is used to confirm people's identity. Unauthorised access to, or misuse of the information in the Register can lead to fraud or identity theft. For these reasons it is important that there are sufficient controls in place to protect the information.

BD&M staff access, add to and amend the Register through the LifeLink application. While BD&M is part of the Department of Customer Service, the Department of Communities and Justice (DCJ) manages the databases that contain the Register and sit behind LifeLink and is responsible for the security of these databases.

This audit assessed whether BD&M has effective controls in place to ensure the integrity of data in the Births, Deaths and Marriages Register, and to prevent unauthorised access and misuse. It addressed the following:

  • Are relevant process and IT controls in place and effective to ensure the integrity of data in the Register and the authenticity of records and documents?
  • Are security controls in place and effective to prevent unauthorised access to, and modification of, data in the Register?

Conclusion

BD&M has processes and controls in place to ensure that the information entered in the Register is accurate and that amendments to the Register are validated. BD&M also has controls in place to prevent and detect unauthorised access to, and activity in the Register. However, there are significant gaps in these controls. Addressing these gaps is necessary to ensure the integrity of the information in the Register.

BD&M has detailed procedures for all registrations and amendments to the Register, including processes for entering, assessing and checking the validity and adequacy of source documents. Where BD&M staff have directly input data, and for amendments to the Register, a second person is required to check all input information before an event can be registered or an amendment made. BD&M carries out regular internal audits of all registration processes to check whether procedures are being followed and to address non-compliance where required.

BD&M authorises access to the Register and carries out regular access reviews to ensure that users are current and have the appropriate level of access. There are audit trails of all user activity, but BD&M does not routinely monitor these. At the time of the audit, BD&M also did not monitor activity by privileged users who could make unauthorised changes to the Register. Not monitoring this activity created a risk that unauthorised activity in the Register would not be detected.

BD&M has no direct oversight of the database environment which houses the Register and relies on DCJ's management of a third-party vendor to provide the assurance it needs over database security. The vendor operates an Information Security Management System that complies with international standards, but neither BD&M nor DCJ has undertaken independent assurance of the effectiveness of the vendor's IT controls.

Appendix one – Response from agency

Appendix two – About the audit

Appendix three – Performance auditing

 


 

Parliamentary reference - Report number #330 - released 7 April 2020.

Published


Supporting the District Criminal Court

Justice
Community Services
Information technology
Internal controls and governance
Project management

The Auditor-General for New South Wales, Margaret Crawford, released a report today on whether the Department of Communities and Justice (the department) effectively supports the efficient operation of the District Criminal Court system.

The audit found that in the provision of data and technology services, the department is not effectively supporting the efficient operation of the District Criminal Court system. The department has insufficient controls in place to ensure that data in the system is always accurate.

The department is also using outdated technology and could improve its delivery of technical support to courts.

The audit also assessed the implementation of the Early Appropriate Guilty Pleas reform. This reform aims to improve court efficiency by having more cases resolved earlier with a guilty plea in the Local Court. The audit found that the department effectively governed the implementation of the reform but is not measuring achievement of expected benefits, placing the objectives of the reform at risk.

The Auditor-General made seven recommendations to the department, aimed at improving the controls around courts data, reporting on key performance indicators, improving regional technical support and measuring the success of the Early Appropriate Guilty Pleas reform. 

The District Court is the intermediate court in the New South Wales court system. It hears most serious criminal matters, except murder, treason and piracy. The Department of Communities and Justice (the Department) provides support to the District Court in a variety of ways. For example, it provides security services, library services and front-desk services. This audit examined three forms of support that the Department provides to the District Court:

  • data collection, reporting and analysis - the Department collects data from cases in its case management system, JusticeLink, based on the orders Judges make in court and court papers
  • technology - the Department provides technology to courts across New South Wales, as well as technical support for this technology
  • policy - the Department is responsible for proposing and implementing policy reforms.

Recent years have seen a worsening of District Court efficiency, as measured in the Productivity Commission's Report on Government Services (RoGS). Efficiency in the court system is typically measured through timeliness of case completion. There is evidence that timeliness has worsened. For example, the median time from arrest to finalisation of a case in the District Court increased from 420 days in 2012–13 to 541 days in 2017–18.
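A quick sanity check on the scale of the slowdown quoted above; a minimal sketch (the day counts come from the report, the computation is ours):

```python
# Median days from arrest to finalisation of a case in the District Court,
# as quoted from the Productivity Commission's RoGS figures.
median_days = {"2012-13": 420, "2017-18": 541}

increase = median_days["2017-18"] - median_days["2012-13"]  # 121 extra days
pct_increase = 100 * increase / median_days["2012-13"]      # ~28.8 per cent slower

print(f"Median finalisation time rose by {increase} days ({pct_increase:.1f}%).")
```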

As a result, the government has announced a range of measures to improve court performance, particularly in the District Court. These measures included the Early Appropriate Guilty Pleas (EAGP) reform. One of the objectives of EAGP is to improve court efficiency, which would be achieved by having more cases resolve with a guilty plea in the Local Court.

This audit assessed whether the Department of Communities and Justice effectively supports the efficient operation of the District Criminal Court system. We assessed this with the following lines of inquiry:

  • Does the Department effectively collect, analyse and report performance information relevant to court efficiency?
  • Does the Department effectively provide technology to support the efficient working of the courts?
  • Does the Department have effective plans, governance and monitoring for the Early Appropriate Guilty Pleas reform?

The audit did not consider other support functions provided by the Department. Further information on the audit, including detailed audit criteria, may be found in Appendix two.

Conclusion
In the provision of data and technology services, the Department is not effectively supporting the efficient operation of the District Criminal Court system. The Department has insufficient controls in place to ensure accurate data in the District Criminal Court system. The Department is also using outdated technology in significant numbers and could improve its delivery of technical support to meet agreed targets.
The Department effectively governed the implementation of the Early Appropriate Guilty Pleas reform. However, it is not ensuring that the benefits stated in the business case are being achieved, placing its objectives at risk.
The impact of inaccurate court data can be severe, and the Department does not have sufficient controls in place to ensure that its court data is accurate. Recent Bureau of Crime Statistics and Research (BOCSAR) reviews have identified data inaccuracies, demonstrating that the Department needs strong controls to ensure the accuracy of its court data.
The Department does not have a policy for data quality and has not formally assigned responsibility for data quality to any individual or branch. The Department also does not have a data dictionary outlining all the fields in its case management system. While the Department validates the highest risk items, such as warrants, to ensure that they are accurate, most data is not validated. The Department has recently commenced setting up a data unit for the Courts, Tribunals and Service Delivery branch. It is proposed that this unit will address most of the identified shortcomings.
The Department did not provide timely technical support to the court system in 2017 and is using outdated technology in significant numbers. The Digital and Technology Services branch of the Department had agreed a Service Level Agreement with the rest of the Department, outlining the expected speed of technical support responses. The branch did not meet response times in 2017. Performance improved in 2018, though DTS fell short of its targets for critical and moderate priority incidents. Critical incidents are particularly important to deal with in a timely manner as they include incidents which may delay a court sitting.
Requests for technical support rose significantly in 2018 compared to 2017, which may be related to the number of outdated pieces of technology. As at April 2019, the whole court system had 2,389 laptops or desktop computers outside their warranty period. The Department was also using other outdated technology. Outdated technology is more prone to failure and continuing to use it poses a risk of court delays.
The Department is not measuring all the expected benefits from the Early Appropriate Guilty Pleas reform, placing the objectives of the program at risk. The Early Appropriate Guilty Pleas business case outlined nine expected benefits from the reform. The Department is not measuring one of these benefits and is not measuring the economic benefits of a further five business case benefits. Not measuring the impact of the reform means that the Department does not know if it is achieving its objectives and if the reform had the desired impact.

The Department is responsible for providing technology to the courts, which can improve the efficiency of court operations by making them faster and cheaper. The Department is also responsible for providing technical support to courtrooms and registries. It is important that technical support is provided in a timely manner because some technical incidents can delay court sittings and thus impact on court efficiency. A 2013 Organisation for Economic Co‑operation and Development report emphasised the importance of technology and digitisation for reducing trial length.

While the Department may provide technology to the courts, it is not responsible for deciding when, how, or whether the technology is used in the courtroom.

The Department is using a significant amount of outdated technology, risking court delays

As of April 2019, the whole court system had 2,389 laptops or desktop computers out of warranty, 56.0 per cent of the court system's fleet. The court system also had 786 printing devices out of their normal warranty period, 75.1 per cent of all printers in use. The Department also advised that many of its court audio transcription machines are out of date. These machines must be running for the court to sit and thus it is critical that they are maintained to a high degree. The then Department of Justice estimated the cost of aligning its hardware across the whole Department with desired levels at $14.0 million per year for three years. Figures for the court system were not calculated but they are likely to be a significant portion of this figure.
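The device counts and fleet shares above imply overall fleet sizes that the report does not state directly; a rough back-calculation (the implied totals are our estimates, not figures from the report):

```python
# Out-of-warranty counts and their reported share of the fleet (April 2019).
laptops_out, laptops_share = 2389, 0.560   # laptops/desktops, 56.0% of fleet
printers_out, printers_share = 786, 0.751  # printing devices, 75.1% of printers

# Implied total fleet sizes -- estimates derived from the reported shares,
# not figures stated in the report.
laptop_fleet = round(laptops_out / laptops_share)     # ~4,266 laptops/desktops
printer_fleet = round(printers_out / printers_share)  # ~1,047 printers

print(laptop_fleet, printer_fleet)
```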

Using outdated technology poses a risk to the court system as older equipment may be more likely to break down, potentially delaying courts or slowing down court services. In the court system throughout 2018, hardware made up 30.8 per cent of all critical incidents reported to technical support and 41.9 per cent of all high priority incidents. In addition, 16.2 per cent of all reported issues related to printing devices or printing.

From 2017 to 2018, technical support incidents from courts or court services increased. There were 4,379 technical support incidents in 2017, which increased significantly to 9,186 in 2018. The Department advised that some outside factors may have contributed to this increase. The Department was rolling out its new incident recording system throughout 2017, meaning that there would be an under‑reporting of incidents in that year. The Department also advised that throughout 2018 there was a greater focus on ensuring that every issue was logged, which had not previously been the case. Despite these factors, the use of outdated technology has likely increased the risk of technology breakages and may have contributed to the increase in requests for technical support.

Refreshing technology on a regular basis would reduce the risk of hardware failures and ensure that equipment is covered by warranty.

The Department did not meet all court technical support targets in 2017 and 2018

The Digital and Technology Services branch (DTS) was responsible for providing technical support to the courts and the Courts and Tribunal Services branch prior to July 2019. DTS provided technical support in line with a Service Level Agreement (SLA) with the Department. In 2017, DTS did not provide this support in a timely manner. Performance improved in 2018, though DTS fell short of its targets for critical and moderate priority incidents. Exhibit 7 outlines DTS' targets under the SLA.

Exhibit 7: Digital and Technology Services' Service Level Agreement
Priority Target resolution time Target percentage in time (%)
1. Critical 4 hours 80
2. High 1 day 80
3. Moderate 3 days 85
4. Low 5 days 85
Source: Department of Communities and Justice, 2019.

Critical incidents are particularly important for the Department to deal with in a timely manner because these include incidents which may delay a court sitting until resolved or incidents which impact on large numbers of staff. Some of the critical incidents raised with DTS specifically stated that they were delaying a court sitting, often due to transcription machines not working. High priority incidents include those where there is some impact on the functions of the business, which may in turn affect the efficiency of the court system. High priority incidents also include those directly impacting on members of the Judiciary. 

This audit examined DTS' performance against its SLA in the 2017 and 2018 calendar years across the whole court system, not just the District Court. The total number of incidents, as well as critical and high priority incidents, can be seen in Exhibit 8.

Exhibit 8: Number of incidents in 2017 and 2018
Priority 2017 2018
All 4,379 9,186
1. Critical 48 91
2. High 128 315
Source: Audit Office of NSW analysis of Department of Communities and Justice data, 2019.
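Compliance against targets of the shape in Exhibit 7 is straightforward to compute from an incident log; a minimal sketch (targets taken from Exhibit 7; the incident records are illustrative, not the Department's data):

```python
from datetime import timedelta

# SLA targets from Exhibit 7: (target resolution time, required hit rate).
# Calendar days are used here for simplicity; the Department's SLA may
# count business days -- that detail is an assumption on our part.
SLA = {
    "critical": (timedelta(hours=4), 0.80),
    "high":     (timedelta(days=1),  0.80),
    "moderate": (timedelta(days=3),  0.85),
    "low":      (timedelta(days=5),  0.85),
}

def sla_results(incidents):
    """incidents: list of (priority, time_to_resolve) pairs.
    Returns {priority: (share resolved in time, target met?)}."""
    results = {}
    for priority, (target_time, target_rate) in SLA.items():
        times = [t for p, t in incidents if p == priority]
        if not times:
            continue
        share = sum(t <= target_time for t in times) / len(times)
        results[priority] = (share, share >= target_rate)
    return results

# Illustrative log: three critical incidents, one resolved outside 4 hours.
log = [("critical", timedelta(hours=2)),
       ("critical", timedelta(hours=3)),
       ("critical", timedelta(hours=6))]
print(sla_results(log))  # critical resolved in time: 2/3, below the 80% target
```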

The Department's results against its SLA in 2017 and 2018 are shown in Exhibit 9.

The Early Appropriate Guilty Pleas (EAGP) reform consists of five main elements:

  • early disclosure of evidence from NSW Police Force to the prosecution and defence
  • early certification of what the accused is going to be charged with to minimise changes
  • mandatory criminal case conferencing between the prosecutor and accused's representation
  • changes to Local Court case management
  • more structured sentence discounts.

More detailed descriptions of each of these changes can be found in the Introduction. These reform elements are anticipated to have three key effects:

  • accelerate the timing of guilty pleas
  • increase the overall proportion of guilty pleas
  • decrease the average length of contested trials.

Improving District Court efficiency is one of the stated aims of EAGP, which would be achieved by having more cases resolve in the Local Court and having fewer defendants plead guilty on the day of their trial in the District Court. The reform commenced in April 2018 and it is too early to state the impact of this reform on District Court efficiency.

The Department is responsible for delivering EAGP in conjunction with other justice sector agencies. It participated in the Steering Committee and the Working Groups, and provided the Project Management Office (PMO).

The Department is not measuring the economic benefits stated in the EAGP business case

The business case for EAGP listed nine quantifiable benefits which were expected to be derived from the achievement of the three key effects listed above. The Department is not measuring one of these benefits and is not measuring the economic benefits for five more, as shown in Exhibit 12.

Exhibit 12: The Department's measurement of quantifiable benefits
Benefit Economic benefit (over ten years) Being measured?
Accelerated timing of guilty pleas $54.6m Not measuring economic benefit
Increased guilty plea rate $90.7m Not measuring economic benefit
Decreased average trial length $27.5m Not measuring economic benefit
Reduction in the delay of indictable matters proceeding to trial N/A Measuring
Increase in the number of finalised matters per annum N/A Measuring
Reduction of the current backlog of criminal trials in the District Court N/A Measuring
Reduction in bed pressure on the corrections system due to reduced average time in custody $13.7m Not measuring
Productivity improvements due to reduction in wasted effort $53.3m Not measuring economic benefit
Bankable cost savings due to jury empanelment avoided $2.5m Not measuring economic benefit
Source: Audit Office of NSW analysis.

While it is too early to comment on the overall impact of EAGP, better practice in benefits realisation involves an ongoing effort to monitor benefits to ensure that the reform is on target and determine whether any corrective action is needed.

The Department is measuring the number of finalised matters per annum and while the Department is not measuring the reduction in the backlog as part of this program, this measure is reported as part of the Department's internal reporting framework. The Department is not monitoring the reduction in delay of indictable matters proceeding to trial directly as part of this reform, but this does form part of the monthly Operational Performance Report which the Department sends to the EAGP Steering Committee.

The Department is not monitoring any of the economic benefits stated in the business case. These economic benefits are a mixture of bankable savings and productivity improvements, totalling $242.3 million over ten years, which the business case listed as potential economic benefits of the reform against a total cost of $206.9 million over ten years. The Department collects proxy indicators that could assist in calculating several of these benefits, but it is not actively monitoring the savings. For example, the Department monitors average trial length, but does not use this information to calculate the economic benefits derived from changes in trial length.
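The business case arithmetic can be checked directly: the quantified economic benefits sum to the $242.3 million figure quoted above. A minimal sketch (dollar figures as quoted in this report; benefit labels abbreviated):

```python
# Quantified economic benefits over ten years ($m), per the EAGP business case.
benefits_m = {
    "accelerated timing of guilty pleas": 54.6,
    "increased guilty plea rate": 90.7,
    "decreased average trial length": 27.5,
    "reduced bed pressure on the corrections system": 13.7,
    "productivity improvements (less wasted effort)": 53.3,
    "jury empanelment avoided": 2.5,
}

total_benefits = round(sum(benefits_m.values()), 1)   # 242.3 ($m)
total_cost = 206.9                                    # ten-year cost ($m)
net_position = round(total_benefits - total_cost, 1)  # 35.4 ($m) if fully realised

print(total_benefits, net_position)
```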

The Department is also not collecting information on the average length of custody as part of this program. This means it is unable to determine whether EAGP is reducing pressure on the corrections system, and it cannot calculate the savings from this particular benefit.

While stakeholders are optimistic about the impact of EAGP, not measuring the expected benefits stated in the business case means that the Department does not know if the reform is achieving what it was designed to achieve. Further, the Department does not know if it must take corrective action to ensure that the program achieves the stated benefits. These two things put the overall program benefits at risk.

The Department has not assigned responsibility for the realisation of each benefit, potentially risking the success of the program

The Department has not assigned responsibility for the realisation of each benefit stated in the business case. The Department holds the Steering Committee responsible for the realisation of all benefits. Benefits realisation is the process which ensures that the agency reaches benefits as stated in the business case. Assigning responsibility for benefits realisation to the Steering Committee rather than individuals is not in line with good practice.

Good practice benefits realisation involves assigning responsibility for the realisation of each benefit to an individual at the business unit level. This ensures there is a single point of accountability for each part of the program with knowledge of the benefit and the ability to take corrective action if it looks like that benefit will not be realised. This responsibility should sit at the operational level where detailed action can most easily be undertaken. The role of a Steering Committee in benefits realisation is to ensure that responsible parties are monitoring their benefits and taking appropriate corrective action.

The Department advised that it believes the Steering Committee should have responsibility for the realisation of benefits due to the difficulty of attributing the achievement of each benefit to one part of the reform alone. Given the Steering Committee meets only quarterly, it is not well placed to take action in response to variances in performance.

A BOCSAR evaluation is planned, however data errors make some of the information unreliable

BOCSAR is planning an overall evaluation of EAGP for release in 2021. This evaluation will require high-quality data to understand the drivers of the reform. However, data captured throughout the first year of EAGP has proven unreliable, which may reduce the usefulness of BOCSAR's evaluation. These data issues were discussed in Exhibit 5 in Chapter 2, above. Access to accurate data is vital for any program evaluation, and inaccurate data raises the risk that BOCSAR will not be able to accurately evaluate the impact of EAGP.

In addition to the BOCSAR evaluation, the Department had plans for a series of 'snapshot' evaluations for some of the key elements of the reform to ensure that they were operating effectively. These were initially delayed due to an efficiency dividend which affected EAGP. In August 2019, the Department commissioned a review of the implementation of several key success factors for EAGP.

There was clear governance throughout the implementation of EAGP

The implementation stage of EAGP had clear governance, lines of authority and communication. The Steering Committee, each Working Group and each agency had clear roles and responsibilities, and these were organised through a Project Management Office (PMO) provided by the former Department of Justice. The governance structure throughout the implementation phase can be seen at Exhibit 13.

The Steering Committee was established in December 2016 and met regularly from March 2017. It comprised senior members of key government agencies, as well as the Chief Judge and the Chief Magistrate for most of the duration of the implementation period. The Steering Committee met at least monthly throughout the life of the program. The Steering Committee was responsible for overseeing the delivery of EAGP and making key decisions relating to implementation, including spending decisions. The Chief Judge and the Chief Magistrate abstained from financial decisions. The Steering Committee updated its governance and membership as appropriate throughout the life of the reform.

Appendix one – Response from agency
 
Appendix two – About the audit 

Appendix three – Performance auditing 

 

Copyright Notice

© Copyright reserved by the Audit Office of New South Wales. All rights reserved. No part of this publication may be reproduced without prior consent of the Audit Office of New South Wales. The Audit Office does not accept responsibility for loss or damage suffered by any person acting on or refraining from action as a result of any of this material.

Parliamentary Reference: Report number #329 - released 18 December 2019

Ensuring contract management capability in government - Department of Education

Education
Compliance
Internal controls and governance
Management and administration
Procurement
Workforce and capability

This report examines whether the Department of Education has the required contract management capability to effectively manage high-value goods and services contracts (over $250,000). In 2017–18, the department managed high-value goods and services contracts worth $3.08 billion, with most of the contracts running over multiple years.

NSW government agencies are increasingly delivering services and projects through contracts with third parties. These contracts can be complex and governments face challenges in negotiating and implementing them effectively.

Contract management capability is a broad term, which can include aspects of individual staff capability as well as organisational capability (such as policies, frameworks and processes).

In 2017–18, the Department of Education (the Department) managed high-value (over $250,000) goods and services contracts worth $3.08 billion, with most of the contracts running over multiple years. The Department delivers, funds and regulates education services for NSW students from early childhood to secondary school.

This audit examined whether the Department has the required capability to effectively manage high-value goods and services contracts.

We did not examine infrastructure, construction or information communication and technology contracts. We assessed the Department against the following criteria:

  1. The Department’s policies and procedures support effective contract management and are consistent with relevant frameworks, policies and guidelines.
  2. The Department has capable personnel to effectively conduct the monitoring activities throughout the life of the contract.

The NSW Public Service Commission and the Department of Finance, Services and Innovation are included as auditees as they administer policies which directly affect contract management capability, including:

  • NSW Procurement Board Directions and policies
  • NSW Procurement Agency Accreditation Scheme
  • NSW Public Sector Capability Framework.

The Department of Finance, Services and Innovation's responsibility for NSW Procurement will transfer to NSW Treasury on 1 July 2019 as part of changes to government administrative arrangements announced on 2 April 2019 and amended on 1 May 2019.

Conclusion

The Department of Education's policies and procedures for goods and services contract management are consistent with relevant guidance. It also has a systematic approach to defining the capability required for contract management roles. That said, there are gaps in how well the Department uses this capability to ensure its contracts are performing. We also found one program (comprising 645 contracts) that was not compliant with the Department's policies.

The Department has up-to-date policies and procedures that are consistent with relevant guidance. The Department also communicates changes to procurement related policies, monitors compliance with policies and conducts regular reviews aiming to identify non-compliance.

The Department uses the NSW Public Service Commission's capability framework to support its workforce management and development. The capability framework includes general contract management capability for all staff and occupation specific capabilities for contract managers. The Department also provides learning and development for staff who manage contracts to improve their capability.

The Department provides some guidance on different ways that contract managers can validate performance information provided by suppliers. However, the Department does not provide guidance to assist contract managers to choose the best validation strategy according to contract risk. This could lead to inconsistent practice and contracts not delivering what they are supposed to.

We found that none of the 645 contracts associated with the Assisted Schools Travel Program (estimated value of $182 million in 2018–19) have contract management plans. This is contrary to the Department's policies and increases the risk that contract managers are not effectively reviewing performance and resolving disputes.

Appendix one - Response from agencies

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary Reference: Report number #325 - released 28 June 2019


Published

Actions for Contracting non-government organisations

Contracting non-government organisations

Community Services
Compliance
Fraud
Management and administration
Procurement
Regulation
Service delivery

This report found the Department of Family and Community Services (FACS) needs to do more to demonstrate it is effectively and efficiently contracting NGOs to deliver community services in the Permanency Support Program (a component of out-of-home care services) and Specialist Homelessness Services. It notes that FACS is moving to an outcomes-based commissioning model and recommends this be escalated consistent with government policy.

Government agencies, such as the Department of Family and Community Services (FACS), are increasingly contracting non-government organisations (NGOs) to deliver human services in New South Wales. In doing so, agencies are responsible for ensuring these services are achieving expected outcomes. Since the introduction of the Commissioning and Contestability Policy in 2016, all NSW Government agencies are expected to include plans for customer and community outcomes and look for ways to use contestability to raise standards.

Two of the areas receiving the greatest funding from FACS are the Permanency Support Program and Specialist Homelessness Services. In the financial year 2017–18, nearly 500 organisations received $784 million for out-of-home care programs, including the Permanency Support Program. Across New South Wales, specialist homelessness providers assist more than 54,000 people each year and in the financial year 2017–18, 145 organisations received $243 million for providing short term accommodation and homelessness support, including Specialist Homelessness Services.

In the financial year 2017–18, FACS entered into 230 contracts for out-of-home care, of which 49 were for the Permanency Support Program, representing $322 million. FACS also entered into 157 contracts for the provision of Specialist Homelessness Services which totalled $170 million. We reviewed the Permanency Support Program and Specialist Homelessness Services for this audit.

This audit assessed how effectively and efficiently FACS contracts NGOs to deliver community services. The audit could not assess how NGOs used the funds they received from FACS, as the Audit Office's mandate does not extend to directly auditing NGOs' use of government funds.

Conclusion

FACS cannot demonstrate it is effectively and efficiently contracting NGOs to deliver community services because it does not always use open tenders to test the market when contracting NGOs, and does not collect adequate performance data to ensure safe and quality services are being provided. While there are some valid reasons for using restricted tenders, it means that new service providers are excluded from consideration, limiting contestability. In the service delivery areas we assessed, FACS does not measure client outcomes as it has not yet moved to outcomes-based contracts.

FACS' procurement approach sometimes restricts the selection of NGOs for the Permanency Support Program and Specialist Homelessness Services

FACS has a procurement policy and plan which it follows when contracting NGOs for the provision of human services. This includes the option to use restricted tenders, which FACS sometimes uses rather than opening the process to the market. The use of restricted tenders is consistent with its procurement plan where there is a limited number of possible providers and the services are highly specialised. However, this approach perpetuates existing arrangements and makes it very difficult for new service providers to enter the market. By recontracting existing providers, FACS may miss the opportunity to benchmark them against the whole market.

FACS does not effectively use client data to monitor the performance of NGOs funded under the Permanency Support Program and Specialist Homelessness Services

FACS' contract management staff monitor individual NGO performance, including safety, quality of services and compliance with contract requirements. Although FACS provides training materials on its intranet, it does not give these staff sufficient training, support or guidance to monitor NGO performance efficiently or effectively. FACS also requires NGOs to self-report their financial performance and contract compliance annually. FACS verifies the accuracy of the financial data but conducts limited validation of the client data reported by NGOs, relying instead on contract management staff to identify errors or inaccurate reporting.

FACS' ongoing monitoring of providers under the Permanency Support Program is particularly limited due to problems with timely data collection at the program level. Without access to ongoing performance data, FACS cannot readily monitor and analyse service quality across the program.

In the Specialist Homelessness Services program, FACS and NGOs both provide the data required for the National Minimum Data Set on homelessness to the Australian Institute of Health and Welfare, as they are required to do. However, this data is not used for NGO performance monitoring or management.

FACS does not yet track outcomes for clients of NGOs

FACS began to develop an approach to outcomes-based contracting in 2015. Despite this, none of the contracts we reviewed use outcomes as a measure of success. Currently, NGOs are required to demonstrate their performance is consistent with the measures stipulated in their contracts as part of an annual check of their contract compliance and financial accounts. NGOs report against activity-based measures (key performance indicators), not outcomes.

FACS advises that the transition to outcomes-based contracting will be made with the new rounds of funding, which will take place in 2020–2021 for Specialist Homelessness Services and in 2023 for the Permanency Support Program. Once these contracts are in place, FACS can transition NGOs to outcomes-based reporting.

Incomplete data limits FACS' effectiveness in continuous improvement for the Permanency Support Program and Specialist Homelessness Services

FACS has policies and procedures in place to learn from past experiences and use this to inform future contracting decisions. However, FACS has limited client data for the Permanency Support Program, which restricts the continuous improvement it can undertake. In the Specialist Homelessness Services program, data is collected to inform routine contract management discussions with service providers, but FACS is not using this data for continuous improvement.

Appendix one – Response from agency

Appendix two – About the audit

Appendix three – Performance auditing

 

Parliamentary Reference: Report number #323 - released 26 June 2019


Wellbeing of secondary school students

Education
Management and administration
Service delivery
Shared services and collaboration
Workforce and capability

The Department of Education has a strong focus on supporting secondary school students’ wellbeing. However, it is difficult to assess how well the Department is progressing as it is yet to measure or report on the outcomes of this work at a whole-of-state level.

The Department of Education’s (the Department) purpose is to prepare young people for rewarding lives as engaged citizens in a complex and dynamic society. The Department commits to creating quality learning opportunities for children and young people, including a commitment to student wellbeing, which is seen as directly linked to positive learning outcomes. Wellbeing is defined broadly by the Department as “the quality of a person’s life…It is more than the absence of physical or psychological illness”. Student wellbeing can be supported by everything a school does to enhance a student's learning—from curriculum to teacher quality to targeted policies and programs to whole-school approaches to wellbeing.

Several reforms have aimed to support student wellbeing in recent years. 'Local Schools, Local Decisions' gave NSW schools more authority to make local decisions, including over their approaches to supporting student wellbeing. In 2016, the 'Supported Students, Successful Students' initiative provided $167 million over four years to support the wellbeing of students. From 2018, the 'Every Student is Known, Valued and Cared For' initiative provides a principal-led mentoring program and a website with policies, procedures and resources to support student wellbeing.

This audit assessed how well the Department of Education supports secondary schools to promote and support the wellbeing of their students and how well secondary schools are promoting and supporting the wellbeing of their students.

Conclusion

The Department has implemented a range of programs and reforms aimed at supporting student wellbeing. However, the outcomes of this work have yet to be measured or reported on at a system level, making it difficult to assess the Department's progress in improving student wellbeing.

Secondary schools have generally adopted a structured approach to deliver wellbeing support and programs, using both Department and localised resources. The approaches have been tailored to meet the needs of their school community. That said, public reporting on wellbeing improvement measures via annual school reports is of variable quality and needs to improve.

The Department’s wellbeing initiatives are supported by research and consultation, but outcomes have not been reported on

The Department’s development of wellbeing policy, guidance, tools and resources has been transparent, consultative and well researched. It has drawn on international and domestic evidence to support its aim to deliver a fundamental shift from welfare to wellbeing at the school and system level.

However, the key performance indicator to monitor and track progress in wellbeing has yet to be reported on, despite the strategic plan including this as a priority for the period 2018 to 2022. The Department has not yet reported a baseline for the target, nor defined how it will be measured.

The Department’s wellbeing resources are mostly well targeted but there is room for improvement

The Department’s allocation of resources to deliver wellbeing initiatives in schools is mostly well targeted, reflects a needs basis and supports current strategic directions. This could be improved with some changes to formula allocations and clearer definitions of the resourcing required for identified wellbeing positions in schools. The workforce modelling for forecasting supply and demand, specifically for school counsellors and psychologists, needs to separately identify these positions as they are currently subsumed in general teacher numbers.

Schools' reporting on wellbeing improvement measures is of variable quality and needs to improve

Schools we visited demonstrated a variety of approaches to wellbeing depending on their local circumstances and student populations. They make use of Department policies, guidelines and resources, particularly mandatory policies and data collections, which have good compliance and take-up at school level. Professional learning supports specific wellbeing initiatives, and online systems for monitoring and reporting have contributed to schools' capacity and capabilities.

Schools report publicly on wellbeing improvement measures through annual school reports but this reporting is of variable quality. The Department plans to improve the capability of schools in data analysis and we recommend that this include the setting and evaluation of improvement targets for wellbeing.

The implementation of the 2015 Wellbeing Framework in schools is incomplete and the Department has not effectively prioritised and consolidated tools, systems and reporting for wellbeing

Schools' take-up of the 2015 Wellbeing Framework is hindered because it is not linked to the school planning and reporting policy and tools (the School Excellence Framework). At some schools we visited, this disconnect has led to a lack of knowledge of, and confidence in using, the Wellbeing Framework. The Department has identified the need to improve alignment of policies, frameworks and plans and has commenced work on this.

We found evidence of overburdening in schools for addressing student wellbeing—in the number of tools, online systems for information collection, and duplication in reporting. Following the significant reforms of recent years, the Department should consolidate its efforts by reinforcing existing effective programs and systems and addressing identified gaps and equity issues, rather than introducing further change for schools. In particular, methods and processes for complex case coordination need improvement.

The NSW Department of Education commits to creating quality learning opportunities for students. This includes strengthening students’ physical, social, emotional and spiritual development. The Department sets out to enable students to be healthy, happy, engaged and successful.

Welfare and wellbeing

The Department’s approach has significantly shifted from student welfare to wellbeing of the whole child and young person. Wellbeing is defined in departmental policy and strategy documents broadly, and as directly linked to learning and positive learning outcomes. “Wellbeing can be described as the quality of a person’s life…It is more than the absence of physical or psychological illness…Wellbeing, or the lack of it, can affect a student’s engagement and success in learning…”

Student wellbeing can be supported by everything a school does to enhance a student's learning—from curriculum to teacher quality to targeted policies and programs to whole-school approaches to wellbeing. Distinctions between wellbeing and welfare in the school context are outlined below.

Exhibit 1: Welfare and wellbeing

Welfare:
  • Operates from a basis of student need and doesn't always take into account a whole child view.
  • Rather than building on the strengths of students, operates from a deficit model of individual student problems or negative behaviours.

Wellbeing:
  • For all students.
  • Goes beyond just welfare needs of a few students and aims for all students to be healthy, happy, successful and productive individuals who are active and positive contributors to the school and society in which they live.

Source: Department of Education 2018 'Wellbeing is here' presentation.

Governance of Local Health Districts

Health
Internal controls and governance
Management and administration

The main roles, responsibilities and relationships between Local Health Districts (LHDs), their Boards and the Ministry of Health are clear and understood, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. However, there are opportunities to achieve further maturity in the system of governance and the audit report recommended a series of actions to further strengthen governance arrangements.

Fifteen Local Health Districts (LHDs) are responsible for providing public hospital and related health services in NSW. LHDs are:

  • established as statutory corporations under the Health Services Act 1997 to manage public hospitals and provide health services within defined geographical areas
  • governed by boards of between six and 13 people appointed by the Minister for Health
  • managed by a chief executive who is appointed by the board with the concurrence of the Secretary of NSW Health
  • accountable for meeting commitments made in annual service agreements with the NSW Ministry of Health.

The NSW Ministry of Health (the Ministry) is the policy agency for the NSW public health system. It provides regulatory functions and public health policy, and manages the health system, including monitoring the performance of hospitals and health services.

The current roles and responsibilities of LHDs and the Ministry, along with other agencies in NSW Health, were established in 2011 following a series of reforms to the structure and governance of the system. These reforms began with the report of the 'Special Commission of Inquiry into Acute Care Services in NSW Public Hospitals' ('the Garling Inquiry'), which was released in 2008, and were followed by reforms announced by the incoming coalition government in 2011.

These reforms were intended to deliver greater local decision making, including better engagement with clinicians, consumers, local communities, and other stakeholders in the primary care (such as general practitioners) and non-government sectors.

The reforms empowered LHDs by devolving from the Ministry some management responsibility and accountability for the delivery of health services in their areas. LHDs were made accountable for meeting annual obligations under service agreements.

This audit assessed the efficiency and effectiveness of the governance arrangements for LHDs. We answered two questions:

  • Are there clear roles, responsibilities and relationships between the Ministry of Health and LHDs and within LHDs?
  • Does the NSW Health Performance Framework establish and maintain accountability, oversight and strategic guidance for LHDs?
Conclusion

Main roles, responsibilities and relationships between LHDs, their boards, and the Ministry of Health are clear and understood, though there is opportunity to achieve further maturity in the system of governance for LHDs.

Main roles and responsibilities are clear and understood by local health district (LHD) board members and staff, Ministry of Health executive staff, and key stakeholders. However, there is some ambiguity for more complex and nuanced functions. A statement of principles to support decision making in a devolved system would help to ensure that neither LHDs nor the Ministry 'over-reach' into areas that are more appropriately the other's responsibility.

Better clinician engagement in LHD decision making was a key driver for devolution. This engagement has not met the expectations of devolution and requires attention as a priority.

Relationships between system participants are collaborative, though the opportunity should be taken to further embed this in the system's structures and processes and complement existing interpersonal relationships and leadership styles.

Accountability and oversight mechanisms, including the Health Performance Framework and Service Agreements, have been effective in establishing accountability, oversight and strategic guidance for LHDs.

The Health Performance Framework and Service Agreements have underpinned a cultural shift toward greater accountability and oversight. However, as NSW Health is a large, complex and dynamic system, it is important that these accountability and oversight mechanisms continue to evolve to ensure that they are sufficiently robust to support good governance.

There are areas where accountability and oversight can be improved, including:
  • continued progress in moving toward patient experience, outcome, and quality and safety measures
  • improving the Health Performance Framework document to ensure it is comprehensive, clear and specifies decision makers
  • greater clarity in the nexus between underperformance and escalation decisions
  • including governance-related performance measures
  • more rigour in accountability for non-service activity functions, including consumer and community engagement
  • ensuring that performance monitoring and intervention is consistent with the intent of devolution. 
There is clear understanding of the main roles and responsibilities of LHDs and the Ministry of Health under the structural and governance reforms introduced in 2011. Strongly collaborative relationships provide a good foundation on which governance arrangements can continue to mature, though there is a need to better ensure that clinicians are involved in LHD decision making.

NSW Health is a large and complex system, operating in a dynamic environment. The governance reforms introduced in 2011 were significant and it is reasonable that they take time to mature.

The main roles of LHDs and the Ministry are clear and well-understood, and there is good collaboration between different parts of the system. This provides a sound foundation on which to further mature the governance arrangements of LHDs.

While the broad roles of LHDs, their boards, and the Ministry are well understood by stakeholders in the system, there are matters of detail and complexity that create ambiguity and uncertainty, including:

  • the roles and relationships between the LHDs and the Pillars
  • to what extent LHDs have discretion to pursue innovation
  • individual responsibility and obligations between chairs, boards, executive staff, and the Ministry.

These should be addressed collaboratively between boards, their executives, and the Ministry, and should be informed by a statement of principles that guides how devolved decision making should be implemented.

Better clinician engagement in health service decision making was a key policy driver for devolution. In many cases, clinicians are not adequately engaged in LHD decision making, and LHDs and the Ministry should give priority to addressing this.

The quality of board decision making depends on the information they are provided and their capacity to absorb and analyse that information. More can be done to promote good decision making by improving the papers that go to boards, and by ensuring that board members are well positioned to absorb the information provided. This includes ensuring that the right type and volume of information are provided to boards, and that members and executive managers have adequate data literacy skills to understand the information.

Recommendations

  1. By December 2019, the Ministry of Health should:
     
    1. work with LHDs to identify and overcome barriers that are limiting the appropriate engagement of clinicians in decision making in LHDs
    2. develop a statement of principles to guide decision making in a devolved system
    3. provide clarity on the relationship of the Agency for Clinical Innovation and the Clinical Excellence Commission to the roles and responsibilities of LHDs.
       
  2. By June 2020, LHD boards, supported where appropriate by the Ministry of Health, should address the findings of this performance audit to ensure that local practices and processes support good governance, including:
     
    1. providing timely and consistent induction; training; and reviews of boards, members and charters
    2. ensuring that each board's governance and oversight of service agreements is consistent with their legislative functions
    3. improving the use of performance information to support decision making by boards and executive managers.
Accountability and oversight mechanisms, including the Health Performance Framework and service agreements, have been effective in establishing accountability, oversight and strategic guidance for LHDs. They have done this by driving a cultural shift that supports LHDs being accountable for meeting their obligations. These accountability and oversight mechanisms must continue to evolve and be improved.

This cultural shift has achieved greater recognition of the importance of transparency in how well LHDs perform. However, as NSW Health is a large, complex and dynamic system, it is important that these accountability and oversight mechanisms continue to evolve to ensure that they are sufficiently robust to support good governance.

There are areas where accountability and oversight can be improved including:

  • continued progress in moving toward patient experience, outcome and value-based measures
  • improving the Health Performance Framework document to ensure it is comprehensive, clear and specifies decision makers
  • greater clarity in the nexus between underperformance and escalation decisions
  • by adding governance-related performance measures to service agreements
  • more rigour in accountability for non-service activity functions, such as consumer and community engagement
  • ensuring that performance monitoring and intervention is consistent with the intent of devolution.

Recommendations

  3. By June 2020, the Ministry of Health should improve accountability and oversight mechanisms by:

    a) revising the Health Performance Framework so that it is a cohesive and comprehensive document
    b) clarifying processes and decision making for managing performance concerns
    c) developing a mechanism to adequately hold LHDs accountable for non-service activity functions
    d) reconciling performance monitoring and intervention with the policy intent of devolution.


Mobile speed cameras

Transport
Compliance
Financial reporting
Information technology
Internal controls and governance
Management and administration
Regulation
Service delivery

Key aspects of the state’s mobile speed camera program need to be improved to maximise road safety benefits, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Mobile speed cameras are deployed in a limited number of locations, with a small number of these used frequently. This, along with decisions to limit the hours that mobile speed cameras operate and to use multiple warning signs, has reduced the broad deterrence of speeding across the general network - the main policy objective of the mobile speed camera program.

The primary goal of speed cameras is to reduce speeding and make the roads safer. Our 2011 performance audit on speed cameras found that, in general, speed cameras change driver behaviour and have a positive impact on road safety.

Transport for NSW published the NSW Speed Camera Strategy in June 2012 in response to our audit. According to the Strategy, the main purpose of mobile speed cameras is to reduce speeding across the road network by providing a general deterrence through anywhere, anytime enforcement and by creating a perceived risk of detection across the road network. Fixed and red-light speed cameras aim to reduce speeding at specific locations.

Roads and Maritime Services and Transport for NSW deploy mobile speed cameras (MSCs) in consultation with NSW Police. The cameras are operated by contractors authorised by Roads and Maritime Services. MSC locations are stretches of road that can be more than 20 kilometres long. MSC sites are specific places within these locations that meet the requirements for an MSC vehicle to operate.

This audit assessed whether the mobile speed camera program is effectively managed to maximise road safety benefits across the NSW road network.

Conclusion

The mobile speed camera program requires improvements to key aspects of its management to maximise road safety benefits. While camera locations have been selected based on crash history, the limited number of locations restricts network coverage. It also makes enforcement more predictable, reducing the ability to provide a general deterrence. Implementation of the program has been consistent with government decisions to limit its hours of operation and use multiple warning signs. These factors limit the ability of the mobile speed camera program to effectively deliver a broad general network deterrence from speeding.

Many locations are needed to enable network-wide coverage and to ensure MSC sessions are randomised and not predictable. However, too few available locations meet the strict criteria for crash history, operator safety, signage and technical requirements to operate MSCs. MSC performance would improve if more suitable locations were available.

A scheduling system is meant to randomise MSC location visits so they are not predictable. However, a relatively small number of locations have been visited many times, making deployment at those places more predictable. The allocation of MSCs across the time of day, day of week and regions is prioritised based on crash history, but the frequency of location visits does not correspond with the crash risk of each location.
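The scheduling principle described above, randomised visits whose long-run frequency tracks each location's crash risk, can be sketched as a weighted random draw. This is a hypothetical illustration only: the location names, risk weights and the `schedule_sessions` function are invented, not the agencies' actual scheduling system.

```python
import random

# Hypothetical crash-risk weights per MSC location (illustrative, not real data).
crash_risk = {
    "Location A": 9.0,   # high crash history
    "Location B": 4.0,
    "Location C": 1.0,   # low crash history
}

def schedule_sessions(risk_weights, n_sessions, seed=None):
    """Randomly assign enforcement sessions so that, over many sessions,
    visit frequency is roughly proportional to each location's crash risk,
    while any individual visit remains unpredictable."""
    rng = random.Random(seed)
    locations = list(risk_weights)
    weights = [risk_weights[loc] for loc in locations]
    return [rng.choices(locations, weights=weights)[0] for _ in range(n_sessions)]

sessions = schedule_sessions(crash_risk, n_sessions=1000, seed=42)
# Visit counts should roughly follow the 9:4:1 risk ratio.
counts = {loc: sessions.count(loc) for loc in crash_risk}
```

A scheme like this addresses both criticisms at once: with enough eligible locations, no single site is visited often enough to become predictable, yet higher-risk locations still receive proportionally more enforcement.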

There is evidence of a reduction in fatal and serious crashes at the 30 best-performing MSC locations. However, there is limited evidence that the current MSC program in NSW has led to a behavioural change in drivers by creating a general network deterrence. While the overall reduction in serious injuries on roads has continued, fatalities have started to climb again. Compliance with speed limits has improved at the sites and locations where MSCs operate, but the results of overall network speed surveys vary, with recent improvements in some speed zones but not others.

There is no supporting justification for the number of hours of operation for the program. The rate of MSC enforcement (hours per capita) in NSW is lower than in Queensland and Victoria. The government decision to use multiple warning signs has made it harder to identify and maintain suitable MSC locations, and has impeded their use for enforcement in both traffic directions and in school zones.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #308 - released 18 October 2018


Regulation of water pollution in drinking water catchments and illegal disposal of solid waste

Environment
Compliance
Internal controls and governance
Management and administration
Regulation
Risk

There are important gaps in how the Environmental Protection Authority (EPA) implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal. This limits the effectiveness of its regulatory responses, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

The NSW Environment Protection Authority (the EPA) is the State’s primary environmental regulator. The EPA regulates waste and water pollution under the Protection of the Environment Operations Act 1997 (the Act) through its licensing, monitoring, regulation and enforcement activities. The community should be able to rely on the effectiveness of this regulation to protect the environment and human health. The EPA has regulatory responsibility for the more significant and specific activities that can potentially harm the environment.

Activities regulated by the EPA include manufacturing, chemical production, electricity generation, mining, waste management, livestock processing, mineral processing, sewage treatment, and road construction. For these activities, the operator must have an EPA-issued environment protection licence (licence). Licences have conditions attached which may limit the amount and concentration of substances the activity may produce and discharge into the environment. Conditions also require the licensee to report on its licensed activities.

This audit assessed the effectiveness of the EPA’s regulatory response to water pollution in drinking water catchments and illegal solid waste disposal. The findings and recommendations of this review can be reasonably applied to the EPA’s other regulatory functions, as the areas we examined were indicative of how the EPA regulates all pollution types and incidents.

 
Conclusion
There are important gaps in how the EPA implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal which limit the effectiveness of its regulatory response. The EPA uses a risk-based regulatory framework that has elements consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. However, the EPA did not demonstrate that it has established reliable practices to accurately and consistently detect non-compliance by licensees and apply consistent regulatory actions. This may expose the environment and human health to a risk of harm.
The EPA also could not demonstrate that it has effective governance and oversight of its regulatory operations. The EPA operates in a complex regulatory environment where its regional offices have broad discretion in how they operate. The EPA has not balanced this devolved structure with an effective governance approach that includes appropriate internal controls to monitor the consistency or quality of its regulatory activities. It also does not have an effective performance framework that sets relevant performance expectations and outcome-based key performance indicators (KPIs) for its regional offices.
These deficiencies mean that the EPA cannot be confident that it conducts compliance and enforcement activities consistently across the State and that licensees are complying with their licence conditions or the Act.
The EPA's reporting on environmental and regulatory outcomes is limited and most of the data it uses is self-reported by industry. It has not set outcome-based key result areas to assess performance and trends over time.
The EPA uses a risk-based regulatory framework for water pollution and illegal solid waste disposal but there are important gaps in implementation that reduce its effectiveness.
Elements of the EPA’s risk-based regulatory framework for water pollution and illegal solid waste disposal are consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. However, there are important gaps in how the EPA implements its risk-based approach that limit the effectiveness of its regulatory response. The EPA could not demonstrate that it effectively regulates licensees because it has not established reliable practices that accurately and consistently detect licence non-compliances or breaches of the Act and enforce regulatory actions.
The EPA lacks effective governance arrangements to support its devolved regional structure. Its performance framework provides limited and inconclusive reporting on regional performance to the EPA’s Chief Executive Officer and to the EPA Board. As a result, the EPA cannot provide assurance that it is conducting its regulatory responsibilities effectively and efficiently.
The EPA does not consistently evaluate its regulatory approach to ensure it is effective and efficient. For example, there are no set requirements for how EPA officers conduct mandatory site inspections, which means there is a risk that officers are not detecting all breaches or non-compliances. This inconsistency also means the EPA cannot rely on the data it collects from site inspections to understand whether its regulatory response is effective and efficient. In addition, where the EPA identifies instances of non-compliance or breaches, it does not apply all available regulatory actions to encourage compliance.
The EPA also does not have a systematic approach to validate self-reported information in licensees’ annual returns, despite the data being used to assess administrative fees payable to the EPA and its regulatory response to non-compliances. 
The EPA does not use performance frameworks to monitor the consistency or quality of work conducted across the State. The EPA has also failed to provide effective guidance for its staff: many of its policies and procedures are outdated, inconsistent, hard to access, or not mandated.
Recommendations
By 31 December 2018, to improve governance and oversight, the EPA should:
1. implement a more effective performance framework with regular reports to the Chief Executive Officer and to the EPA Board on outcomes-based key result areas that assess its environmental and regulatory performance and trends over time
By 30 June 2019, to improve consistency in its practices, the EPA should:
2. progressively update and make accessible its policies and procedures for regulatory operations, and mandate procedures where necessary to ensure consistent application
3. implement internal controls to monitor the consistency and quality of its regulatory operations. 
The EPA does not apply a consistent approach to setting licence conditions for discharges to water.
The requirements for setting licence conditions for water pollution are complex and require technical and scientific expertise. In August 2016, the EPA approved guidance developed by its technical experts in the Water Technical Advisory Unit to assist its regional staff. However, the EPA did not mandate use of the guidance until mid-April 2018. Until then, regional offices had discretion over which guidance their staff used, so practices differed across the organisation. The EPA is yet to conduct training to ensure staff consistently apply the 2016 guidance.
The EPA has not implemented any appropriate internal controls or quality assurance process to monitor the consistency or quality of licence conditions set by its officers across the State. This is not consistent with good regulatory practice.
The 2016 triennial audit of the Sydney drinking water catchment highlighted that Lake Burragorang has experienced worsening water quality over the past 20 years from increased salinity levels, which were nearly twice as high as in other storages in the Sydney drinking water catchment. The report recommended that the source and implications of the increased salinity be investigated, but did not propose which public authority should carry out the investigation.
To date, no NSW Government agency has addressed the report's recommendation. Three public authorities (the EPA, DPE and WaterNSW) are responsible for regulating activities that affect water quality in the Sydney drinking water catchment, which includes Lake Burragorang.
Recommendation
By 30 June 2019, to address worsening water quality in Lake Burragorang, the EPA should:
4. (a) review the impact of its licensed activities on water quality in Lake Burragorang, and
  (b) develop strategies relating to its licensed activities (in consultation with other relevant NSW Government agencies) to improve and maintain the lake's water quality.
The EPA’s risk-based approach to monitoring compliance of licensees has limited effectiveness. 
The EPA tailors its compliance monitoring approach based on the performance of licensees. This means that licensees that perform better have a lower administrative fee and fewer mandatory site inspections. 
However, this approach relies on information that is not complete or accurate. Sources of information include licensees’ annual returns, EPA site inspections and compliance audits, and pollution reports from the public. 
Licensees report annually to the EPA on their performance, including compliance against their licence conditions. The Act contains significant financial penalties if licensees provide false or misleading information in their annual returns. However, the EPA does not systematically or consistently validate information self-reported by licensees, or consistently apply regulatory actions if it discovers non-compliance.
Self-reported compliance data is used in part to assess a licensed premises’ overall environmental risk level, which underpins the calculation of the administrative fee, the EPA’s site inspection frequency, and the licensee’s exposure to regulatory actions. It is also used to assess the load-based licence fee that the licensee pays.
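The mechanism described above, where a premises' assessed risk level drives both its administrative fee and its mandatory inspection frequency, might be sketched as follows. This is a hypothetical illustration: the tiers, thresholds and multipliers are invented for demonstration and are not the EPA's actual risk-based licensing scheme.

```python
# Hypothetical risk tiers; fee multipliers and minimum annual
# inspection counts are illustrative only, not the EPA's real values.
RISK_TIERS = {
    "low":    {"fee_multiplier": 1.0, "min_inspections_per_year": 1},
    "medium": {"fee_multiplier": 1.5, "min_inspections_per_year": 2},
    "high":   {"fee_multiplier": 2.0, "min_inspections_per_year": 4},
}

def assess_risk_level(self_reported_non_compliances, verified_breaches):
    """Assign an overall risk level from compliance history.
    If self-reported figures are not validated, the inputs here can
    understate true risk -- the weakness the audit identifies."""
    score = self_reported_non_compliances + 2 * verified_breaches
    if score == 0:
        return "low"
    if score <= 3:
        return "medium"
    return "high"

def licence_settings(base_fee, self_reported_non_compliances, verified_breaches):
    """Derive the administrative fee and inspection frequency from risk level."""
    level = assess_risk_level(self_reported_non_compliances, verified_breaches)
    tier = RISK_TIERS[level]
    return {
        "risk_level": level,
        "admin_fee": base_fee * tier["fee_multiplier"],
        "min_inspections_per_year": tier["min_inspections_per_year"],
    }

settings = licence_settings(base_fee=10_000,
                            self_reported_non_compliances=1,
                            verified_breaches=1)
```

The sketch makes the audit's concern concrete: a licensee that under-reports non-compliances lands in a lower tier, pays a smaller fee and receives fewer mandatory inspections, which is why validating self-reported data matters.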
The EPA has set minimum mandatory site inspection frequencies for licensed premises based on their assessed overall risk level. These inspections are a key tool to detect non-compliance or breaches of the Act. However, the EPA has not issued a policy or procedures defining what the mandatory inspections should cover and how they are to be conducted. We found variations in how EPA officers in the offices we visited conducted these inspections. This inconsistency means the EPA does not have complete and accurate information on licensees’ compliance, and is not identifying all non-compliances so it can consider applying appropriate regulatory actions.
The EPA also receives reports of pollution incidents from the public that may indicate non-compliance. However, the EPA has not set timeframes within which it expects its officers to investigate pollution incidents. Regional offices decide what to investigate and set their own timeframes, and the EPA does not measure regional performance against them.
The few compliance audits the EPA conducts annually are effective in identifying licence non-compliances and breaches of the Act. However, the EPA does not have a policy or required procedures for its regulatory officers to consistently apply appropriate regulatory actions in response to compliance audit findings. 
The EPA has not implemented any effective internal controls or quality assurance process to check the consistency or quality of how its regulatory officers monitor compliance across the State. This is not consistent with good regulatory practice.
Recommendations
To improve compliance monitoring, the EPA should implement procedures to:
5. by 30 June 2019, validate self-reported information, eliminate hardcopy submissions and require licensees to report on their breaches of the Act and associated regulations in their annual returns
6. by 31 December 2018, conduct mandatory site inspections under the risk-based licensing scheme to assess compliance with all regulatory requirements and licence conditions.
 
The EPA cannot assure that its regulatory enforcement approach is fully effective.
The EPA’s compliance policy and prosecution guidelines have a large number of available regulatory actions and factors which should be taken into account when selecting an appropriate regulatory response. The extensive legislation determining the EPA’s regulatory activities, and the devolved regional structure the EPA has adopted in delivering its compliance and regulatory functions, increases the risk of inconsistent compliance decisions and regulatory responses. A good regulatory framework needs a consistent approach to enforcement to incentivise compliance. 
The EPA has not balanced this devolved regional structure with appropriate governance arrangements to give it assurance that its regulatory officers apply a consistent approach to enforcement.
The EPA has not issued standard procedures to ensure consistent non-court enforcement action for breaches of the Act or non-compliance with licence conditions. Given our finding that the EPA does not effectively detect breaches and non-compliances, there is a risk that it is not applying appropriate regulatory actions for many breaches and non-compliances.
A recent EPA compliance audit identified significant non-compliances with incident management plan requirements. However, the EPA has not applied regulatory actions for making false statements on annual returns for those licensees that certified their plans complied with such requirements. The EPA also has not applied available regulatory actions for the non-compliances which led to the false or misleading statements.
Recommendation
By 31 December 2018, to improve enforcement, the EPA should:
7. implement procedures to systematically assess non-compliances with licence conditions and breaches of the Act, and to apply appropriate and consistent regulatory actions.
The EPA has implemented the actions listed in the NSW Illegal Dumping Strategy 2014–16. To date, the EPA has also implemented four of the six recommendations made by the ICAC on the EPA's oversight of Regional Illegal Dumping Squads.
The EPA did not achieve the NSW Illegal Dumping Strategy 2014–16 target of a 30 per cent reduction in instances of large-scale illegal dumping in Sydney, the Illawarra, Hunter and Central Coast from 2011 levels.

In the reporting period, the incidence of large-scale illegal dumping more than doubled. The EPA advised that this increase may reflect greater public awareness and reporting rather than increased illegal dumping activity.

The EPA is due to implement one outstanding ICAC recommendation by June 2018 but has not set a timeframe for the other.


HealthRoster benefits realisation

Health
Compliance
Information technology
Management and administration
Project management
Workforce and capability

The HealthRoster system is delivering some business benefits but Local Health Districts are yet to use all of its features, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. HealthRoster is an IT system designed to more effectively roster staff to meet the needs of Local Health Districts and other NSW health agencies.

The NSW public health system employs over 100,000 people in clinical and non-clinical roles across the state. With increasing demand for services, it is vital that NSW Health effectively rosters staff to ensure high quality and efficient patient care, while maintaining good workplace practices to support staff in demanding roles.

NSW Health is implementing HealthRoster as its single state-wide rostering system to more effectively roster staff according to the demands of each location. Between 2013–14 and 2016–17, our financial audits of individual LHDs reported issues with rostering and payroll processes and systems.

NSW Health grouped all Local Health Districts (LHDs), and other NSW Health organisations, into four clusters to manage the implementation of HealthRoster over four years. Refer to Exhibit 4 for a list of the NSW Health entities in each cluster.

  • Cluster 1 implementation commenced in 2014–15 and was completed in 2015–16.
  • Cluster 2 implementation commenced in 2015–16 and was completed in 2016–17.
  • Cluster 3 began implementation in 2016–17 and was underway during the conduct of the audit.
  • Cluster 4 began planning for implementation in 2017–18.

Full implementation, including capability for centralised data and reporting, is planned for completion in 2019.

This audit assessed the effectiveness of the HealthRoster system in delivering business benefits. In making this assessment, we examined whether:

  • expected business benefits of HealthRoster were well-defined
  • HealthRoster is achieving business benefits where implemented.

The HealthRoster project has a timespan from 2009 to 2019. We examined the HealthRoster implementation in LHDs, and other NSW Health organisations, focusing on the period from 2014, when eHealth assumed responsibility for project implementation, to early 2018.

Conclusion
The HealthRoster system is realising functional business benefits in the LHDs where it has been implemented. In these LHDs, financial control of payroll expenditure and rostering compliance with employment award conditions have improved. However, these LHDs are not measuring the value of broader benefits such as better management of staff leave and overtime.
NSW Health has addressed the lessons learned from earlier implementations to improve later implementations. Business benefits identified in the business case were well defined and are consistent with business needs identified by NSW Health. Three of four cluster 1 LHDs have been able to reduce the number of issues with rostering and payroll processes. LHDs in earlier implementations need to use HealthRoster more effectively to ensure they are getting all available benefits from it.
HealthRoster is taking six years longer, and costing $37.2 million more, to fully implement than originally planned. NSW Health attributes the increased cost and extended timeframe to the large scale and complexity of the full implementation of HealthRoster.

Business benefits identified for HealthRoster accurately reflect business needs.

NSW Health has a good understanding of the issues in previous rostering systems and has designed HealthRoster to adequately address these issues. Interviews with frontline staff indicate that HealthRoster facilitates rostering which complies with industrial awards. This is a key business benefit that supports the provision of quality patient care. We saw no evidence that any major business needs or issues with the previous rostering systems are not being addressed by HealthRoster.

Since 2015, the period examined in this audit, NSW Health has applied appropriate project management and governance structures to ensure that risks and issues are well managed during HealthRoster implementation.

HealthRoster has had two changes to its budget and timeline. Overall, the capital cost for the project has increased from $88.6 million to $125.6 million (42 per cent) and has delayed expected project completion by four years from 2015 to 2019. NSW Health attributes the increased cost and extended time frame to the large scale and complexity of the full implementation of HealthRoster.

NSW Health has established appropriate governance arrangements to ensure that HealthRoster is successfully implemented and that it will achieve business benefits in the long term. During implementation, local steering committees monitor risks and resolve implementation issues. Risks or issues that cannot be resolved locally are escalated to the state-wide steering committee.

NSW Health has grouped local health districts, and other NSW Health organisations, into four clusters for implementation. This has enabled NSW Health to apply lessons learnt from each implementation to improve future implementations.

NSW Health has a benefits realisation framework, but it is not fully applied to HealthRoster.

NSW Health can demonstrate that HealthRoster has delivered some functional business benefits, including rosters that comply with a wide variety of employment awards.

NSW Health is not yet measuring and tracking the value of business benefits achieved. NSW Health did not have benefits realisation plans with baseline measures defined for LHDs in cluster 1 and 2 before implementation. Without baseline measures NSW Health is unable to quantify business benefits achieved. However, analysis of post-implementation reviews and interviews with frontline staff indicate that benefits are being achieved. As a result, NSW Health now includes defining baseline measures and setting targets as part of LHD implementation planning. It has created a benefits realisation toolkit to assist this process from cluster 3 implementations onwards.
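The baseline-and-target approach described above, defining pre-implementation measures so benefits can later be quantified, can be sketched in a few lines. The measures, figures and the `benefits_report` function below are invented for illustration; they are not from NSW Health's benefits realisation toolkit.

```python
# Hypothetical pre-implementation baseline and post-implementation
# actuals for one LHD (illustrative figures only).
baseline = {"overtime_hours_per_month": 5200, "payroll_errors_per_month": 140}
actual   = {"overtime_hours_per_month": 4680, "payroll_errors_per_month": 98}
# Target percentage reduction for each measure, expressed as a fraction.
targets  = {"overtime_hours_per_month": 0.10, "payroll_errors_per_month": 0.25}

def benefits_report(baseline, actual, targets):
    """Quantify the realised benefit for each measure against its
    pre-implementation baseline and flag whether the target was met.
    Without a baseline, the reduction cannot be computed at all --
    the gap the audit found in cluster 1 and 2 implementations."""
    report = {}
    for measure, base in baseline.items():
        reduction = (base - actual[measure]) / base
        report[measure] = {
            "reduction_pct": round(reduction * 100, 1),
            "target_met": reduction >= targets[measure],
        }
    return report

report = benefits_report(baseline, actual, targets)
```

The point of the sketch is that every quantity on the right-hand side must be captured before go-live: once implementation has changed the numbers, the baseline can no longer be measured.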

NSW Health conducted post-implementation reviews for clusters 1 and 2 and found that LHDs in these clusters were not using HealthRoster to realise all the benefits that HealthRoster could deliver.

By September 2018, NSW Health should:

  1. Ensure that Local Health Districts undertake benefits realisation planning according to the NSW Health benefits realisation framework
  2. Regularly measure benefits realised, at state and local health district levels, from the state-wide implementation of HealthRoster
  3. Review the use of HealthRoster in Local Health Districts in clusters 1 and 2 and assist them to improve their HealthRoster related processes and practices.

By June 2019, NSW Health should:

  1. Ensure that all Local Health Districts are effectively using demand-based rostering.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #301 - released 7 June 2018