

Supporting the District Criminal Court

Justice
Community Services
Information technology
Internal controls and governance
Project management

The Auditor-General for New South Wales, Margaret Crawford, released a report today on whether the Department of Communities and Justice (the department) effectively supports the efficient operation of the District Criminal Court system.

The audit found that in the provision of data and technology services, the department is not effectively supporting the efficient operation of the District Criminal Court system. The department has insufficient controls in place to ensure that data in the system is always accurate.

The department is also using outdated technology and could improve its delivery of technical support to courts.

The audit also assessed the implementation of the Early Appropriate Guilty Pleas reform. This reform aims to improve court efficiency by having more cases resolved earlier with a guilty plea in the Local Court. The audit found that the department effectively governed the implementation of the reform but is not measuring achievement of expected benefits, placing the objectives of the reform at risk.

The Auditor-General made seven recommendations to the department, aimed at improving the controls around courts data, reporting on key performance indicators, improving regional technical support and measuring the success of the Early Appropriate Guilty Pleas reform. 

The District Court is the intermediate court in the New South Wales court system. It hears most serious criminal matters, except murder, treason and piracy. The Department of Communities and Justice (the Department) provides support to the District Court in a variety of ways. For example, it provides security services, library services and front-desk services. This audit examined three forms of support that the Department provides to the District Court:

  • data collection, reporting and analysis - the Department collects data from cases in its case management system, JusticeLink, based on the orders Judges make in court and court papers
  • technology - the Department provides technology to courts across New South Wales, as well as technical support for this technology
  • policy - the Department is responsible for proposing and implementing policy reforms.

Recent years have seen a worsening of District Court efficiency, as measured in the Productivity Commission's Report on Government Services (RoGS). Efficiency in the court system is typically measured through timeliness of case completion. There is evidence that timeliness has worsened. For example, the median time from arrest to finalisation of a case in the District Court increased from 420 days in 2012–13 to 541 days in 2017–18.

As a result, the government has announced a range of measures to improve court performance, particularly in the District Court. These measures included the Early Appropriate Guilty Pleas (EAGP) reform. One of the objectives of EAGP is to improve court efficiency, which would be achieved by having more cases resolve with a guilty plea in the Local Court.

This audit assessed whether the Department of Communities and Justice effectively supports the efficient operation of the District Criminal Court system. We assessed this with the following lines of inquiry:

  • Does the Department effectively collect, analyse and report performance information relevant to court efficiency?
  • Does the Department effectively provide technology to support the efficient working of the courts?
  • Does the Department have effective plans, governance and monitoring for the Early Appropriate Guilty Pleas reform?

The audit did not consider other support functions provided by the Department. Further information on the audit, including detailed audit criteria, may be found in Appendix two.

Conclusion
In the provision of data and technology services, the Department is not effectively supporting the efficient operation of the District Criminal Court system. The Department has insufficient controls in place to ensure accurate data in the District Criminal Court system. The Department is also using a significant amount of outdated technology and could improve its delivery of technical support to meet agreed targets.
The Department effectively governed the implementation of the Early Appropriate Guilty Pleas reform. However, it is not ensuring that the benefits stated in the business case are being achieved, placing its objectives at risk.
The impact of inaccurate court data can be severe, yet the Department does not have sufficient controls in place to ensure that its court data is accurate. Recent reviews by the NSW Bureau of Crime Statistics and Research (BOCSAR) have identified data inaccuracies, underlining the need for strong controls over court data.
The Department does not have a policy for data quality and has not formally assigned responsibility for data quality to any individual or branch. The Department also does not have a data dictionary outlining all the fields in its case management system. While the Department validates the highest risk items, such as warrants, to ensure that they are accurate, most data is not validated. The Department has recently commenced setting up a data unit for the Courts, Tribunals and Service Delivery branch. It is proposed that this unit will address most of the identified shortcomings.
The Department did not provide timely technical support to the court system in 2017 and is using a significant amount of outdated technology. The Digital and Technology Services (DTS) branch of the Department had agreed a Service Level Agreement with the rest of the Department, outlining the expected speed of technical support responses. DTS did not meet the agreed response times in 2017. Performance improved in 2018, though DTS fell short of its targets for critical and moderate priority incidents. Critical incidents are particularly important to deal with in a timely manner as they include incidents which may delay a court sitting.
Requests for technical support rose significantly in 2018 compared to 2017, which may be related to the amount of outdated technology in use. As at April 2019, the whole court system had 2,389 laptops or desktop computers outside their warranty period. The Department was also using other outdated technology. Outdated technology is more prone to failure, and continuing to use it poses a risk of court delays.
The Department is not measuring all the expected benefits from the Early Appropriate Guilty Pleas reform, placing the objectives of the program at risk. The Early Appropriate Guilty Pleas business case outlined nine expected benefits from the reform. The Department is not measuring one of these benefits and is not measuring the economic benefits of a further five business case benefits. Not measuring the impact of the reform means that the Department does not know if it is achieving its objectives and if the reform had the desired impact.

The Department is responsible for providing technology to the courts, which can improve the efficiency of court operations by making them faster and cheaper. The Department is also responsible for providing technical support to courtrooms and registries. It is important that technical support is provided in a timely manner because some technical incidents can delay court sittings and thus impact on court efficiency. A 2013 Organisation for Economic Co‑operation and Development report emphasised the importance of technology and digitisation for reducing trial length.

While the Department may provide technology to the courts, it is not responsible for deciding when, how or if the technology is used in the courtroom.

The Department is using a significant amount of outdated technology, risking court delays

As of April 2019, the whole court system had 2,389 laptops or desktop computers out of warranty, 56.0 per cent of the court system's fleet. The court system also had 786 printing devices out of their normal warranty period, 75.1 per cent of all printers in use. The Department also advised that many of its court audio transcription machines are out of date. These machines must be running for the court to sit, so it is critical that they are well maintained. The then Department of Justice estimated the cost of aligning its hardware across the whole Department with desired levels at $14.0 million per year for three years. A separate figure for the court system was not calculated, but it is likely to represent a significant portion of this amount.

Using outdated technology poses a risk to the court system as older equipment may be more likely to break down, potentially delaying courts or slowing down court services. In the court system throughout 2018, hardware made up 30.8 per cent of all critical incidents reported to technical support and 41.9 per cent of all high priority incidents. In addition, 16.2 per cent of all reported issues related to printing devices or printing.

From 2017 to 2018, technical support incidents from courts or court services more than doubled, from 4,379 in 2017 to 9,186 in 2018. The Department advised that some outside factors may have contributed to this increase. The Department was rolling out its new incident recording system throughout 2017, meaning that incidents were likely under‑reported in that year. The Department also advised that throughout 2018 there was a greater focus on ensuring that every issue was logged, which had not previously been the case. Despite these factors, the use of outdated technology has likely increased the risk of technology breakages and may have contributed to the increase in requests for technical support.

Refreshing technology on a regular basis would reduce the risk of hardware failures and ensure that equipment is covered by warranty.

The Department did not meet all court technical support targets in 2017 and 2018

The Digital and Technology Services branch (DTS) was responsible for providing technical support to the courts and the Courts and Tribunal Services branch prior to July 2019. DTS provided technical support in line with a Service Level Agreement (SLA) with the Department. In 2017, DTS did not provide this support in a timely manner. Performance improved in 2018, though DTS fell short of its targets for critical and moderate priority incidents. Exhibit 7 outlines DTS' targets under the SLA.

Exhibit 7: Digital and Technology Services' Service Level Agreement
Priority | Target resolution time | Target percentage resolved in time (%)
1. Critical | 4 hours | 80
2. High | 1 day | 80
3. Moderate | 3 days | 85
4. Low | 5 days | 85
Source: Department of Communities and Justice, 2019.

Critical incidents are particularly important for the Department to deal with in a timely manner because these include incidents which may delay a court sitting until resolved or incidents which impact on large numbers of staff. Some of the critical incidents raised with DTS specifically stated that they were delaying a court sitting, often due to transcription machines not working. High priority incidents include those where there is some impact on the functions of the business, which may in turn affect the efficiency of the court system. High priority incidents also include those directly impacting on members of the Judiciary. 

This audit examined DTS' performance against its SLA in the 2017 and 2018 calendar years across the whole court system, not just the District Court. The total number of incidents, as well as critical and high priority incidents, can be seen in Exhibit 8.

Exhibit 8: Number of incidents in 2017 and 2018
Priority | 2017 | 2018
All | 4,379 | 9,186
1. Critical | 48 | 91
2. High | 128 | 315
Source: Audit Office of NSW analysis of Department of Communities and Justice data, 2019.

The Department's results against its SLA in 2017 and 2018 are shown in Exhibit 9.

The Early Appropriate Guilty Pleas (EAGP) reform consists of five main elements:

  • early disclosure of evidence from NSW Police Force to the prosecution and defence
  • early certification of what the accused is going to be charged with to minimise changes
  • mandatory criminal case conferencing between the prosecutor and accused's representation
  • changes to Local Court case management
  • more structured sentence discounts.

More detailed descriptions of each of these changes can be found in the Introduction. These reform elements are anticipated to have three key effects:

  • accelerate the timing of guilty pleas
  • increase the overall proportion of guilty pleas
  • decrease the average length of contested trials.

Improving District Court efficiency is one of the stated aims of EAGP, which would be achieved by having more cases resolve in the Local Court and having fewer defendants plead guilty on the day of their trial in the District Court. The reform commenced in April 2018 and it is too early to state the impact of this reform on District Court efficiency.

The Department is responsible for delivering EAGP in conjunction with other justice sector agencies. It participated in the Steering Committee and the Working Groups, as well as providing the Project Management Office (PMO).

The Department is not measuring the economic benefits stated in the EAGP business case

The business case for EAGP listed nine quantifiable benefits which were expected to be derived from the achievement of the three key effects listed above. The Department is not measuring one of these benefits and is not measuring the economic benefits for five more, as shown in Exhibit 12.

Exhibit 12: The Department's measurement of quantifiable benefits
Benefit | Economic benefit (over ten years) | Being measured?
Accelerated timing of guilty pleas | $54.6m | Not measuring economic benefit
Increased guilty plea rate | $90.7m | Not measuring economic benefit
Decreased average trial length | $27.5m | Not measuring economic benefit
A reduction in the delay of indictable matters proceeding to trial | N/A | Measuring
Increase in the number of finalised matters per annum | N/A | Measuring
Reduction of the current backlog of criminal trials in the District Court | N/A | Measuring
Reduction in bed pressure on the correction system due to reduced average time in custody | $13.7m | Not measuring
Productivity improvements due to reduction in wasted effort | $53.3m | Not measuring economic benefit
Bankable cost savings due to jury empanelment avoided | $2.5m | Not measuring economic benefit
Source: Audit Office of NSW analysis.

While it is too early to comment on the overall impact of EAGP, better practice in benefits realisation involves an ongoing effort to monitor benefits to ensure that the reform is on target and determine whether any corrective action is needed.

The Department is measuring the number of finalised matters per annum. While it is not measuring the reduction in the backlog as part of this program, that measure is reported as part of the Department's internal reporting framework. Similarly, the Department is not monitoring the reduction in delay of indictable matters proceeding to trial directly as part of this reform, but this measure does form part of the monthly Operational Performance Report which the Department sends to the EAGP Steering Committee.

The Department is not monitoring any of the economic benefits stated in the business case. These economic benefits are a mixture of bankable savings and productivity improvements. In total, the business case listed $242.3 million in potential economic benefits over ten years from the implementation of this reform, against a total cost of $206.9 million over the same period. For several of these benefits the Department is collecting proxy indicators which would assist in the calculations, but it is not actively monitoring the savings. For example, the Department is monitoring average trial length, but is not using this information to calculate the economic benefits derived from changes in trial length.

The Department is also not collecting information on the average length of custody as part of this program. This means that it is unable to determine whether EAGP is putting less pressure on the corrections system, and it is not possible for the Department to calculate the savings from this particular benefit.

While stakeholders are optimistic about the impact of EAGP, not measuring the expected benefits stated in the business case means that the Department does not know if the reform is achieving what it was designed to achieve. Further, the Department does not know if it must take corrective action to ensure that the program achieves the stated benefits. These two things put the overall program benefits at risk.

The Department has not assigned responsibility for the realisation of each benefit, potentially risking the success of the program

The Department has not assigned responsibility for the realisation of each benefit stated in the business case. Instead, the Department holds the Steering Committee responsible for the realisation of all benefits. Benefits realisation is the process of ensuring that the agency achieves the benefits stated in the business case. Assigning responsibility for benefits realisation to the Steering Committee rather than individuals is not in line with good practice.

Good practice benefits realisation involves assigning responsibility for the realisation of each benefit to an individual at the business unit level. This ensures there is a single point of accountability for each part of the program with knowledge of the benefit and the ability to take corrective action if it looks like that benefit will not be realised. This responsibility should sit at the operational level where detailed action can most easily be undertaken. The role of a Steering Committee in benefits realisation is to ensure that responsible parties are monitoring their benefits and taking appropriate corrective action.

The Department advised that it believes the Steering Committee should have responsibility for the realisation of benefits due to the difficulty of attributing the achievement of each benefit to one part of the reform alone. Given the Steering Committee meets only quarterly, it is not well placed to take action in response to variances in performance.

A BOCSAR evaluation is planned, but data errors make some of the information unreliable

BOCSAR is planning an overall evaluation of EAGP for release in 2021. Undertaking this evaluation will require high quality data to gain an understanding of the drivers of the reform. However, data captured throughout the first year of EAGP has proven unreliable, which may reduce the usefulness of BOCSAR's evaluation. These data issues were discussed in Exhibit 5 in Chapter 2, above. Access to accurate data is vital for conducting any program evaluation, and inaccurate data raises the risk that BOCSAR will not be able to accurately assess the impact of EAGP.

In addition to the BOCSAR evaluation, the Department had plans for a series of 'snapshot' evaluations for some of the key elements of the reform to ensure that they were operating effectively. These were initially delayed due to an efficiency dividend which affected EAGP. In August 2019, the Department commissioned a review of the implementation of several key success factors for EAGP.

There was clear governance throughout the implementation of EAGP

The implementation stage of EAGP had clear governance, lines of authority and communication. The Steering Committee, each Working Group and each agency had clear roles and responsibilities, and these were organised through a Project Management Office (PMO) provided by the former Department of Justice. The governance structure throughout the implementation phase can be seen at Exhibit 13.

The Steering Committee was established in December 2016 and met regularly from March 2017. It comprised senior members of key government agencies, as well as the Chief Judge and the Chief Magistrate for most of the duration of the implementation period. The Steering Committee met at least monthly throughout the life of the program. It was responsible for overseeing the delivery of EAGP and making key decisions relating to implementation, including spending decisions. The Chief Judge and the Chief Magistrate abstained from financial decisions. The Steering Committee updated its governance and membership as appropriate throughout the life of the reform.

Appendix one – Response from agency
 
Appendix two – About the audit 

Appendix three – Performance auditing 

 

Copyright Notice

© Copyright reserved by the Audit Office of New South Wales. All rights reserved. No part of this publication may be reproduced without prior consent of the Audit Office of New South Wales. The Audit Office does not accept responsibility for loss or damage suffered by any person acting on or refraining from action as a result of any of this material.

Parliamentary Reference: Report number #329 - released 18 December 2019


Ensuring teaching quality in NSW public schools

Education
Management and administration
Regulation
Service delivery
Workforce and capability

The Auditor-General for New South Wales, Margaret Crawford, has released a report on how the NSW Education Standards Authority (NESA) and the Department of Education (the Department) ensure teaching quality in NSW public schools.

Around 2,200 NSW public school principals are responsible for accrediting their teachers in line with the Australian Professional Standards for Teachers. The report found that NESA does not oversee principals’ decisions to ensure that minimum standards for teaching quality are consistently met.

The Department does not effectively monitor teaching quality across the state. With limited data, it is difficult for the Department to ensure its strategies to improve teaching quality are appropriately targeted.

The Department’s Performance and Development Framework does not adequately support principals and supervisors to effectively manage and improve teacher performance or actively improve teaching quality. The Department manages those teachers formally identified as underperforming through teacher improvement programs. Only 53 of over 66,000 teachers employed by the Department were involved in these programs in 2018.

The report makes three recommendations to NESA to improve accreditation processes, and four recommendations to the Department to improve its systems and processes for ensuring teaching quality across the State.

Australian research has shown that quality teaching is the greatest in-school influence on student engagement and outcomes, accounting for 30 per cent of the variance in student performance. An international comparative study of 15-year-old students showed that the performance of New South Wales students in reading, mathematics and science declined between 2006 and 2015.

The Australian Professional Standards for Teachers (the Standards) describe the knowledge, skills and understanding expected of effective teachers at different career stages. Teachers must be accredited against the Standards to be employed in NSW schools. The NSW Education Standards Authority (NESA) is responsible for ensuring all teachers in NSW schools are accredited. As part of the accreditation process, the NSW Department of Education (the Department) assesses whether public school teachers meet proficient accreditation standards and advises NESA of its decisions.

The School Excellence Framework provides a method for the Department to monitor teaching quality at a school level across four elements of effective teaching practice. The Performance and Development Framework provides a method for teachers and their supervisors to monitor and improve teaching quality through setting professional goals to guide their performance and development.

The Department has a strategic goal that every student, every teacher, every leader and every school improves every year. In line with this goal, the Department has a range of strategies targeted to improving teaching quality at different career stages. These include additional resources to support new teachers, a program to support teachers to gain higher-level accreditation, support for principals to manage underperforming teachers, and a professional learning program where teachers observe and discuss each other's practice.

The objective of this audit was to assess the effectiveness of the NSW Department of Education's and the NSW Education Standards Authority's arrangements to ensure teaching quality in NSW public schools. To address this objective, the audit examined whether:

  • agencies effectively monitor the quality of teaching in NSW public schools
  • strategies to improve the quality of teaching are planned, communicated, implemented and monitored well.
The NSW Education Standards Authority does not oversee principals’ decisions to accredit teachers as proficient. This means it is not ensuring minimum standards for teaching quality are consistently met.
NESA does not have a process to ensure principals’ decisions to accredit teachers are in line with the Standards. The decision to accredit teachers is one of the main ways to ensure teaching quality. In New South Wales public schools, around 2,200 principals are tasked with making decisions to accredit their teachers as proficient. NESA provides training and guidelines for principals to encourage consistent accreditation decisions but regular turnover of principals makes it difficult to ensure that all principals are adequately supported. NESA has more oversight of provisional and conditional accreditation for beginning teachers, as well as higher-level accreditation for highly effective teachers. That said, there are only limited numbers of teachers with higher-level accreditation across the state.
The Department of Education does not effectively monitor teaching quality at a system level. This makes it difficult to ensure strategies to improve teaching quality are appropriately targeted.
The Department is not collecting sufficient information to monitor teaching quality across the state. No information on teacher assessment against the Performance and Development Framework is collected centrally. Schools self-assess their performance against the School Excellence Framework but this does not assess teaching quality for all teachers. The Department also surveys students about their experiences of teaching quality, but schools opt in to this survey, with 65 per cent of public schools participating in 2018. These factors limit the ability of the Department to target efforts to areas of concern.
We examined five key strategies that support the critical parts of a teacher’s career. Most strategies were based on research and consultation, planned, trialled, reviewed and adjusted before wider rollout. Guidance and training are provided to communicate requirements and help schools implement strategies at a local level. Monitoring of strategies implemented at a local level is variable. We identified several instances where Quality Teaching, Successful Students funding was used outside guidelines. Two strategies have not yet been evaluated, which prevents the Department from determining whether they are having the desired impact.
The Performance and Development Framework is not structured in a way that supports principals and supervisors to actively improve teacher performance and teaching quality.
There is limited opportunity for supervisors to set goals, conduct observations of teaching practice, or provide constructive written feedback on a teacher’s progress towards achieving their goals under this framework. Guidance on how to use the Standards to construct quality goals, observe teaching practice and provide valuable feedback is also insufficient. The framework focuses on teachers’ self-identified development goals but there is no requirement to align these with the Standards. These limitations reduce the ability of supervisors to use this framework to effectively manage teacher performance and improve teaching quality.
The Department manages those teachers formally identified as underperforming through teacher improvement programs. Only 53 of over 66,000 teachers employed by the Department were involved in these programs in 2018. By comparison, a report on inspections conducted in the United Kingdom assessed the quality of teaching as ‘inadequate’ in three per cent of schools.

Appendix one – Response from agencies

Appendix two – About the audit

Appendix three – Performance auditing


Parliamentary Reference: Report number #327 - released 26 September 2019



Ensuring contract management capability in government - Department of Education

Education
Compliance
Internal controls and governance
Management and administration
Procurement
Workforce and capability

This report examines whether the Department of Education has the required contract management capability to effectively manage high-value goods and services contracts (over $250,000). In 2017–18, the department managed high-value goods and services contracts worth $3.08 billion, with most of the contracts running over multiple years.

NSW government agencies are increasingly delivering services and projects through contracts with third parties. These contracts can be complex and governments face challenges in negotiating and implementing them effectively.

Contract management capability is a broad term, which can include aspects of individual staff capability as well as organisational capability (such as policies, frameworks and processes).

In 2017–18, the Department of Education (the Department) managed high-value (over $250,000) goods and services contracts worth $3.08 billion, with most of the contracts running over multiple years. The Department delivers, funds and regulates education services for NSW students from early childhood to secondary school.

This audit examined whether the Department has the required capability to effectively manage high-value goods and services contracts.

We did not examine infrastructure, construction or information and communications technology contracts. We assessed the Department against the following criteria:

  1. The Department’s policies and procedures support effective contract management and are consistent with relevant frameworks, policies and guidelines.
  2. The Department has capable personnel to effectively conduct the monitoring activities throughout the life of the contract.

The NSW Public Service Commission and the Department of Finance, Services and Innovation are included as auditees as they administer policies which directly affect contract management capability, including:

  • NSW Procurement Board Directions and policies
  • NSW Procurement Agency Accreditation Scheme
  • NSW Public Sector Capability Framework.

The Department of Finance, Services and Innovation's responsibility for NSW Procurement will transfer to NSW Treasury on 1 July 2019 as part of changes to government administrative arrangements announced on 2 April 2019 and amended on 1 May 2019.

Conclusion

The Department of Education's procedures and policies for goods and services contract management are consistent with relevant guidance. It also has a systematic approach to defining the capability required for contract management roles. That said, there are gaps in how well the Department uses this capability to ensure its contracts are performing. We also found one program (comprising 645 contracts) that was not compliant with the Department's policies.

The Department has up-to-date policies and procedures that are consistent with relevant guidance. The Department also communicates changes to procurement related policies, monitors compliance with policies and conducts regular reviews aiming to identify non-compliance.

The Department uses the NSW Public Service Commission's capability framework to support its workforce management and development. The capability framework includes general contract management capability for all staff and occupation specific capabilities for contract managers. The Department also provides learning and development for staff who manage contracts to improve their capability.

The Department provides some guidance on different ways that contract managers can validate performance information provided by suppliers. However, the Department does not provide guidance to assist contract managers to choose the best validation strategy according to contract risk. This could lead to inconsistent practice and to contracts not delivering what they are intended to deliver.

We found that none of the 645 contracts associated with the Assisted Schools Travel Program (estimated value of $182 million in 2018–19) have contract management plans. This is contrary to the Department's policies and increases the risk that contract managers are not effectively reviewing performance and resolving disputes.

Appendix one - Response from agencies

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary Reference: Report number #325 - released 28 June 2019



Contracting non-government organisations

Community Services
Compliance
Fraud
Management and administration
Procurement
Regulation
Service delivery

This report found the Department of Family and Community Services (FACS) needs to do more to demonstrate it is effectively and efficiently contracting NGOs to deliver community services in the Permanency Support Program (a component of out-of-home care services) and Specialist Homelessness Services. It notes that FACS is moving to an outcomes-based commissioning model and recommends this be escalated consistent with government policy.

Government agencies, such as the Department of Family and Community Services (FACS), are increasingly contracting non-government organisations (NGOs) to deliver human services in New South Wales. In doing so, agencies are responsible for ensuring these services are achieving expected outcomes. Since the introduction of the Commissioning and Contestability Policy in 2016, all NSW Government agencies are expected to include plans for customer and community outcomes and look for ways to use contestability to raise standards.

Two of the areas receiving the greatest funding from FACS are the Permanency Support Program and Specialist Homelessness Services. In the financial year 2017–18, nearly 500 organisations received $784 million for out-of-home care programs, including the Permanency Support Program. Across New South Wales, specialist homelessness providers assist more than 54,000 people each year and in the financial year 2017–18, 145 organisations received $243 million for providing short term accommodation and homelessness support, including Specialist Homelessness Services.

In the financial year 2017–18, FACS entered into 230 contracts for out-of-home care, of which 49 were for the Permanency Support Program, representing $322 million. FACS also entered into 157 contracts for the provision of Specialist Homelessness Services which totalled $170 million. We reviewed the Permanency Support Program and Specialist Homelessness Services for this audit.

This audit assessed how effectively and efficiently FACS contracts NGOs to deliver community services. The audit could not assess how NGOs used the funds they received from FACS as the Audit Office does not have a mandate that could provide direct assurance that NGOs are using government funds effectively.

Conclusion
FACS cannot demonstrate it is effectively and efficiently contracting NGOs to deliver community services because it does not always use open tenders to test the market when contracting NGOs, and does not collect adequate performance data to ensure safe and quality services are being provided. While there are some valid reasons for using restricted tenders, it means that new service providers are excluded from consideration - limiting contestability. In the service delivery areas we assessed, FACS does not measure client outcomes as it has not yet moved to outcomes-based contracts. 
FACS' procurement approach sometimes restricts the selection of NGOs for the Permanency Support Program and Specialist Homelessness Services
FACS has a procurement policy and plan which it follows when contracting NGOs for the provision of human services. This includes the option to use restricted tenders, which FACS sometimes uses rather than opening the process to the market. The use of restricted tenders is consistent with its procurement plan where there is a limited number of possible providers and the services are highly specialised. However, this approach perpetuates existing arrangements and makes it very difficult for new service providers to enter the market. The recontracting of existing providers also means FACS may miss the opportunity to benchmark them against the whole market.
FACS does not effectively use client data to monitor the performance of NGOs funded under the Permanency Support Program and Specialist Homelessness Services
FACS' contract management staff monitor individual NGO performance including safety, quality of services and compliance with contract requirements. Although FACS provides training materials on its intranet, it does not provide these staff with sufficient training, support or guidance to monitor NGO performance efficiently or effectively. FACS also requires NGOs to self-report their financial performance and contract compliance annually. FACS verifies the accuracy of the financial data but conducts only limited validation of the client data reported by NGOs. Instead, FACS relies on contract management staff to identify errors or inaccurate reporting by NGOs.
FACS' ongoing monitoring of the performance of providers under the Permanency Support Program is particularly limited due to problems with timely data collection at the program level. This reduces FACS' ability to monitor and analyse NGO performance at the program level as it does not have access to ongoing performance data for monitoring service quality.
In the Specialist Homelessness Services program, FACS and NGOs both collect the data required for the National Minimum Data Set on homelessness and provide it to the Australian Institute of Health and Welfare, as they are required to do. However, this data is not used for NGO performance monitoring or management.
FACS does not yet track outcomes for clients of NGOs
FACS began to develop an approach to outcomes-based contracting in 2015. Despite this, none of the contracts we reviewed are using outcomes as a measure of success. Currently, NGOs are required to demonstrate their performance is consistent with the measures stipulated in their contracts as part of an annual check of their contract compliance and financial accounts. NGOs report against activity-based measures (Key Performance Indicators) and not outcomes.
FACS advises that the transition to outcomes-based contracting will be made with the new rounds of funding which will take place in 2020–2021 for Specialist Homelessness Services and 2023 for the Permanency Support Program. Once these contracts are in place, FACS can transition NGOs to outcomes based reporting.
Incomplete data limits FACS' effectiveness in continuous improvement for the Permanency Support Program and Specialist Homelessness Services
FACS has policies and procedures in place to learn from past experiences and use this to inform future contracting decisions. However, FACS has limited client data related to the Permanency Support Program, which restricts the amount of continuous improvement it can undertake. In the Specialist Homelessness Services program, data is collected to inform routine contract management discussions with service providers, but FACS is not using this data for continuous improvement.

Appendix one – Response from agency

Appendix two – About the audit

Appendix three – Performance auditing

 

Parliamentary Reference: Report number #323 - released 26 June 2019



Helping older people access a residential aged care facility

Health
Community Services
Compliance
Internal controls and governance
Management and administration
Risk
Service delivery
Shared services and collaboration
Workforce and capability

Assessment processes for older people needing to enter a Residential Aged Care Facility (RACF) vary depending on the processes of the Aged Care Assessment Team (ACAT) they see and whether or not they are in hospital. The data collected on ACAT performance was significantly revised during 2004, making comparisons with subsequent years problematic. ACATs have more responsibilities than assessing older people for residential care, and it is not clear whether they have sufficient resources for this additional workload.

 

Parliamentary reference - Report number #160 - released 5 December 2006


Educating primary school students with disabilities

Education
Internal controls and governance
Management and administration
Service delivery
Workforce and capability

So far, special education programs have been able to support schools to accommodate students with disabilities and, as a result, we have an inclusive education system. Our concern is that as the number of students with disabilities increases, pressure will be placed on both funding and the capacity of schools to provide quality services.

The Government’s special education initiative is a positive step towards addressing these problems. Nevertheless, other changes need to be made to improve services to meet the needs of individual students. For example, the department needs to develop a common assessment tool to capture the additional support needs of students with disabilities on enrolment and improve accountability for services and results after enrolment.

 

Parliamentary reference - Report number #158 - released 6 September 2006


Agencies working together to improve services

Premier and Cabinet
Treasury
Justice
Transport
Education
Internal controls and governance
Service delivery
Shared services and collaboration

In the cases we examined, we found that agencies working together can improve services or results. However, the changes were not always as great as anticipated or had not reached their full potential. Establishing the right governance framework and accountability requirements between partners at the start of the project is critical to success, and joint responsibility requires new funding and reporting arrangements to be developed.

 

Parliamentary reference - Report number #149 - released 22 March 2006


The New Schools Privately Financed Project

Education
Treasury
Infrastructure
Management and administration
Procurement
Project management

In our view, the contracts in the New Schools Privately Financed Project were established and let in a way that greatly assists their potential for delivering value for money. However, the contracts are at an early stage of their 30-year lives and the savings and other benefits are not guaranteed. They will need to be carefully managed over the 30-year period to ensure that benefits are realised and that costs do not escalate beyond expectations.

 

Parliamentary reference - Report number #148 - released 8 March 2006