
Matching skills training with market needs

Topics: Industry, Compliance, Internal controls and governance, Management and administration, Risk, Service delivery, Workforce and capability

The NSW Department of Industry targets subsidies towards training programs delivering the skills most needed in New South Wales. However, the Department still subsidises some qualifications that the market may no longer need, according to a report released by Margaret Crawford, Auditor-General for New South Wales.

In 2012, governments across Australia entered into the National Partnership Agreement on Skills Reform. Under the National Partnership Agreement, the Australian Government provided incentive payments to States and Territories to move towards a more contestable Vocational Education and Training (VET) market. The aim of the National Partnership Agreement was to foster a more accessible, transparent, efficient and high quality training sector that is responsive to the needs of students and industry. 

The New South Wales Government introduced the Smart and Skilled program in response to the National Partnership Agreement. Through Smart and Skilled, students can choose a vocational course from a list of approved qualifications and training providers. Students pay the same fee for their chosen qualification regardless of the selected training provider and the government covers the gap between the student fee and the fixed price of the qualification through a subsidy paid to their training provider. 

Smart and Skilled commenced in January 2015, with the then Department of Education and Communities having primary responsibility for its implementation. Since July 2015, the NSW Department of Industry (the Department) has been responsible for VET in New South Wales and the implementation of Smart and Skilled. 

The NSW Skills Board, comprising nine part-time members appointed by the Minister for Skills, provides independent strategic advice on VET reform and funding. In line with most other States and Territories, the Department maintains a 'Skills List' which contains government subsidised qualifications to address identified priority skill needs in New South Wales.

This audit assessed the effectiveness of the Department in identifying, prioritising, and aligning course subsidies to the skill needs of NSW. To do this we examined whether:

  • the Department effectively identifies and prioritises present and future skill needs 
  • Smart and Skilled funding is aligned with the priority skill areas
  • skill needs and available VET courses are effectively communicated to potential participants and training providers.

Smart and Skilled is a relatively new and complex program, and is being delivered in the context of significant reform to VET nationally and in New South Wales. A large scale government funded contestable market was not present in the VET sector in New South Wales before the introduction of Smart and Skilled. This audit's findings should be considered in that context.
 

Conclusion
The Department effectively consults with industry, training providers and government departments to identify skill needs, and targets subsidies to meet those needs. However, the Department does not have a robust, data-driven process to remove subsidies from qualifications which are no longer a priority. There is a risk that some subsidised qualifications do not reflect the skill needs of New South Wales.
The Department needs to better use the data it has, and collect additional data, to support its analysis of priority skill needs in New South Wales, and direct funding accordingly.
In addition to subsidising priority qualifications, the Department promotes engagement in skills training by:
  • funding scholarships and support for disadvantaged students
  • funding training in regional and remote areas
  • providing additional support to deliver some qualifications that the market is not providing.

The Department needs to evaluate these funding strategies to ensure they are achieving their goals. It should also explore why training providers are not delivering some priority qualifications through Smart and Skilled.

Training providers compete for funding allocations based on their capacity to deliver. The Department successfully manages the budget by capping funding allocated to each Smart and Skilled training provider. However, training providers have only one year of funding certainty at present. Training providers that are performing well are not rewarded with greater certainty.

The Department needs to improve its communication with prospective students to ensure they can make informed decisions in the VET market.

The Department also needs to communicate more transparently to training providers about its funding allocations and decisions about changes to the NSW Skills List. 

The NSW Skills List is unlikely to be missing high priority qualifications, but may include lower priority qualifications because the Department does not have a robust process to identify and remove these qualifications from the list. The Department needs to better use available data, and collect further data, to support decisions about which qualifications should be on the NSW Skills List.

The Department relies on stakeholder proposals to update the NSW Skills List. Stakeholders include industry, training providers and government departments. These stakeholders, particularly industry, are likely to be aware of skill needs, and have a strong incentive to propose qualifications that address these needs. The Department’s process of collecting stakeholder proposals helps to ensure that it can identify qualifications needed to address material skill needs. 

It is also important that the Department ensures the NSW Skills List only includes priority qualifications that need to be subsidised by government. The Department does not have robust processes in place to remove qualifications from the NSW Skills List. As a result, there is a risk that the list may include lower priority skill areas. Since the NSW Skills List was first created, new additions to the list have outnumbered those removed by five to one.

The Department does not always validate information gathered from stakeholder proposals, even when it has data to do so. Further, its decision making about what to include on, or delete from, the NSW Skills List is not transparent because the rationale for decisions is not adequately documented. 

The Department is undertaking projects to better use data to support its decisions about what should be on the NSW Skills List. Some of these projects should deliver useful data soon, but some can only provide useful information when sufficient trend data is available. 

Recommendation

The Department should: 

  • by June 2019, increase transparency of decisions about proposed changes to the NSW Skills List and improve record-keeping of deliberations regarding these changes
  • by December 2019, use data more effectively and consistently to ensure that the NSW Skills List only includes high priority qualifications.

The Department funds training providers that deliver qualifications on the NSW Skills List. Alignment of funding to skill needs relies on the accuracy of the NSW Skills List, which may include some lower priority qualifications.

Only qualifications on the NSW Skills List are eligible for subsidies under Smart and Skilled. As the Department does not have a robust process for removing low priority qualifications from the NSW Skills list, some low priority qualifications may be subsidised. 

The Department allocates the Smart and Skilled budget through contracts with Smart and Skilled training providers. Training providers that meet contractual obligations and perform well in terms of enrolments and completion rates are rewarded with renewed contracts and more funding for increased enrolments, but these decisions are not based on student outcomes. The Department reduces or removes funding from training providers that do not meet quality standards, breach contract conditions or that are unable to spend their allocated funding effectively. Contracts are for only one year, offering training providers little funding certainty. 

Smart and Skilled provides additional funding for scholarships and for training providers in locations where the cost of delivery is high or to those that cater to students with disabilities. The Department has not yet evaluated whether this additional funding is achieving its intended outcomes. 

Eight per cent of the qualifications that have been on the NSW Skills List since 2015 are not delivered under Smart and Skilled anywhere in New South Wales. A further 14 per cent of the qualifications that are offered by training providers have had no student commencements. The Department is yet to identify the reasons that these high priority qualifications are either not offered or not taken up by students.

Recommendation

The Department should:

  • by June 2019, investigate why training providers do not offer, and prospective students do not enrol in, some Smart and Skilled subsidised qualifications 
  • by December 2019, evaluate the effectiveness of Smart and Skilled funding which supplements standard subsidies for qualifications on the NSW Skills List, to determine whether it is achieving its objectives
  • by December 2019, provide longer term funding certainty to high performing training providers, while retaining incentives for them to continue to perform well.

The Department needs to improve its communication, particularly with prospective students.

In a contestable market, it is important for consumers to have sufficient information to make informed decisions. The Department does not provide some key information to prospective VET students to support their decisions, such as measures of provider quality and examples of employment and further education outcomes of students completing particular courses. Existing information is spread across numerous channels and is not presented in a user-friendly manner. This is a potential barrier to participation in VET for those less engaged with the system or less ICT-literate.

The Department conveys relevant information about the program to training providers through its websites and its regional offices. However, it could better communicate some specific information directly to individual Smart and Skilled training providers, such as reasons their proposals to include new qualifications on the NSW Skills List are accepted or rejected. 

While the Department is implementing a communication strategy for VET in New South Wales, it does not have a specific communications strategy for Smart and Skilled which comprehensively identifies the needs of different stakeholders and how these can be addressed. 

Recommendation

By December 2019, the Department should develop and implement a specific communications strategy for Smart and Skilled to:

  • support prospective student engagement and informed decision making
  • meet the information needs of training providers.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #305 - released 26 July 2018


Regulation of water pollution in drinking water catchments and illegal disposal of solid waste

Topics: Environment, Compliance, Internal controls and governance, Management and administration, Regulation, Risk

There are important gaps in how the Environmental Protection Authority (EPA) implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal. This limits the effectiveness of its regulatory responses, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

The NSW Environment Protection Authority (the EPA) is the State’s primary environmental regulator. The EPA regulates waste and water pollution under the Protection of the Environment Operations Act 1997 (the Act) through its licensing, monitoring, regulation and enforcement activities. The community should be able to rely on the effectiveness of this regulation to protect the environment and human health. The EPA has regulatory responsibility for the more significant and specialised activities that can potentially harm the environment.

Activities regulated by the EPA include manufacturing, chemical production, electricity generation, mining, waste management, livestock processing, mineral processing, sewerage treatment, and road construction. For these activities, the operator must have an EPA issued environment protection licence (licence). Licences have conditions attached which may limit the amount and concentrations of substances the activity may produce and discharge into the environment. Conditions also require the licensee to report on its licensed activities.

This audit assessed the effectiveness of the EPA’s regulatory response to water pollution in drinking water catchments and illegal solid waste disposal. The findings and recommendations of this review can be reasonably applied to the EPA’s other regulatory functions, as the areas we examined were indicative of how the EPA regulates all pollution types and incidents.

 
Conclusion
There are important gaps in how the EPA implements its regulatory framework for water pollution in drinking water catchments and illegal solid waste disposal which limit the effectiveness of its regulatory response. The EPA uses a risk-based regulatory framework that has elements consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. However, the EPA did not demonstrate that it has established reliable practices to accurately and consistently detect non-compliance by licensees and apply consistent regulatory actions. This may expose the environment and human health to a risk of harm.
The EPA also could not demonstrate that it has effective governance and oversight of its regulatory operations. The EPA operates in a complex regulatory environment where its regional offices have broad discretions for how they operate. The EPA has not balanced this devolved structure with an effective governance approach that includes appropriate internal controls to monitor the consistency or quality of its regulatory activities. It also does not have an effective performance framework that sets relevant performance expectations and outcome-based key performance indicators (KPIs) for its regional offices. 
These deficiencies mean that the EPA cannot be confident that it conducts compliance and enforcement activities consistently across the State and that licensees are complying with their licence conditions or the Act.
The EPA's reporting on environmental and regulatory outcomes is limited and most of the data it uses is self-reported by industry. It has not set outcome-based key result areas to assess performance and trends over time.
The EPA uses a risk-based regulatory framework for water pollution and illegal solid waste disposal but there are important gaps in implementation that reduce its effectiveness.
Elements of the EPA’s risk-based regulatory framework for water pollution and illegal solid waste disposal are consistent with the NSW Government Guidance for regulators to implement outcomes and risk-based regulation. However, there are important gaps in how the EPA implements its risk-based approach that limit the effectiveness of its regulatory response. The EPA could not demonstrate that it effectively regulates licensees because it has not established reliable practices that accurately and consistently detect licence non-compliances or breaches of the Act, or that apply appropriate regulatory actions.
The EPA lacks effective governance arrangements to support its devolved regional structure. The EPA's performance framework provides limited and inconclusive reporting on regional performance to the EPA’s Chief Executive Officer and the EPA Board. As a result, the EPA cannot be assured that it is discharging its regulatory responsibilities effectively and efficiently.
The EPA does not consistently evaluate its regulatory approach to ensure it is effective and efficient. For example, there are no set requirements for how EPA officers conduct mandatory site inspections, which means that there is a risk that officers are not detecting all breaches or non-compliances. The inconsistent approach also means that the EPA cannot rely on the data it collects from these site inspections to understand whether its regulatory response is effective and efficient. In addition, where the EPA identifies instances of non-compliance or breaches, it does not apply all available regulatory actions to encourage compliance.
The EPA also does not have a systematic approach to validate self-reported information in licensees’ annual returns, despite the data being used to assess administrative fees payable to the EPA and its regulatory response to non-compliances. 
The EPA does not use performance frameworks to monitor the consistency or quality of work conducted across the State. The EPA has also failed to provide effective guidance for its staff: many of its policies and procedures are outdated, inconsistent, hard to access, or not mandated.
Recommendations
By 31 December 2018, to improve governance and oversight, the EPA should:
1. implement a more effective performance framework with regular reports to the Chief Executive Officer and to the EPA Board on outcomes-based key result areas that assess its environmental and regulatory performance and trends over time
By 30 June 2019, to improve consistency in its practices, the EPA should:
2. progressively update and make accessible its policies and procedures for regulatory operations, and mandate procedures where necessary to ensure consistent application
3. implement internal controls to monitor the consistency and quality of its regulatory operations. 
The EPA does not apply a consistent approach to setting licence conditions for discharges to water.
The requirements for setting licence conditions for water pollution are complex and require technical and scientific expertise. In August 2016, the EPA approved guidance developed by its technical experts in the Water Technical Advisory Unit to assist its regional staff. However, the EPA did not mandate the use of the guidance until mid-April 2018. Up until then, the EPA had left discretion to regional offices to decide what guidance their staff use. This meant that practices have differed across the organisation. The EPA is yet to conduct training for staff to ensure they consistently apply the 2016 guidance.
The EPA has not implemented any appropriate internal controls or quality assurance process to monitor the consistency or quality of licence conditions set by its officers across the State. This is not consistent with good regulatory practice.
The triennial 2016 audit of the Sydney drinking water catchment report highlighted that Lake Burragorang has experienced worsening water quality over the past 20 years from increased salinity levels. The salinity levels were nearly twice as high as in other storages in the Sydney drinking water catchment. The report recommended that the source and implication of the increased salinity levels be investigated. The report did not propose which public authority should carry out such an investigation. 
To date, no NSW Government agency has addressed the report's recommendation. There are three public authorities, the EPA, DPE and WaterNSW that are responsible for regulating activities that impact on water quality in the Sydney drinking water catchment, which includes Lake Burragorang. 
Recommendation
By 30 June 2019, to address worsening water quality in Lake Burragorang, the EPA should:
4. (a) review the impact of its licensed activities on water quality in Lake Burragorang, and
  (b) develop strategies relating to its licensed activities (in consultation with other relevant NSW Government agencies) to improve and maintain the lake's water quality.
The EPA’s risk-based approach to monitoring compliance of licensees has limited effectiveness. 
The EPA tailors its compliance monitoring approach based on the performance of licensees. This means that licensees that perform better have a lower administrative fee and fewer mandatory site inspections. 
However, this approach relies on information that is not complete or accurate. Sources of information include licensees’ annual returns, EPA site inspections and compliance audits, and pollution reports from the public. 
Licensees report annually to the EPA on their performance, including compliance against their licence conditions. The Act contains significant financial penalties if licensees provide false or misleading information in their annual returns. However, the EPA does not systematically or consistently validate information self-reported by licensees, or consistently apply regulatory actions if it discovers non-compliance.
Self-reported compliance data is used in part to assess a licensed premises’ overall environmental risk level, which underpins the calculation of the administrative fee, the EPA’s site inspection frequency, and the licensee’s exposure to regulatory actions. It is also used to assess the load-based licence fee that the licensee pays.
The EPA has set minimum mandatory site inspection frequencies for licensed premises based on its assessed overall risk level. This is a key tool to detect non-compliance or breaches of the Act. However, the EPA has not issued a policy or procedures that define what these mandatory inspections should cover and how they are to be conducted. We found variations in how the EPA officers in the offices we visited conducted these inspections. The inconsistent approach means that the EPA does not have complete and accurate information on licensees’ compliance. It also means that the EPA is not effectively identifying all non-compliances for which it could consider applying appropriate regulatory actions.
The EPA also receives reports of pollution incidents from the public that may indicate non-compliance. However, the EPA has not set time frames within which it expects its officers to investigate pollution incidents. Regional offices decide what to investigate and how quickly. The EPA does not measure regional performance against investigation time frames.
The few compliance audits the EPA conducts annually are effective in identifying licence non-compliances and breaches of the Act. However, the EPA does not have a policy or required procedures for its regulatory officers to consistently apply appropriate regulatory actions in response to compliance audit findings. 
The EPA has not implemented any effective internal controls or quality assurance process to check the consistency or quality of how its regulatory officers monitor compliance across the State. This is not consistent with good regulatory practice.
Recommendations
To improve compliance monitoring, the EPA should implement procedures to:
5. by 30 June 2019, validate self-reported information, eliminate hardcopy submissions and require licensees to report on their breaches of the Act and associated regulations in their annual returns
6. by 31 December 2018, conduct mandatory site inspections under the risk-based licensing scheme to assess compliance with all regulatory requirements and licence conditions.
 
The EPA cannot assure that its regulatory enforcement approach is fully effective.
The EPA’s compliance policy and prosecution guidelines have a large number of available regulatory actions and factors which should be taken into account when selecting an appropriate regulatory response. The extensive legislation determining the EPA’s regulatory activities, and the devolved regional structure the EPA has adopted in delivering its compliance and regulatory functions, increases the risk of inconsistent compliance decisions and regulatory responses. A good regulatory framework needs a consistent approach to enforcement to incentivise compliance. 
The EPA has not balanced this devolved regional structure with appropriate governance arrangements to give it assurance that its regulatory officers apply a consistent approach to enforcement.
The EPA has not issued standard procedures to ensure consistent non-court enforcement action for breaches of the Act or non-compliance with licence conditions. Given our finding that the EPA does not effectively detect breaches and non-compliances, there is a risk that it is not applying appropriate regulatory actions for many breaches and non-compliances.
A recent EPA compliance audit identified significant non-compliances with incident management plan requirements. However, the EPA has not applied regulatory actions for making false statements on annual returns for those licensees that certified their plans complied with such requirements. The EPA also has not applied available regulatory actions for the non-compliances which led to the false or misleading statements.
Recommendation
By 31 December 2018 to improve enforcement, the EPA should:
7. Implement procedures to systematically assess non-compliances with licence conditions and breaches of the Act and to implement appropriate and consistent regulatory actions.
The EPA has implemented the actions listed in the NSW Illegal Dumping Strategy 2014–16. To date, the EPA has also implemented four of the six recommendations made by the ICAC on the EPA's oversight of Regional Illegal Dumping Squads.
The EPA did not achieve the NSW Illegal Dumping Strategy 2014–16 target of a 30 per cent reduction in instances of large-scale illegal dumping in Sydney, the Illawarra, Hunter and Central Coast from 2011 levels.
In the reporting period, the incidence of large-scale illegal dumping more than doubled. The EPA advised that this increase may be the result of greater public awareness and reporting rather than increased illegal dumping activity.
By June 2018, the EPA is due to implement one outstanding recommendation made by the ICAC but has not set a time for the other outstanding recommendation.  


Grants to non-government schools

Topics: Education, Compliance, Internal controls and governance, Management and administration

The NSW Department of Education could strengthen its management of the $1.2 billion provided to non-government schools annually. This would provide greater accountability for the use of public funds, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

Non‑government schools educate 418,000 school children each year, representing 35 per cent of all students in NSW. The NSW Department of Education administers several grant schemes to support these schools, with the aim of improving student learning outcomes and supporting parent choice. To be eligible for NSW Government funding, non‑government schools must be registered with the NSW Education Standards Authority (NESA) and not operate 'for profit' as per section 83C of the NSW Education Act 1990 (the Act). Non‑government schools can either be registered as independent or part of a System Authority.

In 2017–18, non‑government schools in NSW will receive over $1.2 billion from the NSW Government, as well as $3.4 billion from the Australian Government. Recently, the Australian Government has changed the way it funds schools. The NSW Government is assessing how these changes will impact State funding for non‑government schools.

This audit assessed how effectively and efficiently NSW Government grants to non‑government schools are allocated and managed. This audit did not assess the use of NSW Government grants by individual non‑government schools or System Authorities because the Auditor‑General of New South Wales does not have the mandate to assess how government funds are spent by non‑government entities.

Conclusion

The Department of Education effectively and efficiently allocates grants to non‑government schools. Clarifying the objectives of grants, monitoring progress towards these objectives, and improving oversight, would strengthen accountability for the use of public funds by non‑government schools.

We tested a sample of grants provided to non‑government schools under all major schemes, and found that the Department of Education consistently allocates and distributes grants in line with its methodology. The Department has clear processes and procedures to efficiently collect data from schools, calculate the level of funding each school or System should receive, obtain appropriate approvals, and make payments.

We identified three areas where the Department could strengthen its management of grants to provide greater accountability for the use of public funds. First, the Department’s objectives for providing grants to non‑government schools are covered by legislation, intergovernmental agreements and grant guidelines. The Department could consolidate these objectives to allow for more consistent monitoring. Second, the Department relies on schools or System Authorities to engage a registered auditor to certify the accuracy of information on their enrolments and usage of grants. Greater scrutiny of the registration and independence of the auditors would increase confidence in the accuracy of this information. Third, the Department does not monitor how System Authorities reallocate grant funding to their member schools. Further oversight in this area would increase accountability for the use of public funds.

The Department effectively and efficiently allocates grants to non‑government schools. Strengthening its processes would provide greater assurance that the information it collects is accurate.

The Department provides clear guidelines to assist schools to provide the necessary census information to calculate per capita grants. Schools must get an independent external auditor, registered with ASIC, to certify their enrolment figures. The Department checks a sample of the auditors to ensure that they are registered with ASIC. Some other jurisdictions perform additional procedures to increase confidence in the accuracy of the census (for example, independently checking a sample of schools’ census data).

The Department accurately calculates and distributes per capita grants in accordance with its methodology. The previous methodology, used prior to 2018, was not updated frequently enough to reflect changes in schools' circumstances. Over 2014 to 2017, the Department provided additional grants to non‑government schools under the National Education Reform Agreement (NERA), to bring funding more closely in line with the Australian Department of Education and Training's Schooling Resource Standard (SRS). From 2018, the Department has changed the way it calculates per capita grants to more closely align with the Australian Department of Education and Training's approach.

The Department determines eligibility for grants by checking a school's registration status with NESA. However, NESA's approach to monitoring compliance with the registration requirements prioritises student learning and wellbeing requirements over the requirement for policies and procedures for proper governance. Given their importance to the appropriate use of government funding, NESA could increase its monitoring of policies and procedures for proper governance through its program of random inspections. Further, the Department and NESA should enter into a formal agreement to share information to more accurately determine the level of risk of non‑compliance at each school. This may help both agencies more effectively target their monitoring to higher‑risk schools.

By December 2018, the NSW Department of Education should:

  1. Strengthen its processes to provide greater assurance that the enrolment and expenditure information it collects from non‑government schools is accurate. This should build on the work the Australian Government already does in this area.
  2. Establish formal information‑sharing arrangements with the NSW Education Standards Authority to more effectively monitor schools' eligibility to receive funding.

By December 2018, the NSW Education Standards Authority should:

  1. Extend its inspection practices to increase coverage of the registration requirement for policies and procedures for the proper governance of schools.
  2. Establish formal information‑sharing arrangements with the NSW Department of Education to more effectively monitor schools' continued compliance with the registration requirements.

The Department’s current approach to managing grants to non‑government schools could be improved to provide greater confidence that funds are being spent in line with the objectives of the grant schemes.

The NSW Government provides funding to non‑government schools to improve student learning outcomes and to support parents' schooling choices, but does not monitor whether these grants are achieving this. In addition, each grant program has specific objectives. The main objectives of the per capita grant program are to increase the rate of students completing Year 12 (or equivalent), and to improve education outcomes for students. While non‑government schools publicly report some educational measures via the MySchool website, these measures do not address all the objectives. Strengthened monitoring and reporting of progress towards objectives, at a school level, would increase accountability for public funding. This may require the Department to formalise its access to student‑level information.

The Department has listed five broad categories of acceptable use for per capita grants; however, it provides no further guidance on what expenditure fits within these categories. Clarifying the appropriate use of grants would increase confidence that funding is being used as intended. Schools must engage an independent auditor, registered with ASIC, to certify that the funding has been spent. The Department could strengthen this approach by improving its processes for checking the auditor's registration and verifying their independence.

The Department has limited oversight of funding provided to System Authorities (Systems). The Department provides grants to Systems for all their member schools, and Systems can distribute the grants to their schools according to their own methodology. Systems are not required to report to the Department how much of their grant was retained for administrative or centralised expenses. Greater oversight of how Systems distribute these grants would improve transparency over the use of public funds.

By December 2018, the NSW Department of Education should:

  1. Establish and communicate funding conditions that require funded schools to:
    • adhere to conditions of funding, such as the acceptable use of grants, and accounting requirements to demonstrate compliance
    • report their progress towards the objectives of the scheme or wider Government initiatives
    • allow the Department to conduct investigations to verify enrolment and expenditure of funds
    • provide the Department with access to existing student level data to inform policy development and analysis.
  2. Increase its oversight of System Authorities by requiring them to:
    • re‑allocate funds across their system on a needs basis, and report to the Department on this
    • provide a yearly submission with enough detail to demonstrate that each System school has spent their State funding in line with the Department's requirements.

Published

Actions for Detecting and responding to cyber security incidents

Detecting and responding to cyber security incidents

Finance
Cyber security
Information technology
Internal controls and governance
Management and administration
Workforce and capability

A report released today by the Auditor-General for New South Wales, Margaret Crawford, found there is no whole-of-government capability to detect and respond effectively to cyber security incidents. There is very limited sharing of information on incidents amongst agencies, and some agencies have poor detection and response practices and procedures.

The NSW Government relies on digital technology to deliver services, organise and store information, manage business processes, and control critical infrastructure. The increasing global interconnectivity between computer networks has dramatically increased the risk of cyber security incidents. Such incidents can harm government service delivery and may include the theft of information, denial of access to critical technology, or even the hijacking of systems for profit or malicious intent.

This audit examined cyber security incident detection and response in the NSW public sector. It focused on the role of the Department of Finance, Services and Innovation (DFSI), which oversees the Information Security Community of Practice, the Information Security Event Reporting Protocol, and the Digital Information Security Policy (the Policy).

The audit also examined ten case study agencies to develop a perspective on how they detect and respond to incidents. We chose agencies that are collectively responsible for personal data, critical infrastructure, financial information and intellectual property.

Conclusion
There is no whole‑of‑government capability to detect and respond effectively to cyber security incidents. There is limited sharing of information on incidents amongst agencies, and some of the agencies we reviewed have poor detection and response practices and procedures. There is a risk that incidents will go undetected longer than they should, and opportunities to contain and restrict the damage may be lost.
Given current weaknesses, the NSW public sector’s ability to detect and respond to incidents needs to improve significantly and quickly. DFSI has started to address this by appointing a Government Chief Information Security Officer (GCISO) to improve cyber security capability across the public sector. Her role includes coordinating efforts to increase the NSW Government’s ability to respond to and recover from whole‑of‑government threats and attacks.

Some of our case study agencies had strong processes for detection and response to cyber security incidents but others had a low capability to detect and respond in a timely way.

Most agencies have access to an automated tool for analysing logs generated by their IT systems. However, coverage of these tools varies. Some agencies do not have an automated tool and only review logs periodically or on an ad hoc basis, meaning they are less likely to detect incidents.

Few agencies have contractual arrangements in place requiring IT service providers to report incidents to them. If a service provider elects not to report an incident, the agency's response will be delayed, which may result in increased damage.

Most case study agencies had procedures for responding to incidents, although some lack guidance on who to notify and when. Some agencies do not have response procedures at all, limiting their ability to minimise the business damage that may flow from a cyber security incident. Few agencies could demonstrate that they have trained their staff in incident detection or response procedures, and most could provide little information on their staff's roles and responsibilities in detection and response.

Most agencies’ incident procedures contain limited information on how to report an incident, who to report it to, when this should occur and what information should be provided. None of our case study agencies’ procedures mentioned reporting to DFSI, highlighting that even though reporting is mandatory for most agencies, their procedures do not require it.

Case study agencies provided little evidence to indicate they are learning from incidents, meaning that opportunities to better manage future incidents may be lost.

Recommendations

The Department of Finance, Services and Innovation should:

  • assist agencies by providing:
    • better practice guidelines for incident detection, response and reporting to help agencies develop their own practices and procedures
    • training and awareness programs, including tailored programs for a range of audiences such as cyber professionals, finance staff, and audit and risk committees
    • role requirements and responsibilities for cyber security across government, relevant to size and complexity of each agency
    • a support model for agencies that have limited detection and response capabilities
  • revise the Digital Information Security Policy and Information Security Event Reporting Protocol by:
    • clarifying what security incidents must be reported to DFSI and when
    • extending mandatory reporting requirements to those NSW Government agencies not currently covered by the policy and protocol, including State owned corporations.

DFSI lacks a clear mandate or capability to provide effective detection and response support to agencies, and there is limited sharing of information on cyber security incidents.

DFSI does not currently have a clear mandate, or the necessary resources and systems, to detect, receive, share and respond to cyber security incidents across the NSW public sector. Nor does it have a mandate to assess whether agencies have an acceptable detection and response capability. It is aware of deficiencies in individual agencies and across whole‑of‑government, and has begun to research this capability.

Intelligence gathering across the public sector is also limited, meaning agencies may not respond to threats in a timely manner. DFSI has not allocated resources for gathering threat intelligence and communicating it across government, although it has begun to build this capacity.

Incident reporting to DFSI is mandatory for most agencies; however, most of our case study agencies do not report incidents to DFSI, reducing the likelihood of containing an incident if it spreads to other agencies. When incidents have been reported, DFSI has not provided dedicated resources to assess them and coordinate the public sector’s response. There are currently no formal requirements for DFSI to respond to incidents, and no guidance on what it should do when an incident is reported. The lack of central coordination of incident response risks delays and increased damage to multiple agencies.

DFSI's reporting protocol is weak and does not clearly specify what agencies should report and when. This makes agencies less likely to report incidents. The lack of a standard format for incident reporting and a consistent method for assessing an incident, including the level of risk associated with it, also make it difficult for DFSI to determine an appropriate response.

There are limited avenues for sharing information amongst agencies after incidents have been resolved, meaning the public sector may be losing valuable opportunities to improve its protection and response.

Recommendations

The Department of Finance, Services and Innovation should:

  • develop whole‑of‑government procedures, protocols and supporting systems to effectively share reported threats and respond to cyber security incidents affecting multiple agencies, including follow‑up and communication of lessons learnt
  • develop a means by which agencies can report incidents in a more effective manner, such as a secure online template, that allows for early warnings and standardised details of incidents and remedial advice
  • enhance NSW public sector threat intelligence gathering and sharing including formal links with Australian Government security agencies, other states and the private sector
  • direct agencies to include standard clauses in contracts requiring IT service providers to report all cyber security incidents within a reasonable timeframe
  • provide assurance that agencies have appropriate reporting procedures and report to DFSI as required by the policy and protocol by:
    • extending the attestation requirement within the DISP to cover procedures and reporting
    • reviewing a sample of agencies' incident reporting procedures each year.

Published

Actions for Managing IT Services Contracts

Managing IT Services Contracts

Finance
Health
Justice
Compliance
Information technology
Internal controls and governance
Procurement
Project management
Risk

Neither the NSW Ministry of Health nor the NSW Police Force demonstrated that they continued to get value for money over the life of these long‑term contracts, or that they had effectively managed all critical elements of the three contracts we reviewed after the contracts were awarded. This is because both agencies treated contract extensions or renewals as simply continuing previous contractual arrangements, rather than as establishing a new contract and financial commitment. Consequently, neither agency robustly analysed the continuing need for the mix and quantity of services being provided, or assessed value for money in terms of the prices being paid.

Parliamentary reference - Report number #220 - released 1 February 2012