
NSW planning portal

Planning
Industry
Environment
Local Government
Information technology
Project management
Risk

What the report is about

The ePlanning program is an initiative of the Department of Planning and Environment (the department) to deliver a digital planning service for New South Wales through the NSW planning portal (the portal).

Using the portal, relevant planning activities can be carried out online, including all stages of development applications.

The portal has been developed under three separate business cases in 2013, 2014 and 2020.

In late 2019, the government mandated the use of the portal for all development applications. This decision took effect across 2020–21.

This audit assessed the effectiveness of the department's implementation, governance and stakeholder engagement in delivering the NSW planning portal. 

What we found

Since implementation commenced in 2013, the NSW planning portal has progressively achieved its objectives to provide citizens with access to consolidated planning information, and allow them to prepare and submit development applications online.

Shortcomings in the department's initial planning and management of the program led to significant time and cost overruns: implementing the portal has taken the department longer and cost significantly more than first anticipated.

In recent years the department has improved the planning, implementation and governance of the ePlanning program, resulting in improved delivery of the portal’s core functions.

The department now has a clear view of the scope necessary to finalise the program, but has not yet published the services it plans to implement in 2022 and 2023.

Mandating the use of the portal for all development applications changed the program's strategic risk environment and required the department to work more closely with a cohort of stakeholders, many of whom did not want to adopt the portal.

Despite this change, the department kept its overall delivery approach the same.

While implementation of the portal has delivered financial benefits, the department has overestimated their value.

The department has only reported benefits since 2019 and has not independently assured the calculation of those benefits.

What we recommended

By December 2022, the department should:

  • publish a roadmap of the services it expects to release on the portal across 2022 and 2023
  • update its ePlanning program assumptions, benefits targets and change management approach to reflect the government's decision to mandate the use of the portal for all stages of a development application
  • independently assure and report publicly the correct calculation of ePlanning program benefits.

Fast facts

  • 10 years taken to implement the portal when completed
  • 3 years longer than initially planned to implement the portal
  • $146m capital expenditure on the portal when completed
  • $38.5m more spent than planned in the business cases.

The ePlanning program is an initiative of the Department of Planning and Environment (the department) to deliver a digital planning service for New South Wales through the NSW planning portal (the portal, or the planning portal). The department defines the portal as an online environment where community, industry and government can work together to better understand and meet their obligations under the Environmental Planning and Assessment Act 1979 (NSW). Using the portal, relevant planning activities can be carried out online throughout New South Wales. This includes, but is not limited to:

  • applying for and gaining planning approval
  • applying for and gaining approval for building works, sub-dividing land and similar activities
  • issuing occupancy and other certificates.

The portal has been developed under three separate business cases. The first business case in 2013 led to the creation of a central portal, which made planning information available to view by planning applicants and allowed some planning applications to be lodged and tracked online.

Under a second business case prepared in 2014, the department set out to improve and widen the functions available via the portal. The department prepared a third business case in 2020 to fund further improvements to the portal over the period July 2020 to June 2023. The third business case also extended the portal's functions to support the building and occupation stages of the planning cycle.

In late 2019, the government mandated the use of the portal for all stages of development applications. This decision took effect across 2020–21 and applied to all councils as well as certifiers and others involved in the planning process.

The objective of this performance audit was to assess the effectiveness of the department's implementation, governance and stakeholder engagement in delivering the NSW planning portal. We investigated whether:

  • delivery of the NSW planning portal was planned effectively
  • sound governance arrangements are in place to ensure effective implementation of the program
  • users of the NSW planning portal are supported effectively to adopt and use the system.
Conclusion

Since implementation commenced in 2013, the NSW planning portal has progressively achieved its objectives to provide citizens with access to consolidated planning information and allow them to prepare and submit development applications online. Implementation was initially hindered by deficiencies in planning and it has taken the department significantly longer and cost significantly more to implement the portal than first anticipated. While the portal's implementation has delivered financial benefits, the department has overestimated their value. As a result, the department cannot yet demonstrate that the portal has achieved overall financial benefits, relative to its costs.

In the first two years of the ePlanning program, the department delivered a portal that allowed planners, developers, certifiers and the public to view important planning information. However, the department found the delivery of a second, transactional version of the portal in 2017 to be much more challenging. This version was intended to offer more integrated information and allow development applications to be submitted and managed online. The department did not roll out this version after a pilot showed significant weaknesses in the portal's performance. A subsequent review found that this was partly because the department did not have a clear view of the portal's role or the best way to implement it. In recent years the department has improved the planning, implementation and governance of the ePlanning program, resulting in improved delivery of the portal's core functions.

By the time the program reaches its scheduled completion in 2023, it will have taken the department ten years and around $146 million in capital expenditure to implement the portal. This will be significantly longer and more expensive than the department originally expected. This overrun is partly due to an increased scope of services delivered through the portal and an initial under-appreciation of what is involved in creating a standard, central resource such as the portal. The department also experienced some significant implementation difficulties – which saw the transactional portal discontinued after it was found to be not fit for purpose. Following this, the department re-set the program in 2017–18 and re-planned much of the portal's subsequent development.

In November 2019, the New South Wales Government decided to mandate the use of the portal for all stages of development applications by the end of 2020–21. The department had previously planned that the portal would be progressively adopted by all councils and other stakeholders over the five years to 2025. The decision to mandate the portal's use for all development applications brought forward many of the portal's benefits as well as the challenges of its implementation. The department did not change its overall delivery approach in response to the changed risks associated with the government's decision to mandate use of the portal.

The current version of the portal has given the department more timely and comprehensive planning information and has helped New South Wales to provide continuous planning services during COVID-19 lockdowns, which interrupted many other public functions. The portal has also delivered financial benefits; however, the department has not independently assured the benefits calculations carried out by its consultant, and the reported benefits are overstated. In addition, some stakeholders report that the portal is a net cost to their organisation. These include some certifiers and some councils which had implemented, or had started to implement, their own ePlanning reforms when use of the portal was mandated in 2019. The department now needs to address the issues faced by these stakeholders while continuing to deliver the remaining improvements and enhancements to the portal. Over the remaining year of the program, it will be critical that the department focuses on the agreed program scope and carefully evaluates any opportunities to further develop the portal to support future planning reforms.

This part of the report sets out how:

  • the ePlanning program has been planned and delivered
  • users of the portal have been supported
  • the program has been governed.

This part of the report sets out the ePlanning program's:

  • expected and reported financial benefits
  • calculation of financial benefits.

In 2019, the department increased its expectations for net financial benefits

The department's three ePlanning business cases each forecast substantial financial benefits from the implementation of the planning portal. The department expected that most financial benefits would flow to planning applicants due to a quicker and more consistent planning process. It also expected that government agencies and councils would benefit from the portal.

Exhibit 6: Summary of the financial benefits originally expected

                        Business case 1   Business case 2   Business case 3   Total
                        ($ million)       ($ million)       ($ million)       ($ million)
  Benefits                         90.0              44.3             270.9         405.2
  Costs                            43.3              29.4              89.8         162.5
  Net benefits                     46.7              15.0             181.1         242.7

Note: Benefits and costs are incremental. All amounts are calculated over ten years. Amounts for business cases 1, 2 and 3 are expressed in 2013, 2015 and 2019 dollars respectively. All amounts are discounted at seven per cent to show their value at the time when they were calculated. Amounts may not add due to rounding.
Source: Audit Office analysis of data provided by the Department of Planning and Environment.

In 2019 the department commissioned a review to explore opportunities to better identify, monitor and realise the benefits of the ePlanning program. Using this work, the department updated the expected benefits for business cases 1 and 2 to take account of:

  • errors and miscalculations in the original benefits calculations
  • slower delivery of the portal and changes to the take-up of portal services by councils
  • changes to the services supported by the portal.
Exhibit 7: Summary of the financial benefits expected for business cases 1 and 2 after the 2019 update

                        Original business cases 1 and 2 (combined)   New business cases 1 and 2 (combined)
                        ($ million)                                  ($ million)
  Benefits                                                   134.3                                   210.6
  Costs                                                       72.7                                    96.3
  Net benefits                                                61.7                                   114.3

Note: Benefits and costs are incremental. All amounts are calculated over ten years. Amounts for the original business case 1 and 2 are expressed in 2013 and 2015 dollars respectively. The new combined amount is expressed in 2019 dollars. All amounts are discounted or inflated at seven per cent to show their value at the time when they were calculated. Amounts may not add due to rounding.
Source: Audit Office analysis of data provided by the Department of Planning and Environment.

Reported benefits significantly exceed the current targets

In September 2021, the department reported that the program had achieved $334 million of benefits over the three financial years up to June 2021 plus the first two months of 2021–22. These reported benefits were significantly higher than expected. 

Exhibit 8: Reported financial benefits from the ePlanning program

                             2018–19       2019–20       2020–21       July to August 2021   Total
                             ($ million)   ($ million)   ($ million)   ($ million)           ($ million)
  Benefits                        5.2           68.8         214.7          45.1                 333.8
  Target                          2.5           14.4          56.7          19.2                  92.8
  Amount above target             2.7           54.4         158.0          25.9                 241.0
  Per cent above target          108%           378%          279%          135%                  260%

Source: Audit Office analysis of data provided by the Department of Planning and Environment.

The department attributes the higher-than-expected financial benefits to the following:

  • benefit targets have not been updated to reflect the impact of the 2019 decision to mandate the use of the portal for all development applications. This decision brought forward the expected benefits, as well as the potential costs, of the program. However, the department did not update its third business case, which was in draft at the time and was subsequently approved in July 2020
  • one-off cost savings for agencies that did not have to develop their own systems
  • public exhibitions of planning proposals continuing to be available online during 2020, when some newspapers stopped printing due to COVID-19.

The calculation of benefits is overstated

The department reported $334 million of benefits in September 2021 due to the ePlanning program. This calculation is overstated because:

  • a proportion of reported benefits is likely to be due to other planning reforms
  • the calculation of the largest single benefit is incorrect
  • the reported benefits may not fully account for dis-benefits reported by some stakeholders.

The program’s benefits are calculated primarily from changes in planning performance data, such as the time it takes to determine a planning development application. The department currently attributes the benefits from shorter planning cycles entirely to the effect of the ePlanning program. However, planning cycles are impacted by many other factors such as the complexity of planning regulations and the availability of planning professionals. Planning cycles may also be impacted by other departmental initiatives which are designed to improve the time that it takes for a planning application to be evaluated. The Introduction describes some of these initiatives.

The largest contribution to the department’s September 2021 benefit report was an estimated saving of $151 million for developers due to lower costs associated with holding their investment for a shorter time. However, the department’s calculation of this benefit assumes a high baseline for the time to determine a development application. It also assumes that all development applications except for additions or alterations to existing properties will incur financing costs. However, a small but material number of these applications will be self-financed. The calculation also includes several data errors in spreadsheets.
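
To illustrate why the baseline and financing assumptions matter, the sketch below estimates a single application's holding-cost saving from the days saved in assessment. It is a minimal sketch under assumed inputs: the figures, finance rate and the function itself are hypothetical, not the department's or its consultant's actual model.

```python
# Illustrative only: a simplified holding-cost saving, not the actual
# benefits model. Project value, finance rate and assessment times below
# are hypothetical assumptions.

def holding_cost_saving(project_value, annual_finance_rate,
                        baseline_days, actual_days, self_financed_share=0.0):
    """Financing cost avoided when a development application is determined
    faster than the assumed baseline."""
    days_saved = max(baseline_days - actual_days, 0)
    financed_share = 1.0 - self_financed_share  # self-financed projects avoid no interest cost
    return project_value * annual_finance_rate * (days_saved / 365) * financed_share

# A higher assumed baseline, or ignoring self-financed applications,
# inflates the estimated saving for the same actual performance.
print(holding_cost_saving(1_000_000, 0.06, baseline_days=120, actual_days=90))  # ~4,931
print(holding_cost_saving(1_000_000, 0.06, baseline_days=90, actual_days=90))   # 0.0
```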

The calculation of some benefits relies upon an extrapolation of the benefits experienced by a small number of early-adopter councils, including lower printing and scanning costs, fewer forms and quicker processing times. However, some councils report that their costs have increased following the introduction of the portal, primarily because aspects of the portal duplicate work that they carry out in their own systems. The portal has also required some councils to re-engineer aspects of their own systems, such as the integration of their planning systems with other council systems such as finance or property and rating systems. It has also required councils to create new ways of integrating council information systems with the planning portal.

The department has published information to help councils and certifiers to automatically integrate their systems with the planning portal. This approach uses application programming interfaces (or APIs) which are an industry-standard way for systems to share information. In April and May 2021, the government granted $4.8 million to 96 regional councils to assist with the cost of developing, implementing and maintaining APIs. The maximum amount of funding for each council was $50,000. The department is closely monitoring the implementation of APIs by councils and other portal users. Once they are fully implemented the department expects APIs to reduce costs incurred by stakeholders.
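
To illustrate the kind of machine-to-machine exchange an API enables, the sketch below shows how a council system might push a development application status update to a portal-style REST endpoint. This is a minimal sketch under assumed names: the endpoint, payload fields and authentication scheme are hypothetical, and the department's published API documentation defines the actual interface.

```python
# Hypothetical sketch of a council system pushing a status update to a
# portal-style REST API. The endpoint, payload fields and authentication
# scheme are illustrative assumptions only.
import json
import urllib.request

def send_status_update(base_url, api_token, application_id, new_status):
    payload = {"applicationId": application_id, "status": new_status}
    request = urllib.request.Request(
        f"{base_url}/applications/{application_id}/status",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_token}",
        },
        method="PUT",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g. 200 on success

# Example call (hypothetical values):
# send_status_update("https://portal.example.gov.au/api", "TOKEN", "DA-2021-0042", "Determined")
```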

The department has not yet measured stakeholder costs. It was beyond the scope of this audit to validate these costs.

The department has not independently assured the calculation of reported benefits

In 2020 the department appointed an external provider to calculate the benefits achieved by the ePlanning program. The department advised that it chose to outsource the calculation of benefits because the provider had the required expertise and because it wanted an independent calculation of the benefits. The process involves:

  • extraction and verification of planning performance data by the department
  • population of data input sheets by the department
  • calculation of benefits by the external provider using the data input
  • confirmation by the department that the calculation includes all expected benefit sources.

The department does not have access to the benefits calculation model which is owned and operated by the external provider. The department trusts that the provider correctly calculates the benefits and does not verify the reported benefit numbers. However, as the benefits model involves many linked spreadsheets and approximately 300 individual data points, there is a risk that the calculation model contains errors beyond those discussed in this audit.
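
An independent check need not replicate the provider's full model. For example, the department could recompute a sample of benefits from its own input data and reconcile them with the reported figures, as in the sketch below; the field names, recalculation function and tolerance are assumptions for illustration, not the provider's model.

```python
# Illustrative reconciliation of reported benefit figures against an
# independent recalculation from the department's own input data. Field
# names, the recalculation function and the tolerance are assumptions.

def cross_check(input_rows, reported_totals, recalculate, tolerance=0.01):
    """Return benefits whose reported value differs from an independent
    recomputation by more than the tolerance (one per cent by default).

    input_rows      -- raw data points supplied to the external provider
    reported_totals -- mapping of benefit name to reported dollar value
    recalculate     -- function(benefit_name, input_rows) -> recomputed value
    """
    discrepancies = {}
    for name, reported in reported_totals.items():
        recomputed = recalculate(name, input_rows)
        if abs(recomputed - reported) > tolerance * max(abs(reported), 1.0):
            discrepancies[name] = {"reported": reported, "recomputed": recomputed}
    return discrepancies  # an empty result means the sampled benefits reconcile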

The reported benefits have only been calculated since 2019

The department originally intended to track benefits from October 2014. However, it only started to track benefits in 2019 when it appointed an external provider to calculate the benefits achieved by the portal. Any benefits or dis-benefits between the introduction of the portal and 2019 are unknown and not included in the department’s calculation of benefits.

Appendix one – Response from agency

Appendix two – About the audit

Appendix three – Performance auditing

 


 

Parliamentary reference - Report number #366 - released 21 June 2022


Mobile speed cameras

Transport
Compliance
Financial reporting
Information technology
Internal controls and governance
Management and administration
Regulation
Service delivery

Key aspects of the state's mobile speed camera program need to be improved to maximise road safety benefits, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Mobile speed cameras are deployed in a limited number of locations, with a small number of these used frequently. This, along with decisions to limit the hours that mobile speed cameras operate and to use multiple warning signs, has reduced the broad deterrence of speeding across the general network – the main policy objective of the mobile speed camera program.

The primary goal of speed cameras is to reduce speeding and make the roads safer. Our 2011 performance audit on speed cameras found that, in general, speed cameras change driver behaviour and have a positive impact on road safety.

Transport for NSW published the NSW Speed Camera Strategy in June 2012 in response to our audit. According to the Strategy, the main purpose of mobile speed cameras is to reduce speeding across the road network by providing a general deterrence through anywhere, anytime enforcement and by creating a perceived risk of detection across the road network. Fixed and red-light speed cameras aim to reduce speeding at specific locations.

Roads and Maritime Services and Transport for NSW deploy mobile speed cameras (MSCs) in consultation with NSW Police. The cameras are operated by contractors authorised by Roads and Maritime Services. MSC locations are stretches of road that can be more than 20 kilometres long. MSC sites are specific places within these locations that meet the requirements for an MSC vehicle to operate there.

This audit assessed whether the mobile speed camera program is effectively managed to maximise road safety benefits across the NSW road network.

Conclusion

The mobile speed camera program requires improvements to key aspects of its management to maximise road safety benefits. While camera locations have been selected based on crash history, the limited number of locations restricts network coverage. It also makes enforcement more predictable, reducing the ability to provide a general deterrence. Implementation of the program has been consistent with government decisions to limit its hours of operation and use multiple warning signs. These factors limit the ability of the mobile speed camera program to effectively deliver a broad general network deterrence from speeding.

Many locations are needed to enable network-wide coverage and to ensure MSC sessions are randomised and not predictable. However, there are not enough available locations that meet the strict criteria for crash history, operator safety, signage and technical requirements. MSC performance would improve if more locations were available.

A scheduling system is meant to randomise MSC location visits so that they are not predictable. However, a relatively small number of locations have been visited many times, making deployment more predictable in those places. The allocation of MSCs across the time of day, day of week and regions is prioritised based on crash history, but the frequency of location visits does not correspond with the crash risk at each location.
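
One way to reconcile unpredictability with crash risk is to select locations at random but weight the selection by each location's risk score, so that higher-risk locations are visited more often without following a fixed pattern. The sketch below illustrates the idea; the location names and risk scores are hypothetical, and it is not the program's actual scheduling system.

```python
# Illustrative sketch of risk-weighted random scheduling: visit frequency
# tracks crash risk while individual deployments remain unpredictable.
# Location names and risk scores are hypothetical.
import random

def schedule_sessions(location_risk, sessions):
    """location_risk: mapping of location to crash-risk score.
    Returns a list of randomly chosen locations, weighted by risk."""
    locations = list(location_risk)
    weights = [location_risk[loc] for loc in locations]
    return random.choices(locations, weights=weights, k=sessions)

# Over many sessions, "Location A" is visited roughly eight times as often as "Location C".
plan = schedule_sessions({"Location A": 8.0, "Location B": 3.0, "Location C": 1.0}, sessions=20)
```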

There is evidence of a reduction in fatal and serious crashes at the 30 best-performing MSC locations. However, there is limited evidence that the current MSC program in NSW has led to a behavioural change in drivers by creating a general network deterrence. While the overall reduction in serious injuries on roads has continued, fatalities have started to climb again. Compliance with speed limits has improved at the sites and locations where MSCs operate, but the results of overall network speed surveys vary, with recent improvements in some speed zones but not others.

There is no supporting justification for the number of hours of operation for the program. The rate of MSC enforcement (hours per capita) in NSW is lower than in Queensland and Victoria. The government decision to use multiple warning signs has made it harder to identify and maintain suitable MSC locations, and has impeded their use for enforcement in both traffic directions and in school zones.

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #308 - released 18 October 2018


Detecting and responding to cyber security incidents

Finance
Cyber security
Information technology
Internal controls and governance
Management and administration
Workforce and capability

A report released today by the Auditor-General for New South Wales, Margaret Crawford, found there is no whole-of-government capability to detect and respond effectively to cyber security incidents. There is very limited sharing of information on incidents amongst agencies, and some agencies have poor detection and response practices and procedures.

The NSW Government relies on digital technology to deliver services, organise and store information, manage business processes, and control critical infrastructure. The increasing global interconnectivity between computer networks has dramatically increased the risk of cyber security incidents. Such incidents can harm government service delivery and may include the theft of information, denial of access to critical technology, or even the hijacking of systems for profit or malicious intent.

This audit examined cyber security incident detection and response in the NSW public sector. It focused on the role of the Department of Finance, Services and Innovation (DFSI), which oversees the Information Security Community of Practice, the Information Security Event Reporting Protocol, and the Digital Information Security Policy (the Policy).

The audit also examined ten case study agencies to develop a perspective on how they detect and respond to incidents. We chose agencies that are collectively responsible for personal data, critical infrastructure, financial information and intellectual property.

Conclusion

There is no whole-of-government capability to detect and respond effectively to cyber security incidents. There is limited sharing of information on incidents amongst agencies, and some of the agencies we reviewed have poor detection and response practices and procedures. There is a risk that incidents will go undetected longer than they should, and opportunities to contain and restrict the damage may be lost.

Given current weaknesses, the NSW public sector's ability to detect and respond to incidents needs to improve significantly and quickly. DFSI has started to address this by appointing a Government Chief Information Security Officer (GCISO) to improve cyber security capability across the public sector. Her role includes coordinating efforts to increase the NSW Government's ability to respond to and recover from whole-of-government threats and attacks.

Some of our case study agencies had strong processes for detection and response to cyber security incidents but others had a low capability to detect and respond in a timely way.

Most agencies have access to an automated tool for analysing logs generated by their IT systems. However, coverage of these tools varies. Some agencies do not have an automated tool and only review logs periodically or on an ad hoc basis, meaning they are less likely to detect incidents.
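
As a simple illustration of what automated log review adds over periodic manual checks, the sketch below scans log lines for repeated failed logins from a single source address. The log format and the threshold are assumptions; production tools (for example, SIEM platforms) correlate many event types across systems in near real time.

```python
# Minimal sketch of automated log review: flag source addresses responsible
# for repeated failed logins. The log line format and threshold are assumed
# for illustration; real tooling correlates many event types across systems.
import re
from collections import Counter

FAILED_LOGIN = re.compile(r"FAILED LOGIN .* from (?P<ip>\d{1,3}(?:\.\d{1,3}){3})")

def suspicious_sources(log_lines, threshold=10):
    counts = Counter()
    for line in log_lines:
        match = FAILED_LOGIN.search(line)
        if match:
            counts[match.group("ip")] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}
```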

Few agencies have contractual arrangements in place requiring IT service providers to report incidents to them. If a service provider elects not to report an incident, the agency's response will be delayed and the damage may increase.

Most case study agencies had procedures for responding to incidents, although some lack guidance on who to notify and when. Some agencies do not have response procedures, limiting their ability to minimise the business damage that may flow from a cyber security incident. Few agencies could demonstrate that they had trained their staff in incident detection or response procedures, and most could provide little information on staff roles and responsibilities for detection and response.

Most agencies' incident procedures contain limited information on how to report an incident, who to report it to, when this should occur and what information should be provided. None of our case study agencies' procedures mentioned reporting to DFSI, highlighting that, even though reporting is mandatory for most agencies, their procedures do not require it.

Case study agencies provided little evidence to indicate they are learning from incidents, meaning that opportunities to better manage future incidents may be lost.

Recommendations

The Department of Finance, Services and Innovation should:

  • assist agencies by providing:
    • better practice guidelines for incident detection, response and reporting to help agencies develop their own practices and procedures
    • training and awareness programs, including tailored programs for a range of audiences such as cyber professionals, finance staff, and audit and risk committees
    • role requirements and responsibilities for cyber security across government, relevant to size and complexity of each agency
    • a support model for agencies that have limited detection and response capabilities

  • revise the Digital Information Security Policy and Information Security Event Reporting Protocol by:
    • clarifying what security incidents must be reported to DFSI and when
    • extending mandatory reporting requirements to those NSW Government agencies not currently covered by the policy and protocol, including State owned corporations.

DFSI lacks a clear mandate or capability to provide effective detection and response support to agencies, and there is limited sharing of information on cyber security incidents.

DFSI does not currently have a clear mandate and the necessary resources and systems to detect, receive, share and respond to cyber security incidents across the NSW public sector. It does not have a clear mandate to assess whether agencies have an acceptable detection and response capability. It is aware of deficiencies in agencies and across whole‑of‑government, and has begun to conduct research into this capability.

Intelligence gathering across the public sector is also limited, meaning agencies may not respond to threats in a timely manner. DFSI has not allocated resources for gathering of threat intelligence and communicating it across government, although it has begun to build this capacity.

Incident reporting to DFSI is mandatory for most agencies, however, most of our case study agencies do not report incidents to DFSI, reducing the likelihood of containing an incident if it spreads to other agencies. When incidents have been reported, DFSI has not provided dedicated resources to assess them and coordinate the public sector’s response. There are currently no formal requirements for DFSI to respond to incidents and no guidance on what it is meant to do if an incident is reported. The lack of central coordination in incident response risks delays and increased damage to multiple agencies.

DFSI's reporting protocol is weak and does not clearly specify what agencies should report and when. This makes agencies less likely to report incidents. The lack of a standard format for incident reporting, and of a consistent method for assessing an incident (including the level of risk associated with it), also makes it difficult for DFSI to determine an appropriate response.

There are limited avenues for sharing information amongst agencies after incidents have been resolved, meaning the public sector may be losing valuable opportunities to improve its protection and response.

Recommendations

The Department of Finance, Services and Innovation should:

  • develop whole‑of‑government procedure, protocol and supporting systems to effectively share reported threats and respond to cyber security incidents impacting multiple agencies, including follow-up and communicating lessons learnt
  • develop a means by which agencies can report incidents in a more effective manner, such as a secure online template that allows for early warnings and standardised details of incidents and remedial advice (an illustrative report structure is sketched after this list)
  • enhance NSW public sector threat intelligence gathering and sharing including formal links with Australian Government security agencies, other states and the private sector
  • direct agencies to include standard clauses in contracts requiring IT service providers to report all cyber security incidents within a reasonable timeframe
  • provide assurance that agencies have appropriate reporting procedures and report to DFSI as required by the policy and protocol by:
    • extending the attestation requirement within the DISP to cover procedures and reporting
    • reviewing a sample of agencies' incident reporting procedures each year.
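
To illustrate the kind of standardised detail a reporting template could capture, the sketch below defines a minimal incident report structure. The fields are assumptions for the purpose of the example, not a prescribed DFSI format.

```python
# Illustrative structure for a standardised cyber security incident report.
# The fields are assumptions for this example, not a prescribed DFSI format.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class IncidentReport:
    agency: str
    detected_at: datetime
    incident_type: str                 # e.g. "phishing", "ransomware", "data breach"
    affected_systems: List[str]
    assessed_risk: str                 # e.g. "low", "medium", "high"
    summary: str
    containment_actions: List[str] = field(default_factory=list)
    remedial_advice: str = ""          # guidance that can be shared with other agencies
```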


Managing IT Services Contracts

Finance
Health
Justice
Compliance
Information technology
Internal controls and governance
Procurement
Project management
Risk

Neither agency (the NSW Ministry of Health and the NSW Police Force) demonstrated that it continued to get value for money over the life of these long-term contracts, or that it had effectively managed all critical elements of the three contracts we reviewed after award. This is because both agencies treated contract extensions or renewals as simply continuing previous contractual arrangements, rather than as establishing a new contract and financial commitment. Consequently, there was no robust analysis of the continuing need for the mix and quantity of services being provided, or an assessment of value for money in terms of the prices being paid.

 

Parliamentary reference - Report number #220 - released 1 February 2012