
Managing antisocial behaviour in public housing

Community Services
Asset valuation
Infrastructure
Regulation
Service delivery
Workforce and capability

The Department of Family and Community Services (FACS) has not adequately supported or resourced its staff to manage antisocial behaviour in public housing, according to a report released today by the Deputy Auditor-General for New South Wales, Ian Goodwin.

In recent decades, policy makers and legislators in Australian states and territories have developed and implemented initiatives to manage antisocial behaviour in public housing environments. All jurisdictions now have some form of legislation or policy to encourage public housing tenants to comply with rules and obligations of ‘good neighbourliness’. In November 2015, the NSW Parliament changed legislation to introduce a new approach to manage antisocial behaviour in public housing. This approach is commonly described as the ‘strikes’ approach. 

When introduced in the NSW Parliament, the ‘strikes’ approach was described as a means to:

  • improve the behaviour of a minority of tenants engaging in antisocial behaviour 
  • create better, safer communities for law-abiding tenants, including those who are ageing and vulnerable.

FACS has a number of tasks as a landlord, including responsibility for collecting rent and organising housing maintenance. FACS also has a role in supporting tenants with complex needs and managing antisocial behaviour. These roles have some inherent tensions. The FACS antisocial behaviour management policy aims:

to balance the responsibilities of tenants, the rights of their neighbours in social housing, private residents and the broader community with the need to support tenants to sustain their public housing tenancies.

This audit assessed the efficiency and effectiveness of the ‘strikes’ approach to managing antisocial behaviour in public housing environments.

We examined whether:

  • the approach is being implemented as intended and leading to improved safety and security in social housing environments
  • FACS and its partner agencies have the capability and capacity to implement the approach
  • there are effective mechanisms to monitor, report and progressively improve the approach.
Conclusion

FACS has not adequately supported or resourced its staff to implement the antisocial behaviour policy. FACS antisocial behaviour data is incomplete and unreliable. Accordingly, there is insufficient data to determine the nature and extent of the problem and whether the implementation of the policy is leading to improved safety and security.

FACS management of minor and moderate incidents of antisocial behaviour is poor. FACS has not dedicated sufficient training to equip frontline housing staff with the relevant skills to apply the antisocial behaviour management policy. At more than half of the housing offices we visited, staff had not been trained to:

  • conduct effective interviews to determine whether an antisocial behaviour complaint can be substantiated
  • de-escalate conflict and manage complex behaviours when required
  • properly manage the safety of staff and tenants
  • establish information sharing arrangements with police
  • collect evidence that meets the requirements of the NSW Civil and Administrative Tribunal
  • record and manage antisocial behaviour incidents using the information management system HOMES ASB.

When frontline housing staff are informed about serious and severe illegal antisocial behaviour incidents, they generally refer them to the FACS Legal Division. Staff in the Legal Division are trained and proficient in managing antisocial behaviour in compliance with the policy; as a result, the more serious incidents are managed effectively using HOMES ASB.


FACS provides housing services to most remote townships via outreach visits from the Dubbo office. In remote townships, the policy is not being fully implemented due to insufficient frontline housing staff. There is very limited knowledge of the policy in these areas and FACS data shows few recorded antisocial behaviour incidents in remote regions. 


The FACS information management system (HOMES ASB) is poorly designed and has significant functional limitations that impede the ability of staff to record and manage antisocial behaviour. Staff at most of the housing offices we visited were unable to accurately record antisocial behaviour matters in HOMES ASB, making the data incorrect and unreliable.


HealthRoster benefits realisation

Health
Compliance
Information technology
Management and administration
Project management
Workforce and capability

The HealthRoster system is delivering some business benefits but Local Health Districts are yet to use all of its features, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. HealthRoster is an IT system designed to roster staff more effectively to meet the needs of Local Health Districts and other NSW health agencies.

The NSW public health system employs over 100,000 people in clinical and non-clinical roles across the state. With increasing demand for services, it is vital that NSW Health effectively rosters staff to ensure high quality and efficient patient care, while maintaining good workplace practices to support staff in demanding roles.

NSW Health is implementing HealthRoster as its single state-wide rostering system to roster staff more effectively according to the demands of each location. Between 2013–14 and 2016–17, our financial audits of individual Local Health Districts (LHDs) reported issues with rostering and payroll processes and systems.

NSW Health grouped all LHDs, and other NSW Health organisations, into four clusters to manage the implementation of HealthRoster over four years. Refer to Exhibit 4 for a list of the NSW Health entities in each cluster.

  • Cluster 1 implementation commenced in 2014–15 and was completed in 2015–16.
  • Cluster 2 implementation commenced in 2015–16 and was completed in 2016–17.
  • Cluster 3 began implementation in 2016–17 and was underway during the conduct of the audit.
  • Cluster 4 began planning for implementation in 2017–18.

Full implementation, including capability for centralised data and reporting, is planned for completion in 2019.

This audit assessed the effectiveness of the HealthRoster system in delivering business benefits. In making this assessment, we examined whether:

  • expected business benefits of HealthRoster were well-defined
  • HealthRoster is achieving business benefits where implemented.

The HealthRoster project has a timespan from 2009 to 2019. We examined the HealthRoster implementation in LHDs, and other NSW Health organisations, focusing on the period from 2014, when eHealth assumed responsibility for project implementation, to early 2018.

Conclusion
The HealthRoster system is realising functional business benefits in the LHDs where it has been implemented. In these LHDs, financial control of payroll expenditure and rostering compliance with employment award conditions have improved. However, these LHDs are not measuring the value of broader benefits such as better management of staff leave and overtime.
NSW Health has applied lessons learned from earlier implementations to improve later ones. Business benefits identified in the business case were well defined and are consistent with business needs identified by NSW Health. Three of the four cluster 1 LHDs have reduced the number of issues with rostering and payroll processes. LHDs in earlier implementations need to use HealthRoster more effectively to ensure they are getting all available benefits from it.
HealthRoster is taking six years longer, and costing $37.2 million more, to fully implement than originally planned. NSW Health attributes the increased cost and extended timeframe to the large scale and complexity of the full implementation of HealthRoster.

Business benefits identified for HealthRoster accurately reflect business needs.

NSW Health has a good understanding of the issues in previous rostering systems and has designed HealthRoster to adequately address these issues. Interviews with frontline staff indicate that HealthRoster facilitates rostering which complies with industrial awards. This is a key business benefit that supports the provision of quality patient care. We saw no evidence that any major business needs or issues with the previous rostering systems are not being addressed by HealthRoster.
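
To illustrate the kind of rule such compliance involves, the sketch below checks a roster against two common award-style conditions: a minimum rest break between shifts and a maximum run of consecutive rostered days. This is a minimal illustration only; the rule values, data shapes and function names are assumptions, not actual NSW Health award terms or any part of the HealthRoster product.

```python
from datetime import datetime

# Illustrative award-style rules; the values are assumptions,
# not actual NSW Health employment award conditions.
MIN_REST_HOURS = 10          # minimum rest between consecutive shifts
MAX_CONSECUTIVE_DAYS = 7     # maximum run of consecutive rostered days

def check_roster(shifts):
    """Return breach descriptions for one employee's shifts,
    given as (start, end) datetime pairs."""
    breaches = []
    shifts = sorted(shifts)
    consecutive = 1
    for prev, curr in zip(shifts, shifts[1:]):
        rest_hours = (curr[0] - prev[1]).total_seconds() / 3600
        if rest_hours < MIN_REST_HOURS:
            breaches.append(f"only {rest_hours:.1f}h rest before shift at {curr[0]}")
        day_gap = (curr[0].date() - prev[0].date()).days
        if day_gap == 1:
            consecutive += 1
        elif day_gap > 1:
            consecutive = 1
        if consecutive > MAX_CONSECUTIVE_DAYS:
            breaches.append(f"day {consecutive} of consecutive work at {curr[0]}")
    return breaches

# Example: a night shift followed too soon by a day shift.
roster = [(datetime(2018, 6, 1, 22), datetime(2018, 6, 2, 6)),
          (datetime(2018, 6, 2, 13), datetime(2018, 6, 2, 21))]
print(check_roster(roster))  # -> ['only 7.0h rest before shift at 2018-06-02 13:00:00']
```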

Since 2015, the period examined in this audit, NSW Health has applied appropriate project management and governance structures to ensure that risks and issues are well managed during HealthRoster implementation.

HealthRoster has had two changes to its budget and timeline. Overall, the capital cost of the project has increased from $88.6 million to $125.6 million (42 per cent), and expected project completion has been delayed by four years, from 2015 to 2019. NSW Health attributes the increased cost and extended time frame to the large scale and complexity of the full implementation of HealthRoster.

NSW Health has established appropriate governance arrangements to ensure that HealthRoster is successfully implemented and that it will achieve business benefits in the long term. During implementation, local steering committees monitor risks and resolve implementation issues. Risks or issues that cannot be resolved locally are escalated to the state-wide steering committee.

NSW Health has grouped local health districts, and other NSW Health organisations, into four clusters for implementation. This has enabled NSW Health to apply lessons learnt from each implementation to improve future implementations.

NSW Health has a benefits realisation framework, but it is not fully applied to HealthRoster.

NSW Health can demonstrate that HealthRoster has delivered some functional business benefits, including rosters that comply with a wide variety of employment awards.

NSW Health is not yet measuring and tracking the value of business benefits achieved. NSW Health did not define benefits realisation plans with baseline measures for LHDs in clusters 1 and 2 before implementation. Without baseline measures, NSW Health is unable to quantify the business benefits achieved. However, analysis of post-implementation reviews and interviews with frontline staff indicate that benefits are being achieved. As a result, NSW Health now includes defining baseline measures and setting targets as part of LHD implementation planning. It has created a benefits realisation toolkit to assist this process from cluster 3 implementations onwards.
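
The point about baselines is essentially arithmetic: a benefit can only be quantified as the difference between a pre-implementation baseline and a post-implementation actual, assessed against a target. A minimal sketch of that calculation, using invented metric names and figures for a single hypothetical LHD (none of these numbers come from the audit):

```python
# Hypothetical baseline, target and actual figures for one LHD;
# all metric names and numbers are invented for illustration.
metrics = {
    # metric: (baseline, target, actual after implementation)
    "overtime_hours_per_month": (12_000, 10_000, 10_400),
    "unplanned_leave_days":     (850,    750,    790),
}

for name, (baseline, target, actual) in metrics.items():
    realised = baseline - actual   # improvement actually achieved
    planned = baseline - target    # improvement the target implies
    print(f"{name}: realised {realised} of {planned} ({100 * realised / planned:.0f}%)")
```

Without the baseline column, neither the realised nor the planned improvement can be computed, which is the gap the audit identifies for clusters 1 and 2.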

NSW Health conducted post-implementation reviews for clusters 1 and 2 and found that LHDs in these clusters were not using HealthRoster to realise all the benefits that HealthRoster could deliver.

By September 2018, NSW Health should:

  1. Ensure that Local Health Districts undertake benefits realisation planning according to the NSW Health benefits realisation framework.
  2. Regularly measure benefits realised, at state and local health district levels, from the state-wide implementation of HealthRoster.
  3. Review the use of HealthRoster in Local Health Districts in clusters 1 and 2 and assist them to improve their HealthRoster-related processes and practices.

By June 2019, NSW Health should:

  1. Ensure that all Local Health Districts are effectively using demand-based rostering.
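
As a rough illustration of what demand-based rostering means in practice, the sketch below derives required staffing from a forecast of patient numbers per shift and compares it with the roster. The nurse-to-patient ratio and all figures are invented assumptions, not NSW Health planning parameters or HealthRoster functionality.

```python
import math

NURSES_PER_PATIENT = 1 / 4   # assumed ratio: one nurse per four patients

# Invented demand forecast and roster for one ward on one day.
forecast_patients = {"morning": 38, "evening": 30, "night": 17}
rostered_nurses   = {"morning": 9,  "evening": 9,  "night": 4}

for shift, patients in forecast_patients.items():
    required = math.ceil(patients * NURSES_PER_PATIENT)   # staffing driven by demand
    gap = rostered_nurses[shift] - required
    status = "ok" if gap >= 0 else f"short by {-gap}"
    print(f"{shift}: need {required}, rostered {rostered_nurses[shift]} ({status})")
```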

Appendix one - Response from agency

Appendix two - About the audit

Appendix three - Performance auditing

 

Parliamentary reference - Report number #301 - released 7 June 2018


Detecting and responding to cyber security incidents

Finance
Cyber security
Information technology
Internal controls and governance
Management and administration
Workforce and capability

A report released today by the Auditor-General for New South Wales, Margaret Crawford, found there is no whole-of-government capability to detect and respond effectively to cyber security incidents. There is very limited sharing of information on incidents amongst agencies, and some agencies have poor detection and response practices and procedures.

The NSW Government relies on digital technology to deliver services, organise and store information, manage business processes, and control critical infrastructure. The increasing global interconnectivity between computer networks has dramatically increased the risk of cyber security incidents. Such incidents can harm government service delivery and may include the theft of information, denial of access to critical technology, or even the hijacking of systems for profit or malicious intent.

This audit examined cyber security incident detection and response in the NSW public sector. It focused on the role of the Department of Finance, Services and Innovation (DFSI), which oversees the Information Security Community of Practice, the Information Security Event Reporting Protocol, and the Digital Information Security Policy (the Policy).

The audit also examined ten case study agencies to develop a perspective on how they detect and respond to incidents. We chose agencies that are collectively responsible for personal data, critical infrastructure, financial information and intellectual property.

Conclusion
There is no whole‑of‑government capability to detect and respond effectively to cyber security incidents. There is limited sharing of information on incidents amongst agencies, and some of the agencies we reviewed have poor detection and response practices and procedures. There is a risk that incidents will go undetected longer than they should, and opportunities to contain and restrict the damage may be lost.
Given current weaknesses, the NSW public sector’s ability to detect and respond to incidents needs to improve significantly and quickly. DFSI has started to address this by appointing a Government Chief Information Security Officer (GCISO) to improve cyber security capability across the public sector. Her role includes coordinating efforts to increase the NSW Government’s ability to respond to and recover from whole‑of‑government threats and attacks.

Some of our case study agencies had strong processes for detection and response to cyber security incidents but others had a low capability to detect and respond in a timely way.

Most agencies have access to an automated tool for analysing logs generated by their IT systems. However, coverage of these tools varies. Some agencies do not have an automated tool and only review logs periodically or on an ad hoc basis, meaning they are less likely to detect incidents.
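
As a rough sketch of the difference between the two approaches, the snippet below shows the kind of continuous check an automated tool performs: scanning authentication logs for a burst of failed logins from one source, rather than waiting for a periodic manual read. The log line format, window and threshold are all assumptions for illustration.

```python
import re
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Assumed log format, e.g. "2018-04-03 10:15:02 FAILED_LOGIN user=jsmith src=10.0.0.12"
LINE = re.compile(r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) FAILED_LOGIN .*src=(?P<src>\S+)")

WINDOW = timedelta(minutes=5)   # illustrative detection window
THRESHOLD = 10                  # illustrative failures-per-window alert level

def scan(lines):
    """Yield an alert whenever one source accumulates THRESHOLD failures within WINDOW."""
    recent = defaultdict(deque)              # source -> recent failure timestamps
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue
        ts = datetime.strptime(m["ts"], "%Y-%m-%d %H:%M:%S")
        q = recent[m["src"]]
        q.append(ts)
        while q and ts - q[0] > WINDOW:      # drop failures outside the window
            q.popleft()
        if len(q) == THRESHOLD:
            yield f"ALERT: {THRESHOLD} failed logins from {m['src']} since {q[0]}"

# Usage sketch: for alert in scan(open("auth.log")): print(alert)
```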

Few agencies have contractual arrangements in place for IT service providers to report incidents to them. If a service provider elects not to report an incident, it will delay the agency's response and may result in increased damage.

Most case study agencies had procedures for responding to incidents, although some lack guidance on who to notify and when. Some agencies do not have response procedures at all, limiting their ability to minimise the business damage that may flow from a cyber security incident. Few agencies could demonstrate that they had trained their staff in incident detection or response procedures, and most could provide little information on staff roles and responsibilities for detection and response.

Most agencies’ incident procedures contain limited information on how to report an incident, who to report it to, when this should occur and what information should be provided. None of our case study agencies’ procedures mentioned reporting to DFSI, highlighting that, even though reporting is mandatory for most agencies, their procedures do not require it.

Case study agencies provided little evidence to indicate they are learning from incidents, meaning that opportunities to better manage future incidents may be lost.

Recommendations

The Department of Finance, Services and Innovation should:

  • assist agencies by providing:
    • better practice guidelines for incident detection, response and reporting to help agencies develop their own practices and procedures
    • training and awareness programs, including tailored programs for a range of audiences such as cyber professionals, finance staff, and audit and risk committees
    • role requirements and responsibilities for cyber security across government, relevant to size and complexity of each agency
    • a support model for agencies that have limited detection and response capabilities
  • revise the Digital Information Security Policy and Information Security Event Reporting Protocol by:
    • clarifying what security incidents must be reported to DFSI and when
    • extending mandatory reporting requirements to those NSW Government agencies not currently covered by the policy and protocol, including State owned corporations.

DFSI lacks a clear mandate or capability to provide effective detection and response support to agencies, and there is limited sharing of information on cyber security incidents.

DFSI does not currently have a clear mandate, or the necessary resources and systems, to detect, receive, share and respond to cyber security incidents across the NSW public sector. Nor does it have a clear mandate to assess whether agencies have an acceptable detection and response capability. It is aware of deficiencies in agencies and across whole‑of‑government, and has begun to conduct research into this capability.

Intelligence gathering across the public sector is also limited, meaning agencies may not respond to threats in a timely manner. DFSI has not allocated resources for gathering threat intelligence and communicating it across government, although it has begun to build this capacity.

Incident reporting to DFSI is mandatory for most agencies; however, most of our case study agencies do not report incidents to DFSI, reducing the likelihood of containing an incident that spreads to other agencies. When incidents have been reported, DFSI has not provided dedicated resources to assess them and coordinate the public sector’s response. There are currently no formal requirements for DFSI to respond to incidents, and no guidance on what it is meant to do if an incident is reported. The lack of central coordination of incident response risks delays and increased damage to multiple agencies.

DFSI's reporting protocol is weak and does not clearly specify what agencies should report and when, making agencies less likely to report incidents. The lack of a standard format for incident reporting, and of a consistent method for assessing an incident, including the level of risk associated with it, also makes it difficult for DFSI to determine an appropriate response.

There are limited avenues for sharing information amongst agencies after incidents have been resolved, meaning the public sector may be losing valuable opportunities to improve its protection and response.

Recommendations

The Department of Finance, Services and Innovation should:

  • develop whole‑of‑government procedures, protocols and supporting systems to share reported threats and respond effectively to cyber security incidents impacting multiple agencies, including follow-up and communication of lessons learnt
  • develop a means by which agencies can report incidents more effectively, such as a secure online template that allows for early warnings and standardised details of incidents and remedial advice (one possible shape for such a template is sketched after this list)
  • enhance NSW public sector threat intelligence gathering and sharing, including formal links with Australian Government security agencies, other states and the private sector
  • direct agencies to include standard clauses in contracts requiring IT service providers to report all cyber security incidents within a reasonable timeframe
  • provide assurance that agencies have appropriate reporting procedures and report to DFSI as required by the policy and protocol by:
    • extending the attestation requirement within the Digital Information Security Policy to cover procedures and reporting
    • reviewing a sample of agencies' incident reporting procedures each year.
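
The secure online template recommended above implies a standard set of fields captured for every incident. One possible shape for such a report, sketched as a Python dataclass; the field names and severity scale are assumptions for illustration, not a DFSI specification:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class IncidentReport:
    """One possible standardised incident report; all fields are illustrative."""
    agency: str
    detected_at: datetime
    reported_at: datetime
    category: str                 # e.g. "phishing", "malware", "data breach"
    severity: int                 # assumed scale: 1 (low) to 4 (severe)
    systems_affected: List[str]
    description: str
    contained: bool
    remedial_actions: List[str] = field(default_factory=list)

    def early_warning(self) -> str:
        """A short standard summary suitable for warning other agencies."""
        state = "contained" if self.contained else "NOT contained"
        return (f"[severity {self.severity}] {self.category} at {self.agency}, "
                f"detected {self.detected_at:%Y-%m-%d %H:%M}, {state}")
```

Standard fields of this kind would give DFSI the consistent detail and risk assessment the audit found lacking, and the short summary form is the sort of early warning that could be shared across agencies.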


Council reporting on service delivery

Local Government
Compliance
Internal controls and governance
Management and administration
Service delivery

New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report on activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also do not report on the cost of services, making it difficult for communities to see how efficiently services are being delivered. And councils are not consistently publishing targets to demonstrate what they are striving for.

I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.

My new mandate supports the Parliament’s objectives to:

  • strengthen governance and financial oversight in the local government sector
  • improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.

Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.

For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.

In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.

Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.

Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in deciding how and to whom they provide services, so it is important that local communities have access to information about how well services are being delivered and how well they meet community needs. Ultimately, councils should aim to ensure that performance reporting is subject to quality controls designed to provide independent assurance.

Conclusion
While councils report on outputs, reporting on outcomes and performance over time can be improved. Improved reporting would include objectives with targets that better demonstrate performance over time. This would help communities understand what services are being delivered, how efficiently and effectively they are being delivered, and what improvements are being made.
To ensure greater transparency on service effectiveness and efficiency, the Office of Local Government (OLG) should work with councils to develop guidance principles to improve reporting on service delivery to local communities. This audit identified an interest amongst councils in improving their reporting and broad agreement with the good practice principles developed as part of the audit.
The Integrated Planning and Reporting Framework (the Framework), which councils are required to use to report on service delivery, is intended to promote better practice. However, the Framework is silent on efficiency reporting and provides limited guidance on how long-term strategic documents link with annual reports produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.
OLG should also work with state agencies to reduce the overall reporting burden on councils by consolidating state agency reporting requirements. 

Councils report extensively on the things they have done, but minimally on the outcomes of that effort, its efficiency, or performance over time.

Councils could improve reporting on service delivery by more clearly relating the resources used to the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.

Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.

The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.

Recommendation

By mid-2018, OLG should:

  • assist rural councils to develop their reporting capability.

The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery, and provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

Recommendation

By mid-2018, OLG should:

  • issue additional guidance on good practice in council reporting, with specific information on:
    • reporting on performance against targets
    • reporting on performance against outcomes
    • assessing and reporting on efficiency and cost effectiveness
    • reporting performance over time
    • clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
    • defining reporting terms to encourage consistency.

The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports

The guidelines produced by OLG in 2013 to assist councils to implement their Framework requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This was identified as a weakness in both the 2012 performance audit report and the Local Government Reform Panel review of council reporting on service delivery.

The Framework and supporting documents provide limited guidance on reporting

Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:

  • good practice in reporting to the community
  • how the annual report links with other plans and reports required by the Framework.

Further, the Framework includes both Annual Reports and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the annual reports produced during a council’s term. The relationship between the two types of report is not clear.

OLG is reviewing the Framework and guidance

OLG commenced a review of the Framework in 2013, but this was deferred, with work restarting in 2017. The revised guidelines and manual were expected to be released in late 2017.

OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports

The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.

Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.

Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.

Recommendations

By mid-2018, OLG should:

  • commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
  • progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.

Streamlining the reporting burden would help councils improve reporting

The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 draft IPART ‘Review of reporting and compliance burdens on Local Government’ noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.

Enabling rural councils to perform tailored surveys of their communities may be more beneficial than a state-wide survey in defining outcome indicators

Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results are used by councils to set service delivery targets and report on outcomes, helping to align service delivery with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils, which generally have the resources needed to run them.

OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn on a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state-level survey is not sufficiently detailed or specific to be used as a tool in setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in these areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. Such surveys could be achieved through regional cooperation between groups of similar councils.

Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities

The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.

OLG commenced work in 2012 to build a new performance measurement framework for councils, which aimed to move away from compliance reporting. This work was strongly influenced by the approach used in Victoria, which requires councils to report on a set of 79 indicators published on the Victorian 'Know your council' website. OLG’s work did not progress at the time, and several other local government representative bodies have since commenced work to establish performance measurement frameworks. OLG advised us it has recently recommenced its work on this project.

Our consultation identified some desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishment of a set of indicators that councils can use to measure performance and drive improvement in service delivery.

Several councils we spoke with were concerned that current approaches to comparative reporting do not adequately acknowledge that councils need to tailor their service types, levels and mix to the needs of their community. Comparative reporting approaches tend to focus on output measures, such as the number of applications processed, annual library loans and opening hours of sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where councils have made a decision based on community consultation, local priorities and available resources. To mitigate this, it is important to:

  • adopt a partnership approach to the development of indicators
  • ensure indicators measure performance, not just level of activity
  • compare performance between councils that are similar in terms of size and location.

It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.

Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:

  • consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
  • adopting a partnership approach to development of common indicators with groups of like councils.


The Impact of the Raised School Leaving Age

Education
Management and administration
Service delivery

The Department monitors the attendance of all students who remain enrolled at government schools, and responds when these students fail to attend. For young people who have been granted an exemption from attending school, the Department monitors apprentices, trainees and those completing the equivalent of Year 10 of secondary education at TAFE. However, the Department does not monitor young people who, after Year 10, are in full-time work or vocational education programs until they turn 17 years of age. Under the law, it is a parent’s responsibility to make sure that a child attends school or is involved in an approved alternative activity until they turn 17 years of age.

 

Parliamentary reference - Report number #226 - released 1 November 2012


Managing IT Services Contracts

Finance
Health
Justice
Compliance
Information technology
Internal controls and governance
Procurement
Project management
Risk

Neither agency (the NSW Ministry of Health and the NSW Police Force) demonstrated that it continued to get value for money over the life of these long-term contracts, or that it had effectively managed all critical elements of the three contracts we reviewed after they were awarded. This is because both agencies treated contract extensions or renewals as simply continuing previous contractual arrangements, rather than as establishing a new contract and financial commitment. Consequently, there was no robust analysis of the continuing need for the mix and quantity of services being provided, or an assessment of value for money in terms of the prices being paid.

 

Parliamentary reference - Report number #220 - released 1 February 2012