Government Advertising 2017-18

The State Insurance Regulatory Authority’s (SIRA) ‘green slip refund’ campaign, and the TAFE semester one 2018 student recruitment campaign, complied with most requirements of the Government Advertising Act 2011 and the Government Advertising Guidelines, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford.

The Government Advertising Act 2011 (the Act) requires the Auditor-General to conduct a performance audit on the activities of one or more government agencies in relation to government advertising campaigns in each financial year. The performance audit assesses whether a government agency or agencies has carried out activities in relation to government advertising in an effective, economical and efficient manner and in compliance with the Act, the regulations, other laws and the Government Advertising Guidelines (the Guidelines).

This audit examined two campaigns conducted in 2017–18:

  • the 'Green slip refund' campaign run by the State Insurance Regulatory Authority (SIRA)
  • the semester one component of the 'TAFE NSW 2018 Student Recruitment Annual Campaign Program' run by the NSW TAFE Commission (TAFE).

Section 6 of the Act prohibits political advertising. Under this section, material that is part of a government advertising campaign must not contain the name, voice or image of a minister, member of parliament or a candidate nominated for election to parliament or the name, logo or any slogan of a political party. Further, a campaign must not be designed to influence (directly or indirectly) support for a political party.

Conclusion
Neither campaign breached the prohibition on political advertising contained in section 6 of the Act. Both campaigns also complied with most requirements of the Act, the regulations, other laws and the Guidelines. Neither agency could demonstrate that their campaigns were fully effective or economical.
SIRA did not breach section 6 of the Act, which prohibits political advertising. However, SIRA used its post-campaign evaluation to ask the public whether they believe the government was helping to reduce the cost of living by making reforms in a variety of areas, including some that were not related to the green slip campaign. SIRA advised that these additional statements were included to provide a broader context for any change in the green slip campaign survey results. This is not an appropriate use of the post-campaign evaluation because the post-campaign evaluation should measure the success of the campaign against its stated objectives.
Neither campaign met all of its key objectives, limiting the overall effectiveness of the campaigns. SIRA successfully increased awareness of the availability of green slip refunds and met the target for the proportion of people claiming their refunds online. However, it did not meet its objective to inform the public about the reforms to the green slip scheme beyond the refunds available to motorists. While 62 per cent of surveyed people were aware of the reforms, there was little knowledge of many specific aspects of the reforms, which people largely associated with lower insurance prices and refunds. TAFE was successful in achieving its targets for changing the public perception of TAFE. However, it failed to achieve its semester one enrolment target.
SIRA was not able to demonstrate that its campaign was economical because it directly negotiated with a single supplier for the campaign's creative materials. This is contrary to the NSW Government's and SIRA's own procurement guidance, which advises agencies to seek quotes from suppliers on a prequalification scheme where one is available. SIRA had access to the Advertising and Digital Communication Services prequalification scheme but continued with direct negotiations. While SIRA sought to demonstrate value for money by comparing the supplier's quote to the expenditure on creative materials in other campaigns, it did not document this evaluation to ensure that decision makers were fully informed.
TAFE was not able to demonstrate that its campaign was economical because it did not compare the campaign with a zero-advertising scenario to identify the benefits directly attributable to the campaign. TAFE's cost-benefit analysis also did not identify to what extent benefits could be achieved without advertising, nor did it consider alternatives to advertising which could achieve the same impact as the advertising campaign. All these elements should have been included in TAFE's cost-benefit analysis.
Both agencies achieved some efficiencies in implementing their campaigns. SIRA booked all of its media placements in a cost-efficient manner. TAFE booked most of its media placements in a cost-efficient manner and achieved further efficiencies through the re-use of previous campaign material.

The State Insurance Regulatory Authority (SIRA) conducted the 'Green slip refund' campaign between March and June 2018. SIRA ran this campaign to raise awareness of the Compulsory Third Party (CTP) refunds and reforms after the Motor Accidents Injuries Act 2017 commenced in December 2017. SIRA's view is that the reforms include a reduced cost for CTP insurance, benefits for at-fault drivers, reduced opportunity for fraud and attempts to lower insurance company profits. Green slip holders are also able to claim partial refunds on their 2017 green slip insurance premium. The campaign aimed to make green slip holders aware of the refunds available, encourage them to claim online and to inform people about the changes to the green slip scheme. The campaign focused on the first two of these objectives. The total cost of the campaign was $1.9 million. See Appendix two for more details on this campaign.

The 'Green slip refund' advertising campaign did not breach section 6 of the Act which prohibits political advertising. However, SIRA used its post-campaign evaluation to ask the public whether they believe the government was helping to reduce the cost of living by making reforms in a variety of areas, including some that were not related to the green slip campaign. SIRA advised that these additional statements were included to provide a broader context for any change in the green slip campaign survey results. This is not an appropriate use of the post-campaign evaluation because the post-campaign evaluation should measure the success of the campaign against its stated objectives. 
The campaign met most of its objectives, including raising awareness of the green slip refunds and encouraging people to claim online. However, the campaign was not fully effective because it did not inform the public of the green slip reforms. This was one of the objectives of the campaign. Sixty-two per cent of people in the post-campaign survey stated that they were aware of the reforms, an increase from the baseline of 20 per cent. However, these people largely associated the reforms with lower insurance prices and had a low awareness of any other elements of the reforms, such as SIRA's view that the reforms introduced better support for people injured on the road. This indicates that the campaign did little to inform people about the green slip reforms beyond the price of insurance. 
SIRA was able to ensure cost-efficient media purchases by signing its media booking authority within the timeframe advised by the Department of Premier and Cabinet (DPC).
SIRA could not demonstrate that the campaign was carried out economically. SIRA directly negotiated with a single supplier to procure the creative materials for this campaign. Direct negotiations make it difficult to ensure value for money due to the lack of competition. SIRA proceeded with direct negotiations despite being able to access a prequalification scheme which could increase competition. In doing so, SIRA did not follow government's or its internal procurement guidance. While SIRA sought to demonstrate value for money by comparing the supplier's quote to the expenditure on creative materials in other campaigns, it did not document this evaluation to ensure that decision makers were fully informed. 

Campaign materials we reviewed did not breach section 6 of the Act

Section 6 of the Act prohibits political advertising as part of a government advertising campaign. A government advertising campaign must not:

  • be designed to influence (directly or indirectly) support for a political party
  • contain the name, voice or image of a minister, a member of parliament or a candidate nominated for election to parliament
  • contain the name, logo, slogan or any other reference to a political party.

The audit team found no breaches of section 6 of the Act in the campaign material we reviewed.

Before the start of the campaign, SIRA conducted a survey which asked people whether they agreed ‘that the NSW Government is helping to reduce the cost of living by making positive reforms to:

  • reduce the cost of green slips
  • reduce the cost of health insurance
  • increase the number of jobs
  • increase investment in the state.'

SIRA's initial submission to peer review listed one of the campaign objectives as improving the perception of the government as a positive reformer. DPC advised SIRA that this should not be included. SIRA removed this objective.

Even though SIRA appropriately removed this objective, the post-campaign evaluation still measured agreement with the above statements, three of which did not relate to this campaign or SIRA's responsibilities. SIRA advised that these three additional statements were included to provide a broader context for any change in the green slip campaign survey results. For example, if all four measures reported an increase in positive responses of roughly the same size, then the increase may have been due to factors other than the advertising campaign.

This is not an appropriate use of the post-campaign evaluation, which should measure the success of the campaign against its stated objectives. The Guidelines list the purposes that government advertising may serve and none of these relate to improving the perception of the government. The inclusion of the above questions in SIRA's post-campaign evaluation creates a risk that the results may be used for party political purposes.

The campaign met most targets, but some were not challenging to achieve

The post-campaign evaluation demonstrated that the campaign met the targets for 12 of its 13 objectives, including the targets relating to raising awareness of the refunds and the proportion of people claiming their refunds online. A fourteenth objective, the percentage of people aware that they should contact SIRA after a road accident injury, did not have a target set, meaning it is not possible to say whether the campaign had the desired impact in this case.

In August 2017, before the campaign commenced, SIRA conducted a survey to determine the baselines for some of its objectives. This is good practice and supports an effective post-campaign evaluation process. The survey found that 20 per cent of people were aware of the green slip reforms. SIRA's objective was to raise this to 25 per cent, which represents a small gain relative to the proposed campaign expenditure. The campaign aimed for 40 per cent of motorists to be aware of refunds, which is a very low target given that this was the primary focus of the campaign. SIRA followed the advice of its survey provider when setting these targets.

In the survey carried out after the campaign, 66 per cent of people were aware of the availability of green slip refunds for most motorists. The campaign also aimed to get 83 per cent of motorists to claim their refunds via online channels. It met this target, with a total of 84 per cent. Finally, 62 per cent of people in the post-campaign survey were aware of the green slip reforms. This result is discussed further below.

The overall target for the total number of refunds claimed was 85 per cent of eligible drivers, that is, CTP holders. SIRA will evaluate the results of this objective after the refund period concludes in June 2019.

The campaign did little to inform the public about the broader green slip reforms

One objective of the green slip refund campaign was to inform the public about the green slip reforms. The final campaign creative material focused almost entirely on the green slip refunds rather than the range of other reforms. This was because the peer review raised concerns that the creative material was attempting to deliver too many messages. 

The campaign submission stated that the advertising campaign would raise awareness of the broader reforms to the CTP scheme, citing several examples such as reduced opportunities for fraud and reduced insurer profits. SIRA also advised the Minister for Finance, Services and Property that secondary messaging in the campaign would benefit public understanding of the reforms.

Some of the television and radio advertisements referred to ‘more protection’ or ‘better protection’ for people injured on New South Wales roads; however, the advertisements did not refer to other elements of the reforms. Other campaign creative materials contained messages relating solely to the green slip refund and made no further reference to the broader reforms. SIRA used other communication channels, such as giving wallet cards to health service providers, to spread these messages to people, particularly those who had been injured.

Sixty-two per cent of people in the post-campaign survey were aware of the green slip reforms. SIRA asked these people which benefits they associated with the reforms. The results of this survey are in Exhibit 4. Seventy-one per cent of this sample identified the reduced costs of green slips as one of the changes, but awareness of other elements of the reforms remained low. Though 29 per cent of people perceived the reforms to make the green slip scheme ‘fairer’, no more than 15 per cent of people could list a specific benefit which did not relate to insurance prices.

Exhibit 4: Perceived benefits associated with the changes to the CTP green slip scheme
Perceived benefit and percentage of respondents aware of this benefit:

  • Reduced costs of green slips for vehicle owners: 71%
  • A fairer scheme for all people: 29%
  • Reduced costs of comprehensive vehicle insurance: 20%
  • Better support for people injured on our roads: 15%
  • Less chances of fraudulent claims: 15%
  • Lowering insurance company profits: 13%
  • Quicker payment of claims to injured people: 10%

Source: State Insurance Regulatory Authority.

Another campaign target was to ensure that people understood that they should contact SIRA in case of an injury. None of the campaign creative materials contained this information. SIRA did some limited work to inform the public about this through its social media channels. One of the pieces of creative material directed the reader to SIRA's website for further information on the reforms, which contained this information. During the campaign period, there was an increase in the number of calls received by SIRA's CTP Assist phone line. However, in the post-campaign evaluation, only two per cent of surveyed people identified that they should contact SIRA in case of an injury.

The media plan allowed sufficient time for cost-efficient media placement

During the peer review process, DPC provides advice to agencies about the time they should allow to ensure cost-efficient media placement. For example, DPC advises that agencies book television advertising six to 12 weeks in advance and radio advertising two to eight weeks in advance.

SIRA allowed sufficient time between the completion of the peer review process and the commencement of the first advertising. SIRA signed the agreement with the approved Media Agency Services provider eight weeks before the campaign started, meaning that it could achieve cost-efficient media placement for all types of media used in this campaign.

SIRA directly negotiated with a single supplier, making it difficult to demonstrate value for money

SIRA directly negotiated with a single supplier to procure the campaign's creative material. A direct negotiation occurs when an agency negotiates with a proponent without first undergoing a competitive process. It is difficult to demonstrate value for money using direct negotiation due to the lack of competition. 

ICAC's 'Guidelines for managing risks in direct negotiations' (ICAC Guidelines) provide guidance on how to undertake direct negotiations. SIRA has a direct negotiation checklist that aligns to the ICAC Guidelines. The SIRA checklist advises that staff should confirm that existing New South Wales prequalification schemes cannot provide the procurement before undertaking a direct negotiation. SIRA did not do this.

To procure creative materials, agencies can access the Advertising and Digital Communication Services prequalification scheme (the prequalification scheme). Using the prequalification scheme allows agencies to quickly seek quotes from suppliers who have a demonstrated track record and expertise. While agencies are not required to use the prequalification scheme, the NSW Procurement Board advises that agencies should use prequalification schemes where they are available to promote competition. 

By using direct negotiation when the prequalification scheme was available, and by not seeking quotes from other suppliers, SIRA was acting in a way that reduced competition. This increases the risk that SIRA did not achieve value for money in its procurement of creative materials.

SIRA advised that it sought to ensure value for money by comparing the quote from its selected supplier with the amount spent on creative materials in other campaigns of similar size. SIRA did not document this analysis at the time or include it as part of the briefing note staff used to seek approval for undertaking direct negotiation. As a result, decision-makers were not fully informed when approving this engagement. 

SIRA reported in a briefing note that it engaged in direct negotiations because:

  • it believed that the original timeframe did not allow for a competitive tender process
  • the supplier had done previous work on a related campaign for SIRA
  • the supplier provided sample work which received positive feedback from focus groups.

In July 2017, when peer review commenced, SIRA planned to launch the campaign in November 2017 to coincide with the beginning of the green slip reforms. SIRA believed that this timeframe was narrow enough to warrant entering direct negotiations. The ICAC Guidelines advise that a narrow timeframe is not a valid reason to enter into a direct negotiation. In late October 2017, the campaign launch was delayed until March 2018 to stagger the demand on the resources of Service NSW, which is administering the refund. 

The ICAC Guidelines also advise against re-appointing a supplier because it has performed previous work. Instead, agencies could consider previous experience as one of several factors when deciding between quotes. In cases where an agency asks a supplier to provide sample work, the ICAC Guidelines advise that agencies should request sample work from multiple potential suppliers to promote competition.

The campaign's cost benefit analysis complied with the Act and Guidelines 

The Act requires a cost-benefit analysis (CBA) for any government advertising campaign likely to exceed $1.0 million in value. Section six of the Guidelines sets out the requirements for a government advertising CBA. The campaign's CBA complied with the requirements of the Act and the Guidelines.

The campaign CBA could have demonstrated further cost effectiveness if it had considered alternative media mixes as outlined in NSW Treasury's 'Cost Benefit Analysis Framework for Government Advertising and Information Campaigns'. This would also have been consistent with the Handbook.

The cluster Secretary signed the compliance certificate instead of the head of SIRA

The Act requires the head of the agency running the campaign to sign a compliance certificate. 

The Secretary of the Department of Finance, Services and Innovation, the cluster to which SIRA belongs, signed the campaign's compliance certificate. However, section 17(2) of the State Insurance and Care Governance Act 2015 states that SIRA is ‘for the purposes of any Act, a NSW Government agency.’ Given this, the Chief Executive of SIRA was responsible for signing the compliance certificate for this campaign.

This is a minor non-compliance with the Act because the Chief Executive had reviewed the campaign and recommended that the Secretary sign the compliance certificate.  

The NSW TAFE Commission (TAFE) ran the 'TAFE NSW 2018 Student Recruitment Annual Campaign Program' from November 2017 to September 2018. The aim of the campaign was to assist TAFE in achieving its 2018 student enrolment target by improving the perception of TAFE's brand and generating student enquiries. This was the first state-wide campaign run by TAFE operating under the One TAFE model; previously, each TAFE Institute ran its own campaigns. The total budget of the campaign was $19.5 million. This audit examined only the semester one 2018 component of the campaign, which ran from November 2017 to April 2018 at a total cost of $9.5 million. See Appendix two for more details on this campaign.

The semester one component of the 'TAFE NSW 2018 Student Recruitment Annual Campaign Program' did not breach the specific provisions of section 6 of the Act which prohibits political advertising.
The campaign was not fully effective because it did not achieve its objective of reaching TAFE's semester one enrolment target.
The campaign was successful in achieving its targets relating to changing the public perception of TAFE.
TAFE was able to place most of its campaign media within cost-efficient timeframes. TAFE also achieved efficiencies by re-using many creative materials from a previous campaign.
TAFE could not demonstrate that this campaign was carried out economically. TAFE's cost-benefit analysis (CBA) for this campaign did not comply with three requirements of the Guidelines. For example, TAFE did not compare the campaign to a baseline case of not advertising.
The Guidelines require government advertising to be accurate in all statements. TAFE breached this requirement. The campaign material included one statement that was inaccurate and one that was overstated.
The revision of the Brand Guidelines in August 2017 affected this campaign. TAFE re-used many creative materials that were created when TAFE was not required to include the NSW Government logo on its advertising material. DPC appears to have directed agencies that were launching advertising campaigns to comply with the Brand Guidelines immediately; however, we could not find evidence that this advice was given to TAFE. As a result, 59 per cent of TAFE's materials were not compliant with the Brand Guidelines at the launch of the campaign in November 2017. TAFE had made most of this campaign's creative materials compliant by June 2018.

The campaign materials we reviewed did not breach section 6 of the Act

Section 6 of the Act prohibits political advertising as part of a government advertising campaign. A government advertising campaign must not:

  • be designed to influence (directly or indirectly) support for a political party
  • contain the name, voice or image of a minister, a member of parliament or a candidate nominated for election to parliament
  • contain the name, logo, slogan or any other reference to a political party.

The audit team found no breaches of section 6 of the Act in the campaign material we reviewed.

The campaign achieved 16 of 24 objectives, but did not reach its enrolment target

The campaign had 24 objectives with targets for semester one. TAFE set these targets using a combination of previous experience, corporate objectives and brand surveys.

The overall objective of the combined semester one and two campaigns was to support TAFE in achieving its 2018 total enrolment target of 549,636. TAFE's semester one target was 361,350, which it did not achieve. This indicates that the campaign was not fully effective.

The campaign achieved 11 of its 16 output objectives. The output targets related to TAFE's media placements and its ability to reach an audience efficiently. TAFE tracked progress against many of the campaign's output objectives daily. TAFE altered its media channels throughout the campaign, meaning that some of the output objectives were not met because TAFE decided to focus on alternative media channels. The campaign also achieved all seven of its outcome objectives. The outcome objectives related to changing the public perception of TAFE.

TAFE's initial media plan allowed for efficient media placement

During the peer review process, DPC provides advice to agencies about the time they should allow to ensure cost-efficient media placement. For example, DPC advises that agencies book television advertising six to 12 weeks in advance and radio advertising two to eight weeks in advance.

While TAFE's initial media plan allowed sufficient time between the approval of the campaign and its launch, a delay in receiving final approval for the campaign meant TAFE could not purchase media placements until two months later than planned. Most purchases still remained within DPC's recommended timeframes, but Indigenous television advertisements and metropolitan out-of-home advertisements both fell outside DPC's recommended time periods by one week. These delays did not affect TAFE's efficiency.

TAFE re-used many creative materials, achieving some cost-savings

Rather than commissioning new creative materials, TAFE re-used many creative materials from the previous campaign and supplemented these with a selection of new creative materials. TAFE advised that this led to a cost saving of approximately $130,000.

TAFE sought quotes from suppliers on the government's Advertising and Digital Communication Services prequalification scheme for two creative material contracts. These contracts covered updates to existing materials and a selection of new materials.

The campaign's cost-benefit analysis did not comply with three requirements of the Guidelines

The Act requires an agency to conduct a cost-benefit analysis (CBA) if the cost of an advertising campaign is likely to exceed $1.0 million. The Guidelines set out the requirements of this CBA. TAFE did not comply with three of these requirements, outlined in Exhibit 5.

Exhibit 5: Guideline requirements for CBAs with which TAFE did not comply
6.2 The cost benefit analysis must isolate the additional costs and benefits attributable to the advertising campaign itself compared to the base-case of not-advertising.
6.3 The cost benefit analysis must specify the extent to which the expected benefits could be achieved without advertising.
6.4 The cost benefit analysis must outline what options other than advertising could be used to successfully implement the program and achieve the program benefits and a comparison of their costs.
Source: NSW Government Advertising Guidelines (2012).

In this circumstance, section 6.2 of the Guidelines required the CBA to identify the number of enrolments TAFE would expect if it did not advertise. TAFE advised us that it is not possible to say what this scenario would look like because there had always been some degree of advertising; however, this argument is not reflected in the CBA.

TAFE used 2017 as the baseline in the CBA. In 2017, TAFE spent $13.2 million on advertising. As such, the CBA was only able to isolate the impact of the increased expenditure rather than the impact of the campaign's entire $19.5 million expenditure. TAFE advised that 2017 had the most reliable state-wide data and this contributed to the decision to use it as the baseline.
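The consequence of the baseline choice can be illustrated with a short sketch. Only the two expenditure figures ($19.5 million campaign cost and $13.2 million 2017 advertising spend) come from the report; the benefit values are hypothetical placeholders used purely to show how the two baselines attribute different costs and benefits to the campaign.

```python
# Illustrative sketch of how the CBA baseline choice changes what is attributed
# to the campaign. Expenditure figures are from the report; all benefit figures
# below are HYPOTHETICAL assumptions for illustration only.

campaign_spend = 19.5e6   # 2018 campaign expenditure (from the report)
baseline_spend = 13.2e6   # 2017 advertising expenditure (from the report)

benefit_with_campaign = 30.0e6   # assumed benefit with the 2018 campaign (hypothetical)
benefit_2017_baseline = 22.0e6   # assumed benefit at 2017 advertising levels (hypothetical)
benefit_no_advertising = 10.0e6  # assumed benefit with zero advertising (hypothetical)

# Using 2017 as the baseline isolates only the effect of the extra spend...
incremental_spend = campaign_spend - baseline_spend              # $6.3 million
incremental_benefit = benefit_with_campaign - benefit_2017_baseline

# ...whereas section 6.2 of the Guidelines requires a comparison with not
# advertising at all, which attributes the full $19.5 million and the full
# benefit of advertising to the campaign.
full_spend = campaign_spend
full_benefit = benefit_with_campaign - benefit_no_advertising

print(f"2017 baseline: ${incremental_spend / 1e6:.1f}m spend attributed to campaign")
print(f"No-advertising baseline: ${full_spend / 1e6:.1f}m spend attributed to campaign")
```

Under the 2017 baseline, only $6.3 million of expenditure (and the corresponding incremental benefit) is attributed to the campaign, which is why the CBA could not isolate the impact of the entire $19.5 million program.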

During the audit, TAFE sought advice from NSW Treasury on whether a 2017 baseline was appropriate, and NSW Treasury advised that it was. However, TAFE did not receive this advice prior to writing the CBA and did not include commentary on the choice of baseline in the CBA. In any case, this advice would not be sufficient to fulfil the requirements of the Guidelines.

The CBA did not comply with sections 6.3 and 6.4 of the Guidelines. The CBA briefly considered the impact of spending the campaign budget directly on new training courses, but there was no sustained analysis of this option. TAFE staff advised that there are no realistic alternatives to advertising for achieving the campaign's objectives. However, we did not see analysis supporting this conclusion in the documents provided to us.

The campaign CBA could have better demonstrated cost effectiveness if it had considered alternative media mixes as outlined in NSW Treasury's 'Cost Benefit Analysis Framework for Government Advertising and Information Campaigns'. This would also have been consistent with the Handbook.

TAFE made one inaccurate claim in its advertising and overstated a second

The Guidelines set out rules regarding the content of a government advertising campaign. Exhibit 6 sets out one of the principles with which agencies must comply.

Exhibit 6: Guidelines' requirement for accuracy
The following principles apply to the style and content of government advertising campaigns:
  • Accuracy in the presentation of all facts, statistics, comparisons and other arguments. All statements and claims of fact included in government advertising campaigns must be able to be substantiated.
Source: NSW Government Advertising Guidelines (2012).

TAFE made one inaccurate claim in its advertising and overstated a second.

In some campaign creative material, TAFE claimed that 78 per cent of its own graduates are employed after training (Exhibit 15 in Appendix 2). According to the National Centre for Vocational Education Research, 78 per cent of New South Wales Vocational Education and Training (VET) graduates (i.e. from all training providers) are employed after training. The result for TAFE graduates is 70.4 per cent.

One of the campaign's television advertisements refers to TAFE as ‘Australia's most reputable education provider’. This statement referred to a survey of current TAFE students who were asked where they would consider studying in future: TAFE, University or a private college. The current TAFE students selected TAFE by a large margin. The limited scope of TAFE's student survey and its results do not support the claim that it is ‘Australia's most reputable education provider’.

DPC did not consistently communicate the transitional arrangements for the Brand Guidelines, and as a result much of TAFE's creative material did not comply at campaign launch

On 7 August 2017, the government released the NSW Government Brand Guidelines (Brand Guidelines), setting out how agencies use the NSW Government logo. The Brand Guidelines replaced the Branding Style Guide which had been in place since September 2015. Some agencies were exempt from using the Branding Style Guide and the introduction of the new Brand Guidelines required these agencies to apply for a new exemption.

TAFE had recently commenced the peer review process for this campaign when the Brand Guidelines were released. TAFE was exempt from the requirements of the Branding Style Guide and as such the material which TAFE was planning to re-use in the new campaign did not contain the NSW Government logo.

Communication about how long agencies had to make themselves compliant with the Brand Guidelines was unclear. On 11 August 2017, the Chair of the Cabinet Standing Committee on Communication and Government Advertising (the Committee) sent a letter to the Secretary of the Department of Industry informing him that the Department must update all its material to be compliant with the Brand Guidelines ‘as soon as practicable within an 18-month transition period’. The Department of Industry advised TAFE that new advertising would need to be immediately compliant; however, it was not clear whether this included materials which agencies were re-using from previous campaigns. DPC advised the audit team that it expected re-used materials to be compliant when agencies launched new campaigns. DPC provided this advice to some agencies but did not communicate it more broadly. We could not source evidence that DPC provided this advice to TAFE.

In September 2017, DPC ran workshops to explain the transitional arrangements for the changes in the Brand Guidelines; however, these did not specifically address the transitional timeframes for new advertising campaigns.

The Department of Industry, on behalf of TAFE, applied to the Committee for approval to co-brand the TAFE logo with the NSW Government logo. This was approved in October 2017. The requirements for co-branding are in Exhibit 7.

Exhibit 7: Co-branding in the NSW Government Brand Guidelines

Co-branding partners the agency logo with the NSW Government logo. The NSW Government logo must always be presented as the dominant or lead brand. The Brand Guidelines provide a template in which the NSW Government logo is placed on the left and the agency logo on the right, with a dividing line between them.

Unsolicited proposal process for the lease of Ausgrid

In October 2016, the NSW Government accepted an unsolicited proposal from IFM Investors and AustralianSuper to lease 50.4 per cent of Ausgrid for 99 years. The deal followed the Federal Government’s rejection of two bids from foreign investors, for national security reasons.

A performance audit of the lease of Ausgrid has found shortcomings in the unsolicited proposal process. Releasing the audit findings today, the Auditor-General for New South Wales, Margaret Crawford said ‘this transaction involved a $20 billion asset owned by the people of New South Wales. As such, it warranted strict adherence to established guidelines’.

Ausgrid is a distributor of electricity to eastern parts of Sydney, the Central Coast, Newcastle and the Hunter Region.

In June 2014, the then government announced its commitment to lease components of the state's electricity network as part of the Rebuilding NSW plan. Implementation of the policy began after the government was re-elected in 2015. Between November 2015 and August 2016, the NSW Government held a competitive tender process to lease 50.4 per cent of Ausgrid for 99 years. The NSW Government abandoned the process on 19 August 2016 after the Australian Treasurer rejected two bids from foreign investors, for national security reasons. That day, the Premier and Treasurer released a media statement clarifying the government's objective to complete the transaction via a competitive process in time to include the proceeds in the 2017–18 budget.

On 31 August 2016, the state received an unsolicited proposal from IFM Investors and AustralianSuper to acquire an interest in Ausgrid under the same terms proposed by the state during the tender process. In October 2016, the government accepted the unsolicited proposal. 

This audit examined whether the unsolicited proposal process for the partial long-term lease of Ausgrid was effectively conducted and in compliance with the government’s 2014 Unsolicited Proposals: Guide for Submission and Assessment (Unsolicited Proposals Guide or the Guide). 

The audit focused on how the government-appointed Assessment Panel and Proposal Specific Steering Committee assessed key requirements in the Guide that unsolicited proposals must be demonstrably unique and represent value for money. 

Conclusion

The evidence available does not conclusively demonstrate the unsolicited proposal was unique, and there were some shortcomings in the negotiation process, documentation and segregation of duties. That said, before the final commitment to proceed with the lease, the state obtained assurance that the proposal delivered value for money. 

It is particularly important to demonstrate unsolicited proposals are unique, in order to justify the departure from other transaction processes that offer greater competition, transparency and certainty about value for money.

The Assessment Panel and the Proposal Specific Steering Committee determined the Ausgrid unsolicited proposal was unique, primarily on the basis that the proponent did not require foreign investment approval from the Australian Treasurer, and the lease transaction could be concluded earlier than through a second tender process. However, the evidence that persuaded the Panel and Committee did not demonstrate that no other proponent could conclude the transaction in time to meet the government’s deadline. 

It is not appropriate to determine an unsolicited proposal is unique because it delivers an earlier outcome than possible through a tender process. The Panel and Committee did not contend, and it is not evident, that the unsolicited proposal was the only way to meet the government’s transaction deadline.

The evidence does not demonstrate that the proponent was the only party that would not have needed foreign investment approval to participate in the transaction. It also does not demonstrate that the requirement for foreign investment approval would have reduced the pool of foreign buyers to the degree that it would be reasonable to assume none would emerge. 

The Panel, Committee and financial advisers determined that the final price represented value for money, and that retendering offered a material risk of a worse financial outcome. However, an acceptable price was revealed early in the negotiation process, and doing so made it highly unlikely that the proponent would offer a higher price than that disclosed. The Department of Premier and Cabinet (DPC) and NSW Treasury were not able to provide a documented reserve price, bargaining strategy or similar which put the negotiations in context. It is not evident that the Panel or Committee authorised, justified or endorsed negotiations in advance. 

Key aspects of governance recommended by the Guide were in place. Some shortcomings relating to role segregation, record keeping and probity assurance weakened the effectiveness of the unsolicited proposal process adopted for Ausgrid.

The reasons for accepting that the proposal and proponent were unique are not compelling.

The Unsolicited Proposals Guide says the 'unique benefits of the proposal and the unique ability of the proponent to deliver the proposal' must be demonstrated. 

The conclusion reached by the Panel and Committee that the proposal offered a ‘unique ability to deliver (a) strategic outcome’ was primarily based on the proponent not requiring foreign investment approval from the Australian Treasurer, and allowing the government to complete the lease transaction earlier than by going through a second tender process. 

It is not appropriate to determine an unsolicited proposal is unique because it delivers an earlier outcome than possible through a tender process. The Panel and Committee did not contend, and it is not evident, that the unsolicited proposal was the only way to meet the government’s transaction deadline.

The evidence does not demonstrate that the proponent was the only party that would not have needed foreign investment approval to participate in the transaction. Nor does it demonstrate that the requirement for foreign investment approval would have reduced the pool of foreign buyers to the degree that it would be reasonable to assume none would emerge. 

That said, the Australian Treasurer’s decision to reject the two bids from the previous tender process created uncertainty about the conditions under which he would approve international bids. The financial advisers engaged for the Ausgrid transaction informed the Panel and Committee that:

  • it was not likely another viable proponent would emerge soon enough to meet the government’s transaction deadline
  • the market would be unlikely to deliver a better result than offered by the proponent
  • going to tender presented a material risk of a worse financial result. 

The Unsolicited Proposals Guide says that a proposal to directly purchase or acquire a government-owned entity or property will generally not be unique. The Ausgrid unsolicited proposal fell into this category. 

Recommendations:
DPC should ensure future Assessment Panels and Steering Committees considering a proposal to acquire a government business or asset:

  • recognise that when considering uniqueness they should: 
    • require very strong evidence to decide that both the proponent and proposal are the only ones of their kind that could meet the government’s objectives 
    • give thorough consideration to any reasonable counter-arguments against uniqueness.
  • rigorously consider all elements of the Unsolicited Proposals Guide when determining whether a proposal should be dealt with as an unsolicited proposal, and document these deliberations and all relevant evidence
  • do not use speed of transaction compared to a market process as justification for uniqueness.

The process to obtain assurance that the final price represented value for money was adequate. However, the negotiation approach reduced assurance that the bid price was maximised.

The Panel and Committee concluded the price represented value for money, based on peer-reviewed advice from their financial advisers and knowledge acquired from previous tenders. The financial advisers also told the Panel and Committee that there was a material risk the state would receive a lower price than offered by the unsolicited proposal if it immediately proceeded with a second market transaction. 

The state commenced negotiations on price earlier than the Guide says it should have. Early disclosure of a price that the state would accept reduced the likelihood of achieving a price greater than this. DPC says the intent of the early price discussion was to quickly establish whether the proponents could meet the state’s benchmark, rather than spending more time and resources on a proposal which had no prospect of proceeding.

DPC and NSW Treasury were not able to provide a documented reserve price, negotiation strategy or similar which put the negotiations and price achieved in context. It was not evident that the Panel or Committee authorised, justified or endorsed negotiations in advance. However, the Panel and Committee endorsed the outcomes of the negotiations. 

The negotiations were informed by the range of prices achieved for similar assets and the specific bids for Ausgrid from the earlier market process.

Recommendations:
DPC should ensure any future Assessment Panels and Steering Committees considering a proposal to acquire a government business or asset:

  • document a minimum acceptable price, and a negotiating strategy designed to maximise price, before commencing negotiations
  • do not communicate an acceptable price to the proponent before the negotiation stage of the process, and then do so only as part of a documented bargaining strategy.

Key aspects of governance recommended by the Guide were in place, but there were some shortcomings around role segregation, record keeping and probity assurance.

The state established a governance structure in accordance with the Unsolicited Proposals Guide, including an Assessment Panel and Proposal Specific Steering Committee. The members of the Panel and Steering Committee were senior and experienced officers, as befitted the size and nature of the unsolicited proposal. 

The separation of negotiation, assessment and review envisaged by the Guide was not maintained fully. The Chair of the Assessment Panel and a member of the Steering Committee were involved in negotiations with the proponent. 

DPC could not provide comprehensive records of some key interactions with the proponent or a documented negotiation strategy. The absence of such records means the Department cannot demonstrate engagement and negotiation processes were authorised and rigorous. 

The probity adviser reported there were no material probity issues with the transaction. The probity adviser also provided audit services. This is not good practice. The same party should not provide both advisory and audit services on the same transaction.

Recommendations:
DPC should ensure any future Assessment Panels and Steering Committees considering a proposal to acquire a government entity or asset:
  • maintain separation between negotiation, assessment and review in line with the Unsolicited Proposals Guide
  • keep an auditable trail of documentation relating to the negotiation process
  • maintain separation between any probity audit services engaged and the probity advisory and reporting services recommended in the current Guide.

Progress and measurement of the Premier's Priorities

The Premier’s Implementation Unit uses a systematic approach to measuring and reporting progress towards the Premier’s Priorities performance targets, but public reporting needed to improve, according to a report released today by the Auditor-General of NSW, Margaret Crawford.

The Premier of New South Wales has established 12 Premier’s Priorities. These are key performance targets for government.

The 12 Premier's Priorities
  • 150,000 new jobs by 2019

  • Reduce the volume of litter by 40 per cent by 2020

  • 10 key projects in metro and regional areas to be delivered on time and on budget, and nearly 90 local infrastructure projects to be delivered on time

  • Increase the proportion of NSW students in the top two NAPLAN bands by eight per cent by 2019

  • Increase the proportion of women in senior leadership roles in the NSW Government sector from 33 to 50 per cent by 2025 and double the number of Aboriginal and Torres Strait Islander people in senior leadership roles in the NSW Government sector, from 57 to 114

  • Increase the proportion of young people who successfully move from Specialist Homelessness Services to long-term accommodation to more than 34 per cent by 2019

  • 61,000 housing completions on average per year to 2021

  • Reduce the proportion of domestic violence perpetrators reoffending by 25 per cent by 2021

  • Improve customer satisfaction with key government services every year, this term of government to 2019

  • Decrease the percentage of children and young people re-reported at risk of significant harm by 15 per cent by 2020

  • 81 per cent of patients through emergency departments within four hours by 2019

  • Reduce overweight and obesity rates of children by five percentage points by 2025


Source: Department of Premier and Cabinet, Premier’s Priorities website.

Each Premier’s Priority has a lead agency and minister responsible for achieving the performance target.

The Premier’s Implementation Unit (PIU) was established within the Department of Premier and Cabinet (DPC) in 2015. The PIU is a delivery unit that supports agencies to measure and monitor performance, make progress toward the Premier’s Priorities targets, and report progress to the Premier, key ministers and the public.

This audit assessed how effectively the NSW Government is progressing and reporting on the Premier's Priorities.

Conclusion

The Premier’s Implementation Unit (PIU) is effective in assisting agencies to make progress against the Premier’s Priorities targets. Progress reporting is regular, but transparency to the public is weakened by a lack of information about specific measurement limitations and about the relationship of the targets to broader government objectives.

The PIU promotes a systematic approach to measuring performance and reporting progress towards the Premier’s Priorities performance targets. Public reporting would be improved with additional information about the rationale for choosing specific targets to report on broader government objectives. The data used to measure the Premier’s Priorities comes from a variety of government and external datasets, some of which have known limitations. These limitations are not revealed in public reporting, and only some are revealed in progress reported to the Premier and ministers. This limits the transparency of reporting.

The PIU assists agencies to avoid unintended outcomes that can arise from prioritising particular performance measures over other areas of activity. The PIU has adopted a collaborative approach to assisting agencies to analyse performance using data, and helping them work across organisational silos to achieve the Premier’s Priorities targets.

Data used to measure progress for some of the Premier’s Priorities has limitations which are not made clear when progress is reported. This reduces transparency about the reported progress. Public reporting would also be improved with additional information about the relationship between specific performance measures and broader government objectives.

The PIU is responsible for reporting progress to the Premier, key ministers and the public. Agencies provide performance data and some play a role in preparing progress reports for the Premier and ministers. For 11 of the Premier's Priorities, progress is reported against measurable and time-related performance targets. For the infrastructure priority, progress is reported against project milestones.

Progress of some Priorities is measured using data that has known limitations, which should be noted wherever progress is reported. For example, the data used to report on housing completions does not take housing demolitions into account, and therefore overstates the contribution of this performance measure to housing supply. This known limitation is not explained in progress reports or on the public website.

Data used to measure progress is sourced from a mix of government and external datasets. Updated progress data for most Premier’s Priorities is published on the Premier’s Priorities website annually, although reported to the Premier and key ministers more frequently. The PIU reviews the data and validates it through fieldwork with front line agencies. The PIU also assists agencies to avoid unintended outcomes that can arise from prioritising single performance measures. Most, but not all, agencies use additional indicators to check for misuse of data or perverse outcomes.

We examined the reporting processes and controls for five of the Premier’s Priorities. We found that there is insufficient assurance over the accuracy of the data on housing approvals.

The relationships between performance measures and broader government objectives is not always clearly explained on the Premier’s Priority website, which is the key source of public information about the Premier’s Priorities. For example, the Premier’s Priority to reduce litter volumes is communicated as “Keeping our Environment Clean.” While the website explains why reducing litter is important, it does not clearly explain why that particular target has been chosen to measure progress in keeping the environment clean.

By December 2018, the Department of Premier and Cabinet should:

  1. improve transparency of public reporting by:
    • providing information about limitations of reported data and associated performance
    • clarifying the relationship between the Premier’s Priorities performance targets and broader government objectives.
  2. ensure that processes to check and verify data are in place for all agency data sources
  3. encourage agencies to develop and implement additional supporting indicators for all Premier’s Priority performance measures to prevent and detect unintended consequences or misuse of data.

The Premier's Implementation Unit is effective in supporting agencies to deliver progress towards the Premier’s Priority targets.

The PIU promotes a systematic approach to monitoring and reporting progress against a target, based on a methodology used in delivery units elsewhere in the world. The PIU undertakes internal self-evaluation, and commissions regular reviews of methodology implementation from the consultancy that owns the methodology and helped to establish the PIU. However, the unit lacks periodic independent review of its overall effectiveness. The PIU has adopted a collaborative approach, assisting agencies to analyse performance using data and to work across organisational silos to achieve the Premier’s Priorities targets.

Agency representatives recognise the benefits of being responsible for a Premier's Priority and speak of the value of being held to account and having the attention of the Premier and senior ministers.

By June 2019, the Department of Premier and Cabinet should:

  1. establish routine collection of feedback about PIU performance including:
    • independent assurance of PIU performance
    • opportunity for agencies to provide confidential feedback.

Shared services in local government

Local councils need to properly assess the performance of their current services before considering whether to enter into arrangements with other councils to jointly manage back-office functions or services for their communities. This is one of the recommended practices for councils in a report released today by the Auditor-General for New South Wales, Margaret Crawford. ‘When councils have decided to jointly provide services, they do not always have a strong business case, which clearly identifies the expected costs, benefits and risks of shared service arrangements’, said the Auditor-General.

Councils provide a range of services to meet the needs of their communities. It is important that they consider the most effective and efficient way to deliver them. Many councils work together to share knowledge, resources and services. When done well, councils can save money and improve access to services. This audit assessed how efficiently and effectively councils engage in shared service arrangements. We define ‘shared services’ as two or more councils jointly managing activities to deliver services to communities or perform back-office functions. 

The information we gathered for this audit included a survey of all general-purpose councils in NSW. Of the 128 councils invited to participate, 67 (52 per cent) responded. Appendix two outlines in more detail some of the results from our survey.

Conclusion

Most councils we surveyed are not efficiently and effectively engaging in shared services. This is due to three main factors.

First, not all surveyed councils are assessing the performance of their current services before deciding on the best service delivery model. Where they have decided that sharing services is the best way to deliver services, they do not always build a business case which outlines the costs, benefits and risks of the proposed shared service arrangement before entering into it.

Second, some governance models used by councils to share services affect the scope, management and effectiveness of their shared service operations. Not all models are subject to the same checks and balances applied to councils, risking transparency and accountability. Councils must comply with legislative obligations under the Local Government Act 1993 (NSW), including principles for their day-to-day operations. When two or more councils decide to share services, they should choose the most suitable governance model in line with these obligations.

Third, some councils we surveyed and spoke to lack the capability required to establish and manage shared service arrangements. Identifying whether sharing is the best way to deliver council services involves analysing how services are currently being delivered and building a business case. Councils also need to negotiate with partner councils and determine which governance model is fit for purpose. Planning to establish a shared service arrangement involves strong project management. Evaluating the arrangements identifies whether they are delivering the expected outcomes. All of these tasks need a specialised skill set that councils do not always have in-house. Resources are available to support councils and to build their capability, but not all councils are seeking this out or considering their capability needs before proceeding.

Some councils are not clearly defining the expected costs and benefits of shared service arrangements. As a result, the benefits from these arrangements cannot be effectively evaluated.

Some councils are entering into shared service arrangements without formally assessing their costs and benefits or investigating alternative service delivery models. Some councils are also not evaluating shared services against baseline data or initial expectations. Councils should base their arrangements on a clear analysis of the costs, benefits and risks involved. They should evaluate performance against clearly defined outcomes.

The decision to share a service involves an assessment of financial and non-financial costs and benefits. Non-financial benefits include being able to deliver additional services, improve service quality, and deliver regional services across councils or levels of government.

When councils need support to assess and evaluate shared service arrangements, guidance is available through organisations or by peer learning with other councils.

The governance models councils use for shared services can affect their scope and effectiveness. Some councils need to improve their project management practices to better manage issues, risks and reporting.

Shared services can operate under several possible governance models. Each governance model has different legal or administrative obligations, risks and benefits. Some arrangements can affect the scope and effectiveness of shared services. For example, some models do not allow councils to jointly manage services, requiring one council to take all risks and responsibilities. In addition, some models may reduce transparency and accountability to councils and their communities.

Regardless of these obligations and risks, councils can still improve how they manage their shared services operations by focusing on project management and better oversight. They would benefit from more guidance on shared service governance models to help ensure they are fit for purpose.
Recommendation
The Office of Local Government should, by April 2019:

Develop guidance which outlines the risks and opportunities of governance models that councils can use to share services. This should include advice on legal requirements, transparency in decisions, and accountability for effective use of public resources.

Regional Assistance Programs

Infrastructure NSW effectively manages how grant applications for regional assistance programs are assessed and recommended for funding. Its contract management processes are also effective. However, we are unable to conclude whether the objectives of these programs have been achieved as the relevant agencies have not yet measured their benefits, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. 

In 2011, the NSW Government established Restart NSW to fund new infrastructure with the proceeds from the sale and lease of government assets. From 2011 to 2017, the NSW Government allocated $1.7 billion from the fund for infrastructure in regional areas, with an additional commitment of $1.3 billion to be allocated by 2021. The NSW Government allocates these funds through regional assistance programs such as Resources for Regions and Fixing Country Roads. NSW councils are the primary recipients of funding provided under these programs.

The NSW Government announced the Resources for Regions program in 2012 with the aim of addressing infrastructure constraints in mining affected communities. Infrastructure NSW administers the program, with support from the Department of Premier and Cabinet.

The NSW Government announced the Fixing Country Roads program in 2014 with the aim of building more efficient road freight networks. Transport for NSW and Infrastructure NSW jointly administer this program, which funds local councils to deliver projects that help connect local and regional roads to state highways and freight hubs.

This audit assessed whether these two programs (Resources for Regions and Fixing Country Roads) were being effectively managed and achieved their objectives. In making this assessment, we answered the following questions:

  • How well are the relevant agencies managing the assessment and recommendation process?
  • How do the relevant agencies ensure that funded projects are being delivered?
  • Do the funded projects meet program and project objectives?

The audit focussed on four rounds of Resources for Regions funding from 2013–14 to 2015–16, as well as the first two rounds of Fixing Country Roads funding in 2014–15 and 2015–16.

Conclusion

Infrastructure NSW effectively manages how grant applications are assessed and recommended for funding. Infrastructure NSW’s contract management processes are also effective. However, we are unable to conclude on whether program objectives are being achieved as Infrastructure NSW has not yet measured program benefits.

While Infrastructure NSW and Transport for NSW managed the assessment processes effectively overall, they have not fully maintained all required documentation, such as conflict of interest registers. Keeping accurate records is important to support transparency and accountability to the public about funding allocation. The relevant agencies have taken steps to address this in the current funding rounds for both programs.

For both programs assessed, the relevant agencies have developed good strategies over time to support councils through the application process. These strategies include workshops, briefings and feedback for unsuccessful applicants. Transport for NSW and the Department of Premier and Cabinet have implemented effective tools to assist applicants in demonstrating the economic impact of their projects.

Infrastructure NSW is effective in identifying projects that are 'at‑risk' and assists in bringing them back on track. Infrastructure NSW has a risk‑based methodology to verify payment claims, which includes elements of good practice in grants administration. For example, it requires grant recipients to provide photos and engages Public Works Advisory to review progress claims and visit project sites.

Infrastructure NSW collects project completion reports for all Resources for Regions and Fixing Country Roads funded projects. Infrastructure NSW intends to assess benefits for both programs once each project in a funding round is completed. To date, no funding round has been completed. As a result, no benefits assessment has been done for any completed project funded in either program.
 

The project selection criteria are consistent with the program objectives set by the NSW Government, and the RIAP applied the criteria consistently. Probity and record keeping practices did not fully comply with the probity plans.

The assessment methodology designed by Infrastructure NSW is consistent with the program objectives and criteria. In the rounds that we reviewed, all funded projects met the assessment criteria.

Infrastructure NSW developed probity plans for both programs which provided guidance on the record keeping required to maintain an audit trail, including the use of conflict of interest registers. Infrastructure NSW and Transport for NSW did not fully comply with these requirements. The relevant agencies have taken steps to address this in the current funding rounds for both programs.

NSW Procurement Board Directions require agencies to ensure that they do not engage a probity advisor who is also engaged elsewhere in the agency. Infrastructure NSW has not fully complied with this requirement: a conflict of interest arose when it engaged the same consultancy to act as both its internal auditor and its probity advisor.

While these infringements of probity arrangements are unlikely to have had a major impact on the assessment process, they weaken the transparency and accountability of the process.

Some councils have identified resourcing and capability issues which impact on their ability to participate in the application process. For both programs, the relevant agencies conducted briefings and webinars with applicants to provide advice on the objectives of the programs and how to improve the quality of their applications. Additionally, Transport for NSW and the Department of Premier and Cabinet have developed tools to assist councils to demonstrate the economic impact of their applications.

The relevant agencies provided feedback on unsuccessful applications to councils. Councils reported that the quality of this feedback has improved over time.

Recommendations

  1. By June 2018, Infrastructure NSW should:
    • ensure probity reports address whether all elements of the probity plan have been effectively implemented.
  2. By June 2018, Infrastructure NSW and Transport for NSW should:
    • maintain and store all documentation regarding assessment and probity matters according to the State Records Act 1998, the NSW Standard on Records Management and the relevant probity plans.

Infrastructure NSW is responsible for overseeing and monitoring projects funded under Resources for Regions and Fixing Country Roads. Infrastructure NSW effectively manages projects to keep them on track; however, it could do more to assure itself that all recipients have complied with funding deeds. Benefits and outcomes should also be measured and reported as soon as practicable after projects are completed, to inform assessment of future projects.

Infrastructure NSW identifies projects experiencing unreasonable delays or higher than expected expenses as 'at‑risk'. After Infrastructure NSW identifies a project as 'at‑risk', it puts processes in place to resolve issues and bring the project back on track. Infrastructure NSW, working with Public Works Advisory regional offices, employs a risk‑based approach to validate payment claims; however, this process should be strengthened. Infrastructure NSW would get better assurance by also conducting annual audits of compliance with the funding deed for a random sample of projects.

Infrastructure NSW collects project completion reports for all Resources for Regions and Fixing Country Roads funded projects. It applies the Infrastructure Investor Assurance Framework to Resources for Regions and Fixing Country Roads at a program level. This means that each round of funding (under both programs) is treated as a distinct program for the purposes of benefits realisation. It plans to assess whether benefits have been realised once each project in a funding round is completed. To date, no funding round has been fully completed. As a result, no benefits realisation assessment has been done for any project funded under either Resources for Regions or Fixing Country Roads. Without project‑level benefits realisation, future decisions are not informed by the lessons from previous investments.

Recommendations

  1. By December 2018, Infrastructure NSW should:
    • conduct annual audits of compliance with the funding deed for a random sample of projects funded under Resources for Regions and Fixing Country Roads
    • publish the circumstances under which unspent funds can be allocated to changes in project scope
    • measure benefits delivered by projects that were completed before December 2017
    • implement an annual process to measure benefits for projects completed after December 2017
  2. By December 2018, Transport for NSW and Infrastructure NSW should:
    • incorporate a benefits realisation framework as part of the detailed application.

Published

Actions for Council reporting on service delivery

Council reporting on service delivery

Local Government
Compliance
Internal controls and governance
Management and administration
Service delivery

New South Wales local government councils could do more to demonstrate how well they are delivering services in their reports to the public, according to a report released today by the Auditor-General for New South Wales, Margaret Crawford. Many councils report activity, but do not report on outcomes in a way that would help their communities assess how well they are performing. Most councils also did not report on the cost of services, making it difficult for communities to see how efficiently services are being delivered. Councils are also not consistently publishing targets to demonstrate what they are striving for.

I am pleased to present my first local government performance audit pursuant to section 421D of the Local Government Act 1993.

My new mandate supports the Parliament’s objectives to:

  • strengthen governance and financial oversight in the local government sector
  • improve financial management, fiscal responsibility and public accountability for how councils use citizens’ funds.

Performance audits aim to help councils improve their efficiency and effectiveness. They will also provide communities with independent information on the performance of their councils.

For this inaugural audit in the local government sector, I have chosen to examine how well councils report to their constituents about the services they provide.

In this way, the report will enable benchmarking and provide improvement guidance to all councils across New South Wales.

Specific recommendations to drive improved reporting are directed to the Office of Local Government, which is the regulator of councils in New South Wales.

Councils provide a range of services which have a direct impact on the amenity, safety and health of their communities. These services need to meet the needs and expectations of their communities, as well as relevant regulatory requirements set by state and federal governments. Councils have a high level of autonomy in decisions about how and to whom they provide services, so it is important that local communities have access to information about how well they are being delivered and meeting community needs. Ultimately councils should aim to ensure that reporting performance is subject to quality controls designed to provide independent assurance.

Conclusion
While councils report on outputs, reporting on outcomes and performance over time can be improved. Improved reporting would include objectives with targets that better demonstrate performance over time. This would help communities understand what services are being delivered, how efficiently and effectively they are being delivered, and what improvements are being made.
To ensure greater transparency on service effectiveness and efficiency, the Office of Local Government (OLG) should work with councils to develop guidance principles to improve reporting on service delivery to local communities. This audit identified an interest amongst councils in improving their reporting and broad agreement with the good practice principles developed as part of the audit.
The Integrated Planning and Reporting Framework (the Framework), which councils are required to use to report on service delivery, is intended to promote better practice. However, the Framework is silent on efficiency reporting and provides limited guidance on how long-term strategic documents link with annual reports produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.
OLG should also work with state agencies to reduce the overall reporting burden on councils by consolidating state agency reporting requirements. 

Councils report extensively on the things they have done, but minimally on the outcomes of that effort, their efficiency, and their performance over time.

Councils could improve reporting on service delivery by more clearly relating the resources needed with the outputs produced, and by reporting against clear targets. This would enable communities to understand how efficiently services are being delivered and how well councils are tracking against their goals and priorities.

Across the sector, a greater focus is also needed on reporting performance over time so that communities can track changes in performance and councils can demonstrate whether they are on target to meet any agreed timeframes for service improvements.

The degree to which councils demonstrate good practice in reporting on service delivery varies greatly between councils. Metropolitan and regional town and city councils generally produce better quality reporting than rural councils. This variation indicates that, at least in the near-term, OLG's efforts in building capability in reporting would be best directed toward rural councils.

Recommendation

By mid-2018, OLG should:

  • assist rural councils to develop their reporting capability.

The Framework, which councils are required to use to report on service delivery, is intended to drive good practice in reporting. Despite this, the Framework is silent on a number of aspects of reporting that should be considered fundamental to transparent reporting on service delivery. It does not provide guidance on reporting efficiency or cost effectiveness in service delivery, and provides limited guidance on how annual reports link with other plans produced as part of the Framework. OLG's review of the Framework, currently underway, needs to address these issues.

Recommendation

By mid-2018, OLG should:

  • issue additional guidance on good practice in council reporting, with specific information on:
    • reporting on performance against targets
    • reporting on performance against outcomes
    • assessing and reporting on efficiency and cost effectiveness
    • reporting performance over time
    • clearer integration of all reports and plans that are required by the Framework, particularly the role of End of Term Reporting
    • defining reporting terms to encourage consistency.

The Framework is silent on inclusion of efficiency or cost effectiveness indicators in reports

The guidelines produced by OLG in 2013 to assist councils to implement the Framework's requirements advise that performance measures should be included in all plans. However, the Framework does not specifically state that efficiency or cost effectiveness indicators should be included as part of this process. This weakness was identified in both the 2012 performance audit report and the Local Government Reform Panel's review of council reporting on service delivery.

The Framework and supporting documents provide limited guidance on reporting

Councils' annual reports provide a consolidated summary of their efforts and achievements in service delivery and financial management. However, OLG provides limited guidance on:

  • good practice in reporting to the community
  • how the annual report links with other plans and reports required by the Framework.

Further, the Framework includes both Annual Reports and End of Term Reports. However, End of Term Reports are published prior to council elections and are mainly a consolidation of the Annual Reports produced during a council’s term. The relationship between the two types of report is not clear.

OLG is reviewing the Framework and guidance

OLG commenced a review of the Framework in 2013, but this was deferred, with work re‑starting in 2017. The revised guidelines and manual were expected to be released in late 2017.

OLG should build on the Framework to improve guidance on reporting on service delivery, including in annual reports

The Framework provides limited guidance on how best to report on service delivery, including in annual reports. It is silent on inclusion of efficiency or cost effectiveness indicators in reporting, which are fundamental aspects of performance reporting. Councils we consulted would welcome more guidance from OLG on these aspects of reporting.

Our consultation with councils highlighted that many council staff would welcome a set of reporting principles that provide guidance to councils, without being prescriptive. This would allow councils to tailor their approach to the individual characteristics, needs and priorities of their local communities.

Consolidating what councils are required to report to state agencies would reduce the reporting burden and enable councils to better report on performance. Comparative performance indicators are also needed to provide councils and the public with a clear understanding of councils' performance relative to each other.

Recommendations

By mid-2018, OLG should:

  • commence work to consolidate the information reported by individual councils to NSW Government agencies as part of their compliance requirements.
  • progress work on the development of a Performance Measurement Framework, and associated performance indicators, that can be used by councils and the NSW Government in sector-wide performance reporting.

Streamlining the reporting burden would help councils improve reporting

The NSW Government does not have a central view of all local government reporting, planning and compliance obligations. A 2016 draft IPART ‘Review of reporting and compliance burdens on Local Government’ noted that councils provide a wide range of services under 67 different Acts, administered by 27 different NSW Government agencies. Consolidating and coordinating reporting requirements would assist with better reporting over time and comparative reporting. It would also provide an opportunity for NSW Government agencies to reduce the reporting burden on councils by identifying and removing duplication.

Enabling rural councils to perform tailored surveys of their communities may be more beneficial than a state-wide survey in defining outcome indicators

Some councils use community satisfaction survey data to develop outcome indicators for reporting. The results from these are used by councils to set service delivery targets and report on outcomes. This helps to drive service delivery in line with community expectations. While some regional councils do conduct satisfaction surveys, surveys are mainly used by metropolitan councils which generally have the resources needed to run them.

OLG and the Department of Premier and Cabinet have explored the potential to conduct state-wide resident satisfaction surveys with a view to establishing measures to improve service delivery. This work has drawn on a similar approach adopted in Victoria. Our consultation with stakeholders in Victoria indicated that the state level survey is not sufficiently detailed or specific to be used as a tool for setting targets that respond to local circumstances, expectations and priorities. Our analysis of reports and consultation with stakeholders suggest that better use of resident survey data in rural and regional areas may support improvements in performance reporting in those areas. Rural councils may benefit more from tailored surveys of groups of councils with similar challenges, priorities and circumstances than from a standard state-wide survey. These could be achieved through regional cooperation between groups of similar councils.

Comparative reporting indicators are needed to enable councils to respond to service delivery priorities of their communities

The Local Government Reform Panel in 2012 identified the need for ‘more consistent data collection and benchmarking to enable councils and the public to gain a clear understanding of how a council is performing relative to their peers’.

OLG commenced work in 2012 to build a new performance measurement Framework for councils which aimed to move away from compliance reporting. This work was also strongly influenced by the approach used in Victoria that requires councils to report on a set of 79 indicators which are reported on the Victorian 'Know your council' website. OLG’s work did not fully progress at the time and several other local government representative bodies have since commenced work to establish performance measurement frameworks. OLG advised us it has recently recommenced its work on this project.

Our consultation identified some desire amongst councils to be able to compare their performance to support improvement in the delivery of services. We also identified a level of frustration that more progress has not been made toward establishment of a set of indicators that councils can use to measure performance and drive improvement in service delivery.

Several councils we spoke with were concerned that the current approaches to comparative reporting did not adequately acknowledge that councils need to tailor their service types, level and mix to the needs of their community. Comparative reporting approaches tend to focus on output measures, such as the number of applications processed, annual library loans and opening hours for sporting facilities, rather than outcome measures. These approaches risk unjustified and adverse interpretations of performance where councils have made a decision based on community consultation, local priorities and available resources. To mitigate this, it is important to:

  • adopt a partnership approach to the development of indicators
  • ensure indicators measure performance, not just level of activity
  • compare performance between councils that are similar in terms of size and location.

It may be more feasible, at least in the short term, for OLG to support small groups of like councils to develop indicators suited to their situation.

Based on our consultations, key lessons from implementing a sector-wide performance indicator framework in Victoria included the benefits of:

  • consolidation of the various compliance data currently being reported by councils to provide an initial platform for comparative performance reporting
  • adopting a partnership approach to development of common indicators with groups of like councils.

Published

Actions for Severance Payments to Special Temporary Employees

Severance Payments to Special Temporary Employees

Premier and Cabinet
Compliance
Internal controls and governance
Management and administration
Workforce and capability

In reviewing both the severance pay guidelines and a sample of payments, we found the guidelines to be clear, and all except two payments were made in accordance with them. In these two cases, the severance payment was stipulated in the employment contract, which guaranteed the STE a minimum of six months' pay on termination, irrespective of length of service.

 

Parliamentary reference - Report number #201 - released 16 June 2010

Published

Actions for Freedom of Information

Freedom of Information

Transport
Premier and Cabinet
Education
Management and administration
Regulation
Service delivery

Freedom of Information (FOI) Coordinators and their staff were supportive of the legislation. However, the agencies examined can do considerably more to fully achieve the intentions of the Act. On the positive side, all three agencies had processes in place to handle requests and had made a number of changes to improve the effectiveness of the FOI process. Fees and charges had also been kept to a minimum: in the majority of cases no processing fees were requested, and where fees were charged they were not unreasonable.

 

Parliamentary reference - Report number #114 - released 28 August 2003