Curtin University


Name (Individual/Organisation)

Curtin University

Responses

Q1. How could the purpose in the ARC Act be revised to reflect the current and future role of the ARC?

For example, should the ARC Act be amended to specify in legislation:
(a) the scope of research funding supported by the ARC
(b) the balance of Discovery and Linkage research programs
(c) the role of the ARC in actively shaping the research landscape in Australia
(d) any other functions?

If so, what scope, functions and role?

If not, please suggest alternative ways to clarify and define these functions.

(a.) and (b.) The purpose of the ARC as stated in the ARC’s recent Strategic Plan is broadly appropriate. The generality of the Act should be preserved so that the ARC can evolve and retain the flexibility to respond to change without the need to amend the legislation.

(c.) The ARC already shapes the research landscape in Australia through the way it calls for, reviews and awards funds for research. A revised system of grant review, using EoIs and confining optimised expert review to high-quality applications, would return researcher time to research, thereby improving Australia’s research outputs. The restrictions on funding of researcher salaries impact negatively on the tertiary education system by transferring the cost of research to universities. The ARC Act should be revised to reflect the desired conduct and outcomes of Australian research as a whole and to specify funding principles that support efficiency and feasible cost-sharing.

Q2. Do you consider the current ARC governance model is adequate for the ARC to perform its functions?

If not, how could governance of the ARC be improved? For example, should the ARC Act be amended to incorporate a new governance model that establishes a Board on the model outlined in the consultation paper, or another model?

Please expand on your reasoning and/or provide alternative suggestions to enhance the governance, if you consider this to be important.

The ARC is, effectively, line-managed by the Minister for Education and is also answerable via Senate Estimates. ARC governance does not currently include any involvement or influence of the research or business community. Curtin University supports involvement of the research community in ARC governance, which could be implemented by converting the ARC Advisory Committee into a more formal governance Board. A Board would contribute a balanced view to the CEO and Minister, providing recommendations and reducing politically based decision-making.

The Terms of Reference for the Board would need careful drafting to avoid conflicts with the Minister’s responsibilities under the Act; the current review, with the Act open for amendment, provides an opportunity to ensure this alignment. The Board should be skills-based and include university executive research leadership (one STEM; one HASS); practising researchers, both senior and early career; industry (research participants and output implementers); research policy expertise (preferably independent of government); and governance expertise. The ARC CEO should not be a member of the Board. Industry not involved in research specification or direct implementation should not be represented.

The ARC Board should report annually and publicly to the sector on the performance of the ARC against expectations via agreed KPIs. The ARC itself should report publicly on its activities and outcomes.

Q3. How could the Act be improved to ensure academic and research expertise is obtained and maintained to support the ARC?

How could this be done without the Act becoming overly prescriptive?

The ARC Act does not currently specify that appropriate research expertise is a prerequisite for the CEO and Executive Director (ED) positions. Curtin strongly supports the requirement that ARC leadership have significant research experience and demonstrated academic excellence.

To contribute further research expertise and grow the pipeline of appropriately qualified personnel, EDs could be supported by a small team of part-time (0.2 - 0.3 FTE) researchers who remain in their universities to carry out research while having a line of responsibility to the ED. This would provide a greater critical mass of research capability in the ARC and ready access to research leadership and advice.

The 'alumni' of the ARC, academics drawn from the sector over time, could be formalised as a mentoring and advising group for current staff.

Q4. Should the ARC Act be amended to consolidate the pre-eminence or importance of peer review?

Please provide any specific suggestions you may have for amendment of the Act, and/or for non-legislative measures.

Yes, the ARC Act should be amended to explicitly require that peer review drives funding decisions. This could be reinforced by the ARC Act explicitly identifying peer review as a component of quality, for example “Making high quality recommendations, based on rigorous peer review, to the Minister...”. Consideration could be given to amending the ARC Act to limit the ability of the Minister to overrule the expert determinations made by the ARC. An explicit statement protecting the decisions of the sitting College of Experts panels would be welcome.

There also needs to be careful definition of “peer”, a concept that attracts significant criticism today. The tension between ‘domain expertise’ and ‘research expertise’ in review needs to be addressed: too few applications are reviewed by those with technical expertise in the area, and too many opinions come from outside it. This emphasis should be reversed. There could be more use of international referees, who would likely need to be remunerated; the higher cost is justified by better outcomes. A shift to more disciplinary review could be complemented by a stronger early triage process. Far too much expert time is spent assessing the bottom 40-50% of applications. Given the low scheme success rates, it should be far easier to identify these proposals early and avoid lengthy and expensive reviews of them. Similarly, the very best (~10%) can be relatively easily identified. Deep expertise and time can then be focussed on the middle 30-40%.

The term “peer” also needs to be clarified for interdisciplinary research. The ARC has called out the need to increase interdisciplinary research for quite some time but has failed to define and implement effective review for interdisciplinary research proposals. This should not be acceptable in the modern research environment where the call to meet society’s most pressing challenges is clear.

Finally, researchers need to be acknowledged and rewarded for undertaking review. The ARC should make information on the quantity and quality of reviews available to the university employing the researchers who conduct them. The ARC should also require evidence, as part of the process of selecting reviewers, that review is rewarded tangibly in universities. This will help drive rapid transformation in how universities acknowledge the importance of review and will increase review quality, since more quality time will be spent on review if it is formally recognised and rewarded.

Q5. Please provide suggestions on how the ARC, researchers and universities can better preserve and strengthen the social licence for public funding of research?

Public communications campaigns regarding the yield to Australia (and the world) of publicly funded research should be part of the performance criteria for the ARC/NHMRC/MRFF. The aim would be to counter the narrative that researchers are out of touch with society and politically naïve. Instead, hopefully as part of the Universities Accord, Australians will come to value their universities and research institutions as nationally significant assets.

At the core of the social licence of the ARC (and ARC-funded research) is the National Interest Test (NIT) statement, since this is the public statement provided to the Senate and is written with the intention of being understood by the broader Australian public. There are currently two major flaws in the approach to the NIT statement: the first relates to its place within the grant application process, the second to a particularly narrow view of what constitutes ‘national interest’. On the first point, Curtin welcomes the changes announced on 1 December 2022 to integrate the NIT into the overall peer review process, with certification from the DVCR that ‘the statement clearly and simply explains the benefit of the research to Australia’. This will enable a more holistic assessment of the benefit of the proposed research, while reducing the administrative burden and delays that have been associated with recent NIT processes.

The second issue with the current NIT process is the subtle but important underlying assumption about what is meant by the phrase ‘national interest’. Recent ministers have made public statements to the effect that research in the national interest is research about Australia. Yet research is often nationally significant because Australian researchers contribute to fields of international concern, regardless of whether that research is about Australia. Furthermore, the idea that nationally significant research should be subjected to a ‘pub test’ indicates a low tolerance for research that might be critical of Australian society, culture or government. Criticality is the lifeblood of research; it is in the DNA of academic culture. There is a danger that, under a ‘pub test’ approach, research that critically questions aspects of Australia will not be funded.

Of course, it is important that the broad base of taxpayers is convinced that public funds are employed in ways that are in Australia’s national interest. It is, however, a mistake to patronise the broader Australian public with the assumption that they can only understand national interest when research is about, and uncritically celebrates, Australia.

Q6. What elements of ARC processes or practices create administrative burdens and/or duplication of effort for researchers, research offices and research partners?

We acknowledge the ARC’s efforts to evolve into a progressive and efficient government agency. Historically, the automation of elements of application submission and associated activities in RMS has improved the efficiency of grant application and administration processes. This was not an easy task considering the ever-growing complexity of the regulatory environment.

It is now time for the next iteration in efficiency. Harmonisation of funding application processes across all Commonwealth funding would be very helpful, for example a single portal for the ARC/NHMRC/MRFF. Currently, research application formats change frequently in response to external drivers and identified process improvements. Maintaining a consistent format, keeping the sections the same and not changing the structure and nature of the ROPE so frequently would significantly reduce the academic workload associated with grant applications. Adopting in other schemes the two-step process used for Centres of Excellence, a shorter EoI followed by an invitation to submit a full application, could lessen the burden of completing long application forms for schemes with low success rates. Modifying milestone and end-of-project forms could also enable the collection of information for future research excellence, engagement and impact assessments.

'Guidelines and associated documents'
Although the ARC has streamlined the scheme-specific and general information packages in recent years, the sheer volume of guidelines, instructions, agreements, RMS instructions and policy documents with which Research Offices (ROs) and researchers must be familiar is substantial. This presents an impediment to early career researchers, external overseas-based applicants and new RO staff members.

Researchers and RO staff need to review multiple documents (mainly grant guidelines, the ITA and FAQs) before submission of applications. Consideration might be given to simplifying these into a single document/resource and publicly releasing scheme guidance at least a month before the grant opening date. It is noted that, as the regulatory environment becomes more complex (for instance with the imposition of Foreign Interference (FI) requirements), the task of designing a simplified and condensed information package will become more challenging. However, this should not prevent the ARC from seeking to reduce the administrative burden in this area.

One example of the lack of consolidated information is the set of requirements underpinning the preparation of budgets. Currently this information is scattered across guidelines, instructions, agreements and other sources. We have worked to address this hurdle by creating an internal budget guidance document; however, better consolidation of this information at the ARC end would be welcomed.

'Onerous requirements made of partners (including industry and international) at the time of some submissions'
Certification by industry partners, especially for Linkage Program schemes, is too onerous at the application stage given the low success rate. The ARC should consider a simplified certification for industry and international partners at application, with detailed certification introduced as a condition of the award acceptance process. This way, not all applications have to go through the detailed certification process.

'Timing'
Short timeframes place a heavy burden on research officers and applicants. ARC processes would be greatly improved by providing greater guidance ahead of rounds, expediting the assessment process, and giving more useful feedback to unsuccessful applicants. Expediting the Linkage rounds in particular, given the involvement of industry partners, would help maintain positive relationships with those partners, as would quarterly submission rounds.

The concept of the "Special Research Initiative" could be expanded to dedicated calls (e.g. thematic Discovery Projects calls). Doing this would bring the ARC’s functioning closer to that of the European Research Council and foster collaborative initiatives to address chosen priorities.

'Prescriptive financial requirements, variations, and approvals (especially if budget requests are not fully funded)'
Too much focus is currently placed on the budget, which almost always needs to be revised when (increasingly) it is not fully funded, creating a duplication of effort. Efficiency could be achieved by utilising the ARC Industry Fellowships budget model for other schemes too.

'Duplication of national security requirements and processes outside the University Foreign Interference Taskforce'
The national security obligations currently place a heavy burden on universities, and it is difficult to meet the requirements in a timely way. It is challenging to undertake effective due diligence for researchers and industry partners who are not based at the Administering Organisation. This administrative burden could be refocussed onto the organisation that employs each respective contributor.

'The Appeals process'
One persistent issue is the excessive number of researchers who believe that the current criteria allow an assessor’s comments to be dismissed. In our experience, the ARC has always dismissed these appeals and advised the applicants to address any matters arising from the assessors’ comments in the Rejoinder.

Curtin University’s suggestion is to review the current instructions related to appeals against assessors’ comments, to ensure correct interpretation of the funder’s intention. This would remove a significant burden from the ARC and ROs, eliminate false hopes of a successful appeal, and direct the efforts of both researchers and ROs into formulating quality rejoinders.

'Variations to Funding Agreement (VFA)'
Although the ARC has attempted in the past to streamline the VFA process, it still lacks administrative simplicity and flexibility and, importantly, is burdened with significant delays. Typically, we are required to provide a range of supporting documents, including consent from named researchers and organisations, for the purposes of a VFA. Currently, the instructions do not provide clear guidance on how to deal with significant delays or lack of cooperation from organisations, predominantly those based overseas. This needs to be addressed by introducing realistic criteria that can be met in a timely manner by the RO.

Noting that the ARC is not without resource challenges and capacity constraints, the significant delays in response times to ROs regarding submitted VFAs are unreasonable and impact on the progression of funded projects. This also applies to VFAs related to the transfer of projects to and/or from the Administering Organisation, as some recent examples clearly indicate.

'Certification – General Requirements'
One key area for improvement is certifications. Currently we have a plethora of certifications including, but not limited to: letters of support or approval from the DVCR or their delegate; written evidence from researchers; certification embedded in each proposal; institutional approval of PhD equivalency; salary equivalency for overseas applicants; and eligibility exemptions.

Although the importance of certifications and letters of approval is well understood by researchers and administrators, the current review presents an opportunity to take a fresh look at this aspect. A good, recent example of the shift in the ARC’s approach is the handling of a Request Not to Assess (RNTA) that seeks to exclude members of the College of Experts. Previously this type of RNTA required a letter of approval from the DVCR or their delegate; now the applicant provides justification within the RMS RNTA module linked to the application, with input from the scheme coordinator where appropriate.

Process improvements for certification include:
• a shift from paper documentation and approvals to online requirements in RMS where practicable;
• a reduction of required approvals and consolidation into one institutional, electronic approval within RMS, with corresponding changes to the online format of the proposal; and
• more consideration given to institutionally managed verifications/certifications by Administering Organisations.

Q7. What improvements could be made:

(a) to ARC processes to promote excellence, improve agility, and better facilitate globally collaborative research and partnerships while maintaining rigour, excellence and peer review at an international standard?

(b) to the ARC Act to give effect to these process improvements, or do you suggest other means?

Please include examples of success or best practice from other countries or communities if you have direct experience of these.

The ARC does not readily promote or enable international collaboration. Applications can include international research partners, but the scope is small and driven by individual researchers. There are few opportunities to leverage ARC funding into global-scale funding opportunities with international funding bodies. Many significant research problems are global in nature, and mechanisms for co-funding bigger research initiatives (e.g. with the NSF, ERC, etc.) should be considered. The few schemes that do exist are limited to specific partner countries and are not particularly well funded. Curtin University recommends changes to the ARC Act to allow funding of larger-scale international initiatives.

Shortening the timeframe between submission and funding announcements could assist in attracting or securing high-quality fellowship applicants from overseas or returning Australians. Reinstating the travel fellowship for international collaboration in Discovery Projects or providing this as a separate scheme could enhance international reputation and build skills in Australia.

Q8. With respect to ERA and EI:

(a) Do you believe there is a need for a highly rigorous, retrospective excellence and impact assessment exercise, particularly in the absence of a link to funding?

(b) What other evaluation measures or approaches (e.g. data driven approaches) could be deployed to inform research standards and future academic capability that are relevant to all disciplines, without increasing the administrative burden?

(c) Should the ARC Act be amended to reference a research quality, engagement and impact assessment function, however conducted?

(d) If so, should that reference include the function of developing new methods in research assessment and keeping up with best practice and global insights?

(a.) Do you believe there is a need for a highly rigorous, retrospective excellence and impact assessment exercise, particularly in the absence of a link to funding?
Curtin University supports an excellence and impact assessment exercise. There is a need for an evidence base to support the narrative of the value of the research and innovation pathways underpinned by investments (public and private) within the sector. The descriptors of the assessments should be rigorous and transparent. The assessments should be promoted as timely rather than retrospective, since by their nature all such assessments must look backwards.

Linking such an assessment to funding is challenging because of the retrospective nature of the exercise. Funding could be supplied to an institution that had strength in an area some years previously and may no longer have that strength; the converse is also true. It is recognised that this conundrum exists for all funding allocations that are not project-based. If funding were allocated to individuals outside of projects on this basis, it would prevent funding a researcher at an institution yet to show strength in an area, limiting both the freedom of researchers to pursue their passions and interests and the novel developments arising from new ideas and collaborations.

The excellence and impact assessment exercise has little value in comparing universities; a range of international ranking systems already fulfils that function more or less well. A fit-for-purpose excellence and impact assessment exercise should instead be used to demonstrate the value of the research sector as a whole, to educate the public and to justify appropriate levels of investment.

(b.) What other evaluation measures or approaches (e.g. data driven approaches) could be deployed to inform research standards and future academic capability that are relevant to all disciplines, without increasing the administrative burden?

Numerous newly emerging data analysis approaches are being developed that would allow machine-learning-based assessments of outputs from Australian universities. An approach that removes the burden of the excellence and impact assessment exercise from universities would be welcome, as long as appropriate checks are conducted to ensure the validity of the result.

The focus on FOR codes creates a bespoke nomenclature relevant only for the purposes of the ERA and EI assessments. The resulting requirement for FOR code and assessment-specific expertise places a costly administrative burden on organisations. An alternative would be for organisations to demonstrate their own achievements in a manner that suits their outputs and existing infrastructure. A selection of required common measures (e.g. category-normalised citation indices, translation into policy or practice) could enable cross-institutional comparisons if this were considered necessary.

Regardless, Curtin University recommends that the ARC provide Higher Education Providers (HEPs) with the annual journal (and other output platform) ERA data. If the ANZSRC framework (FOR and SEO) were retained, the ORCID, HEP affiliation and journal (output) FOR code could be supplied. The HEPs would then confirm and return the annual collation, verifying the ORCIDs as their staff and confirming the FOR code allocations. Such an approach would reduce the burden of allocations and prevent manipulation of the placement of outputs in FOR codes. We recommend that no more than 5 - 10% of outputs be moved out of a code without a formal request and justification.

For outputs that have been assessed by peer review, further methods may be needed. Some “peer review” FORs have substantial numbers of publications and journal outputs. If there is a pathway for researchers in a peer-review domain to publish in research journals, then these outputs should be recorded (and assessed) by both publication and peer review.

If it is considered necessary to use the excellence and impact assessment data to compare universities, it is recommended that the annual data on total volume of outputs be reported (on a moving three-year average) at the 2-digit code level for all institutions. Each institution should have the 4-digit FOR codes that constitute more than 2% of its outputs identified annually and assessed, to demonstrate the scale of outputs and the characteristics of institutions.

To inform future academic capability, the ARC could provide confidential excellence and impact assessment data on an annual basis. This could be done for the last three to four years so that a trajectory analysis could be conducted by the HEP to inform longer-term strategies and investments.

Using the volume and scale data, the proportion of shared outputs assigned to multiple institutions could be reported, allowing reporting on academic capability networks as reflected by outputs locally and internationally.

(c.) Should the ARC Act be amended to reference a research quality, engagement and impact assessment function, however conducted?

The ARC is one of several national funding bodies and specifically excludes health-related research. If this function is embedded in the Act, the relationship to all other government research funding entities and schemes should be referenced to ensure the interactions are known and understood. Drafting the Act to ensure this over time may be challenging. One approach would be to merge all the government entities and schemes into a harmonised system, at a minimum the ARC, NHMRC and CRC Program. This may be more an issue for the Accord but is worth noting here.

(d.) If so, should that reference include the function of developing new methods in research assessment and keeping up with best practice and global insights?

If reference to assessment is included in the ARC Act, it needs to be explicit that the nature of any assessments is open to change to accommodate global best practice.

Q9. With respect to the ARC’s capability to evaluate research excellence and impact:

(a) How can the ARC best use its expertise and capability in evaluating the outcomes and benefits of research to demonstrate the ongoing value and excellence of Australian research in different disciplines and/or in response to perceived problems?

(b) What elements would be important so that such a capability could inform potential collaborators and end-users, share best practice, and identify national gaps and opportunities?

(c) Would a data-driven methodology assist in fulfilling this purpose?

Whatever approach is adopted for ERA, it would be positive for the sector if individual researchers could be allocated their contributions to research quality in a manner consistent with university assessment. The sampling of research outputs is a mechanism that can be easily adapted for this approach. There is significant movement of researchers between Australian universities, and it would be very helpful if every researcher had a set of research performance indicators that are transportable and directly comparable. Without this, h-indices, normalised paper citations and publication counts become the de facto measures, and these are known to be weak indicators of performance. The inclusion of an impact statement in a regularised format would be similarly useful.

The separation of impact from excellence is problematic. At one level, the terminology matters: excellence in impact is difficult to achieve and should be recognised as such. What is being assessed in ERA is the quality of research outputs, and it should be so named. Overall excellence should be derived from an assessment of the quality and quantity of research outputs and the quality and quantity of the impact of those outputs. This is quite a departure because it brings into question the appropriateness of a five-year limit on the assessment of publication quality. It should be entirely reasonable to assess the quality of much older research outputs; we know how long impact can take, particularly from fundamental research, but we simply pretend this is not true when assessing the quality of outputs.

On the flip side, the quantity of outputs that are not recognised by the discipline and cannot be traced to impact should also be assessed. It should not be acceptable to examine only the best outputs. The effort expended in producing vast quantities of poor-quality outputs (or outputs not yet recognised) should be quantified and accounted for. Over time, this should stabilise; it should not be acceptable to have an increasing rate of outputs that are neither recognised by peers nor traceable to impact. Of course, some great outputs will be lost along the way in terms of recognition, but this will be a very small number compared to the vast quantity we produce that will never be excellent by any measure. Such a change cannot be achieved by tuning up and automating the current ERA.

It is worth stressing that the ARC’s current capability is underpinned by a significant administrative burden on universities. While expanding the capability to demonstrate benefit of research would be valuable, this needs to be within the context of minimised burden on over-stretched university administrative systems.

The ARC could look at better ways of showcasing impactful research. The published highly rated engagement and impact narratives are probably not very appealing to the general public, especially as the use of links to evidence and outputs was not allowed in the submissions. Providing this function, even after the assessment, would create a pathway for an audience to engage with complementary demonstrations of impact, including infographics, cartoons, video links or other formats.

The ARC could analyse the ‘Approach to Impact’ submissions and develop resources showcasing best practice so that, as a nation, we can all learn from one another.

Data-driven methods may be easier to use but require caveats. Each discipline is different, and data must be explicitly linked to a narrative or they are easily misinterpreted, especially in disciplines producing non-traditional outputs.

Q10. Having regard to the Review’s Terms of Reference, the ARC Act itself, the function, structure and operation of the ARC, and the current and potential role of the ARC in fostering excellent Australian research of global significance, do you have any other comments or suggestions?

The ARC has stated in its most recent Strategic Plan that it should have more influence in shaping the Australian research system. There is a need to clarify what is intended. Currently, universities are independently governed entities. Universities make decisions regarding researcher careers, strategic investments (including in research infrastructure), meeting onerous government financial leverage requirements, the nexus with teaching, mentoring, diversity and inclusion, international interactions and collaborations, and so on. What role does the ARC expect to have in influencing this current system? How is that role defined in terms of the full ecosystem, e.g. the NHMRC, RDCs, CSIRO, AIMS and the universities themselves?

Research community involvement in the ARC needs some consideration. Given the shift in the ARC’s emphasis of purpose, it is important to create a social contract between the research community and the ARC, analogous to the Australian community’s sense of “Our ABC”. The shift in the ARC’s emphasis from a source of research funding to a system influencer, if not driver, necessitates a new relationship with the researcher community. It is important that a culture of “Our ARC” be actively pursued.

If the scope of the research funding supported by the ARC were to be specified in amended legislation, it should ensure that humanities and social science (HASS) research is protected in all grant programs. This is particularly important since recent letters of expectation, such as that issued by the former acting minister Stuart Robert (6 December 2021), directed 70% of Linkage Projects funding to the National Manufacturing Priorities (NMP) and the remaining allocation to low-emission, agriculture, defence and technology priorities, effectively excluding HASS research from the Linkage Projects program. The humanities’ demands on the Linkage Projects program are low relative to other fields of research, yet the program provides very important support for humanities research to link with state and national cultural institutions. These linkages significantly amplify the broader impact of humanities research within the general population by bringing sometimes challenging and complex ideas to the public through institutions whose core business is engagement and impact. To effectively cut off the humanities from Linkage Projects overlooks the important role of the program in enabling impact and engagement for the humanities and social sciences. Enshrining HASS research within the legislation across all programs might provide the protection that humanities research clearly needs.

The ARC budget has been shrinking in real terms. Could the Government urgently consider higher investment to address the low success rates for many schemes and to fund more of the projects and people deemed ‘fundable’ by its esteemed assessors?

To maintain research capability within universities, ARC Fellowships (not Laureates) could provide 3 (or 4) years of full-time support followed by 3 years of part-time support (75%, 50%, 25%) to assist universities to integrate the fellows into the continuing workforce. The host university would be required to commit the “mirror” salary for the last three years, i.e. 25%, 50%, 75%. The current system is a sequence of “highs and lows” with significant negative consequences between fellowships; some of our most successful up-and-coming researchers go through trauma at the end of successful fellowships. This should be seen as unacceptable. The tapering system would mean fewer fellowships overall but far better stewardship of those awarded.

Lastly, we would welcome a mechanism for the ARC – perhaps in partnership with the Productivity Commission – to receive independently prepared data on a periodic basis regarding the value of funded research to the wellbeing of Australians and to our economy. This would provide ongoing evidence regarding the value of research to Australia, not only in economic terms but also in relation to measures such as health, quality of life and equity.

Submission received

14 December 2022

Publishing statement

Yes, I would like my submission to be published and my name and/or the name of the organisation to be published alongside the submission. Your submission will need to meet government accessibility requirements.