Anonymous #24

Name (Individual/Organisation)

Anonymous #24

Responses

Q1. How could the purpose in the ARC Act be revised to reflect the current and future role of the ARC?

For example, should the ARC Act be amended to specify in legislation:
(a) the scope of research funding supported by the ARC
(b) the balance of Discovery and Linkage research programs
(c) the role of the ARC in actively shaping the research landscape in Australia
(d) any other functions?

If so, what scope, functions and role?

If not, please suggest alternative ways to clarify and define these functions.

1) The ARC's purpose should be to administer funding for non-medical research (though see 3 below) that is _primarily_ focussed on creating new knowledge. It should not be the funding mechanism for industry projects whose aim is simply to provide existing/recent knowledge to industry for the purposes of "experimental development" (as per the definition referred to in the consultation paper). It seems to me that a substantial amount of Linkage funding could be classified this way. This sort of research should be done, of course, but not funded via the ARC: it already receives Government funding through many other mechanisms and, in my view, should be funded primarily by industry (as industry tends to reap the subsequent profits anyway).

2) In principle, I think it would be strange for the Act to specify a balance between Discovery and Linkage. However, it has been clearly demonstrated that politicians interpret these schemes as "risky and long-term" vs. "solid, short-term and more easily seen as 'creating jobs'". Therefore, it seems the only way to head this off is to legislate a balance between them. I think that specifying a _minimum_ 70/30 balance for Discovery/Linkage is appropriate.

3) I think the ARC should have to agree with the NHMRC on what constitutes medical vs. non-medical research, so that there is always a joint definition that, as far as possible, allows researchers and research offices to understand which agency any proposed research falls under.

However, clearly there's a risk that some research, by its nature, straddles the remits of both agencies and might therefore not be funded for that reason alone. This should be considered when the ARC and NHMRC agree on their definitions.

4) I do not understand why the ARC sees itself as "shaping the research landscape in Australia". That should not be its function or aim. Its aim and function should be to "fairly and efficiently administer Government funding for research". Certainly the researchers the ARC contracts – its Executive Directors and College of Experts – should be shaping the research landscape, but only by making decisions based on the excellence of the research proposed by researchers.

Q2. Do you consider the current ARC governance model is adequate for the ARC to perform its functions?

If not, how could governance of the ARC be improved? For example, should the ARC Act be amended to incorporate a new governance model that establishes a Board on the model outlined in the consultation paper, or another model.

Please expand on your reasoning and/or provide alternative suggestions to enhance the governance, if you consider this to be important.

I do not consider the current ARC governance model to be adequate. I believe this has been demonstrated time and again in the last 10 years.

Yes, there should be a Board. The Board (not the Minister) should appoint the CEO. Under any model, an arrangement in which the Minister appoints the person who is then supposed to advise the Minister cannot be trusted.

As far as possible, apolitical appointment of Board members is also important. Term limits, with members' terms staggered so that some remain while others rotate off, can counter political interference.

I think university executives (Vice Chancellors, Deputy Vice Chancellors, Pro Vice Chancellors, Deans etc.) should _not_ be allowed to serve as ARC Board members. It is impossible for them to be perceived as acting on behalf of anyone outside their own universities. As far as possible, Board members should be highly regarded _practising_ researchers across a very broad range of disciplines. I am not against non-researchers – e.g. industry representatives – being on the Board, but I think the research-background members of the Board should elect them. This will avoid the stacking of politicians' favoured industry representatives on the ARC Board. I think Board members should be paid for their ARC work at slightly more than their normal salary level, for some portion of their time, and should have to reduce their FTE at their substantive university jobs by the same portion.

I think the Board should determine what proportion of funding goes into the Discovery and Linkage programs. I also think the Board should have to approve the amount of funding provided to different schemes, and the creation or discontinuation of schemes (especially "Special Initiatives", which often waste huge amounts of time and offer little funding). The Board should also be able to direct the ARC to make changes to its processes when there is an evident need.

Q3. How could the Act be improved to ensure academic and research expertise is obtained and maintained to support the ARC?

How could this be done without the Act becoming overly prescriptive?

I think the ARC has made a mistake over the last 10 years in (seemingly) refusing to engage with the vast bulk of the research community and those most affected by its decisions and practices, i.e. actual researchers. The ARC says it "consults the sector" very often, but what does that mean? All too often – almost always – it only means it has asked the peak body for research management in Australia (I forget their acronym) or research office staff. While it's good to consult those stakeholders, the ARC seems to go out of its way to avoid researchers themselves! This seems to be why there's now a fairly adversarial public relationship between the ARC and researchers, and this need not continue. Engagement requires money. But, to me, it would be money very well spent if a small increase in the ARC's operating budget were given over to active communication and question-answering with researchers themselves (and their universities), instead of the current stance of "we will only communicate with a researcher when they make an FOI request or we want them to review a proposal for us".

What's become clear in the last 5 to 10 years is that ARC practice frequently diverges from the norms of the disciplines themselves (battles over fonts, for goodness' sake, preprints, various ineligibility scandals, poor and confused communications etc.). It's currently not possible to guarantee that the CEO, or those in charge of various processes, will consult those within the disciplines – the Executive Directors or the College of Experts – about possible changes. If possible, some form of legislative requirement to do so would be welcome.

It is unclear to me what input the Executive Directors have into ARC practice and process. They may very well be adding a lot of value, but that is difficult to see or know from the outside. Perhaps the ARC could promote a better understanding of their function and who they represent, beyond their current public-facing duties of presenting standard ARC talks to universities on rotation.

Q4. Should the ARC Act be amended to consolidate the pre-eminence or importance of peer review?

Please provide any specific suggestions you may have for amendment of the Act, and/or for non-legislative measures.

The Minister should only ever be allowed to refuse to fund a grant recommended by the ARC if there are actual (not perceived, but actual, demonstrable) national security or criminal concerns. Anything else is blatant political interference. This must be made absolutely clear in the legislation, otherwise I think this ARC Review will fall short of its most important task.

Ideally, the provision for Ministerial refusal to fund a recommended grant should be removed from the legislation completely. I don't see anything stopping the Minister from simply approving funding in all cases, including any where there are suspected national security or criminality concerns, and then, once those matters have been appropriately investigated, stopping the ARC from forming a funding contract with the recipients. That process would surely then be open to appropriate appeals etc. – basically, there'd be an avenue for natural justice. And the contract could obviously include clauses under which funding would be withdrawn if, at any stage, national security or criminal concerns were appropriately proven. But there simply does not need to be any legislative room for political vetoes of grants of any kind.

If any grants are not funded by the Minister, or any are withdrawn because of proven national security problems or criminality, then the Minister must table appropriately anonymised information about it in the Parliament and answer questions on it.

The Act should also be amended to remove the possibility of providing the ARC with any other means of rejecting peer-review rankings from the College of Experts. It's my understanding that the National Interest Test process was, until recently, such that the CEO could simply decide whether or not to fund a proposal based on their own appraisal of the NIT statement. This is entirely inappropriate. The legislation should prohibit any form of subjective decision-making by ARC staff, outside the peer-review judgements/rankings from the Selection Advisory Committees, that would lead to a proposal not being recommended for funding.

Q5. Please provide suggestions on how the ARC, researchers and universities can better preserve and strengthen the social licence for public funding of research?

My understanding is that after the ARC Review's consultation paper was released, the ARC announced a change to the NIT which means the NIT statement will be used within the normal peer-review process as part of the "Benefit" criterion for assessing/scoring a proposal. This is a good change. However, the NIT has always been, and remains even now, entirely redundant. If it is written for the public, as I think it still needs to be, then it is not useful for the assessors, who are not the public. So why bother having it in the proposal at all? There should be no "test" here. The "test" is already in the "Benefit" section of the Project Description in ARC grants.

Instead, the NIT should simply be a "Statement of public benefits", written for the public and, importantly, written only after grant funding has been awarded. Writing it to a good standard, with sign-off from the university, can simply be made a condition of the funding contract with the ARC. There is no need to introduce more and more into the proposal stage when, really, the only information ever read by the public relates to funded grants.

Of course researchers receiving public money should explain to the public what they will do with it – and in plain language that most members of the public can understand. Normally you can't stop them! But there's no need to "front load" this when it shouldn't be part of the assessment of the proposal (which it absolutely shouldn't be) and when 80% of proposals aren't funded. Do it after funding is allocated. Simple. And there's definitely no need for this to be so prescriptive – and nothing like the over-the-top, absurd levels of NIT-picking we've seen this year (another own-goal by the ARC). All this would help improve the social licence for public funding of research, with no wasted time at all and the minimum of bureaucratic fuss.

Q6. What elements of ARC processes or practices create administrative burdens and/or duplication of effort for researchers, research offices and research partners?

The fact that the ARC refuses to engage directly with researchers is an enormous source of duplication and wasted effort. Every university research office has to spend staff time channelling pretty much the same set of questions from its researchers to the ARC, and the ARC has to channel answers back to the university research offices, which channel them back to the researchers. Instead, the ARC's operational funding should be increased so it can engage more directly with those asking the questions (mostly researchers themselves, but sometimes research office staff too) and, to as great an extent as possible, make that information public so that the same question need not be addressed multiple times. There is currently duplication of effort across 40 universities and who knows how many researchers!

There is a huge waste of effort in trying to understand whether a specific researcher is eligible. I simply cannot understand why a simple eligibility tool, within RMS, is not made available so that any individual researcher can immediately check whether they are eligible to apply for a particular scheme. That university research office staff spend their time doing this manually is obviously a waste of their time when the answer is just algorithmic and can easily be included in an online tool.
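To illustrate how mechanical such a check could be, here is a minimal sketch (in Python) of the kind of rule-based tool RMS could expose. The scheme rules, thresholds and field names below are invented placeholders for illustration, not actual ARC eligibility criteria.

from dataclasses import dataclass

@dataclass
class Researcher:
    # Hypothetical fields an RMS profile might already hold
    years_since_phd: float
    active_arc_projects: int
    holds_arc_fellowship: bool

def eligibility_problems(r: Researcher) -> list[str]:
    """Return the reasons the researcher is ineligible (empty list = eligible).
    Every rule is a simple, testable check; the limits are illustrative only."""
    problems = []
    if r.years_since_phd > 5:
        problems.append("More than 5 years post-PhD (hypothetical limit).")
    if r.active_arc_projects >= 2:
        problems.append("Already holds the maximum number of active ARC projects (hypothetical limit).")
    if r.holds_arc_fellowship:
        problems.append("Already holds an ARC fellowship (hypothetical rule).")
    return problems

# A researcher (or research office) could run this at any time before applying.
me = Researcher(years_since_phd=4.0, active_arc_projects=1, holds_arc_fellowship=False)
print(eligibility_problems(me) or "Eligible under these illustrative rules.")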

Researchers and research offices spend enormous amounts of time getting budgets "right", down to the dollar, and then see their funding request cut by sometimes 50% or more. So what is the point of all that careful budgeting? Budgets should be standardised, with fixed costs for the most-requested items simply locked in. For example, "International conference" could simply be a standard budget item with a fixed (reasonable) level of funding attached, requiring no justification. Then, if someone really wants more than the standard amount, they can ask for it and provide a justification (e.g. a higher-than-standard conference fee; an expensive location etc.). There is simply no need to justify these things in as much detail as is currently required. New settings might be needed for hiring, say, postdocs, as they typically have different salaries at different universities. One possibility: get universities to provide their staff costs at the start of each year and lock those in, so that applicants simply select an academic level for each year of someone they intend to hire; the costs would then be automatic. Basically, cut down on the complexity and amount of justification required in budgets to greatly simplify and streamline budgeting.
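As a rough illustration of how locked-in cost tables could make budgeting automatic, here is a short sketch (in Python). The salary figures, on-cost rate and standard item amounts are invented for illustration and do not reflect any actual ARC or university rates.

# Hypothetical cost tables locked in at the start of each year (all figures invented).
SALARY_TABLE = {
    "University A": {"Level A": 105_000, "Level B": 125_000},
    "University B": {"Level A": 110_000, "Level B": 130_000},
}
ON_COST_RATE = 0.30                                   # illustrative on-cost multiplier
STANDARD_ITEMS = {"International conference": 4_000}  # illustrative fixed amount

def personnel_cost(university: str, level: str, years: int) -> int:
    """Cost of one hire at a given level, straight from the locked-in table."""
    annual = SALARY_TABLE[university][level] * (1 + ON_COST_RATE)
    return round(annual * years)

# The applicant just selects a university, level and duration; no written justification needed.
total = personnel_cost("University A", "Level B", years=3) + STANDARD_ITEMS["International conference"]
print(f"Requested budget: ${total:,}")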

International PIs who will not receive funding currently face a ridiculous level of bureaucracy to be listed on a grant. It's actually embarrassing to invite any international person onto an ARC grant because it's so much work for them. The Australian CIs end up ghost-writing everything for them. Remove this completely. A simple "Send invitation letter to PI" button in RMS could send the relevant information to the PI, who accepts the invite (which is equivalent to them "signing" onto the application), and that's it. Then the CIs need to justify why that PI is involved within the Project Description, and that's all.

It seems there are many people made ineligible because they breached the medical research eligibility criteria. While I'm not affected by this personally, I know people who have put huge effort into making sure they don't fall foul of this and yet are judged ineligible even after getting excellent reviews and submitting rejoinders etc. This really is cruel, let alone a waste of effort. The eligibility requirements need to be clearer, and easier to _test_, well _before_ putting in an application.

The proposals are far, far too long. They have become bloated with huge amounts of extra material that simply is not required – should not be required – for any assessor to judge the proposal (or "break a tie" with another proposal). It is simply absurd that, to get a $400k grant to hire a single person for 3 years, one has to write a proposal longer than 100 pages, only 10 of which actually matter (the Project Description). Every effort to reduce the length of proposals needs to be taken. This will make them easier to write, easier to assess, and easier to administer (within the ARC). The new ARC Board should decide what length of proposal is reasonable (given the small scale of funding involved a lot of the time) and help the ARC reduce proposal length by at least 50%.

Proposal instructions are also currently highly prescriptive. Stop it with the rules about margins and fonts, for goodness' sake, and about exactly what must or must not be addressed under each prescribed section heading, and let people use their presentation skills to create a proposal that gets the main points across most effectively.

Finally, it is simply strange that Australian researchers are not sent an email informing them of their proposal outcome. Instead, they have to monitor social media to see when the ARC or their friends post that the RMS system now shows the results, public-notice-board fashion. It's an important positive or negative moment for so many people, and different people want to experience it differently. A simple email would allow that. I also support suggestions for a fixed 2-day embargo on publicity about outcomes for every scheme after results are emailed to applicants. Let people celebrate amongst themselves, or commiserate with their colleagues, or keep their outcome to themselves for a couple of days if they wish, rather than having the results announced publicly at a random time that doesn't give them the space and time for the result to sink in. This isn't difficult. It's just an email and rules (which already exist) about how embargoes work.

Q7. What improvements could be made:

(a) to ARC processes to promote excellence, improve agility, and better facilitate globally collaborative research and partnerships while maintaining rigour, excellence and peer review at an international standard?

(b) to the ARC Act to give effect to these process improvements, or do you suggest other means?

Please include examples of success or best practice from other countries or communities if you have direct experience of these.

ARC processes:

1) The processing of grant applications takes far too long. It's often 9 to 10 months between the due date for applications and outcomes being announced. By that time, industry partners have moved on, research fields have moved on, and the proposed research is no longer current. The processing time needs to be less than 6 months and, ideally, about 4 or 5 months. It was about this long for Discovery Projects this year (which closed much later than usual), so it is clearly possible to achieve (though perhaps the ARC's operations budget needs a small increase to ensure this, but it's money very well spent!).

2) Applications need to be far shorter. At least 50% shorter than currently. Project Descriptions shouldn't be cut back much, but really don't need to be longer than 8 pages (in Discovery Projects) once all the unnecessary stuff is taken out (e.g. prescriptive things to address in each section; the "Communication of results" section is a pointless waste of space that isn't assessed anyway etc.). But there can be huge amounts saved elsewhere:
– Budgets should be standardised so that extra written justifications aren't needed in most cases.
– Most of all, the ROPE (i.e. CV) section for each researcher should be no more than 2 pages. Currently it runs to 7 pages covering a huge array of material, with ultra-prescriptive headings/sections. Let people tell a story (briefly!).
– Remove the "10 career-best publications" and the full publication list and replace them with "10 most relevant research outputs" or similar. Nothing prescriptive.

3) There should be much greater emphasis in the applications – and in the instructions to assessors – on the "O" (for "opportunity") in the ROPE (i.e. CV) sections of proposals. I think most assessors overlook this.

4) Much clearer and easier-to-access instructions for assessors. Currently, assessors have to really look for the instructions they're supposed to follow. I suspect most assessors never access them (the ARC should know this from webpage statistics). The instructions should be extremely simplified, giving a one-page summary of what assessors are to assess, and what they are NOT to assess. The instructions should emphasise fairness and the "O" (opportunity) in "ROPE".

5) PIs should not have to complete a ROPE. At most, there should be a simple statement of their suitability and confirmation that they've agreed to participate at such-and-such an FTE rate etc., but not the full ROPE information. Currently that's a massive barrier to participation by PIs.

6) Currently, the ARC is not doing a good enough job of ensuring the quality of peer reviews. _Many_ reviewers are providing entirely unprofessional statements in their reviews, or their entire reviews are clearly coloured by misconceptions, biases or simply their own "reckons" about the proposed research. While I don't think it's possible to remove the anonymity of reviewers in the process, I think that anonymity allows reviewers to behave entirely inappropriately, and without much diligence, in many cases. I am very supportive of, and was a signatory to, the suggestions for improving ARC grant processes in Prof. Jodie Bradby's 2021 "pre-budget submission": https://treasury.gov.au/sites/default/files/2021-05/171663_bradby_professor_jodie.pdf. This includes the suggestion to allow applicants to flag unprofessional statements in reviews, or entire reviews, and have the College of Experts "carriages" of the application assess that claim, not the ARC staff as is currently the case. CoE members within discipline panels should be encouraged to communicate about these cases, to ensure some consistency of practice, or a third "moderator" CoE carriage could be asked to review these decisions within a discipline panel or even across panels. For example, I mentored a DECRA applicant in the last few years about whom one assessor claimed (without any evidence at all) that all their ideas were taken from their PhD supervisor and that, on those grounds, no funding should be awarded. Very clearly this review should not have counted, as it was highly biased and no evidence of any kind was offered. And yet the ARC staff declined to recognise this and remove it from consideration. This is a gross failure of quality control! It needs to be addressed, as it is a VERY significant means by which biases are introduced into the peer-review screening of proposals. I also agree with the above pre-budget submission that actual penalties for unprofessional reviews need to be levied, or threatened, to stamp out poor behaviour. It really is bad enough that this needs to be done.

7) I think we need to abandon the idea that most detailed assessors are unbiased if we want a truly fair assessment system. Because they are anonymous, they can say whatever they like, and very few such reviews are ever removed by the ARC. They can also say nice things and provide poor scores. There is very clear anecdotal evidence for this, and I have had the same experience (not just for my own applications but for mentees' applications), but unfortunately the ARC forbids any public discussion of particular instances, so it's always going to be hard to get more concrete evidence (lack of evidence here should not be considered evidence of absence). I think several things should be done about this:
– Only show detailed assessors the Project Description, which should be anonymised (see 8 below).
– Implement a fairer "unprofessional review" flagging system (see 6 above).
– Ask more reviewers to review each proposal (I know it's hard to do, but only having to review the Project Description will make it less of a burden), across a wider range of disciplines, and ask the reviewers to provide a self-assessed weighting of how expert they are in the field of the proposed research. Then outlier scores could be removed before forming a less biased, weighted average score for the detailed assessment (a rough sketch of this idea follows the list below).
– Show the individual assessor scores at the rejoinder stage, not just their comments. Currently comments can be "damning with faint praise" but applicants do not even realise this and don't know there's something to address in their response. It would also help applicants flag bad-faith reviewers to the College of Experts. It would also mean that some people would simply not have to bother submitting a rejoinder at all because they were clearly too low-ranked to have a chance of funding.
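As flagged in the list above, here is a rough sketch (in Python) of how outlier removal and expertise weighting could be combined for detailed-assessor scores. The trimming rule (drop scores more than 1.5 standard deviations from the mean) and the self-assessed expertise weights are assumptions for illustration, not an existing ARC method.

import statistics

def weighted_detailed_score(scores, expertise_weights, z_cut=1.5):
    """Combine detailed assessors' scores into one score.

    scores: raw scores from each detailed assessor
    expertise_weights: each assessor's self-assessed expertise (e.g. 1-5)
    z_cut: drop scores more than z_cut standard deviations from the mean
           (an illustrative outlier rule, not an ARC rule)
    """
    mean = statistics.mean(scores)
    sd = statistics.pstdev(scores)
    kept = [(s, w) for s, w in zip(scores, expertise_weights)
            if sd == 0 or abs(s - mean) <= z_cut * sd]
    total_weight = sum(w for _, w in kept)
    return sum(s * w for s, w in kept) / total_weight

# Five assessors; the score of 20 (from the least expert assessor) is dropped as an outlier.
print(weighted_detailed_score([85, 90, 88, 20, 92], [4, 5, 3, 1, 5]))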

8) I think serious consideration should be given to dual-anonymous review. This is where proposals are anonymised so that the reviewers do not know who the applicants are. Even if it is possible to work out who the applicants are (because, say, the field is very small or the project is very niche), removing the identity information has a strong effect on reducing biases. This has been found in, for example, the Hubble Space Telescope proposal process (https://iopscience.iop.org/article/10.1088/1538-3873/ab6ce0), as well as the European Southern Observatory telescope time allocation process (e.g. https://www.nature.com/nature-index/news-blog/following-nasa-lead-researchers-targeting-gender-bias-science-instrument-time). I appreciate that anonymisation is not possible for the ROPE section, but then only very well trained and experienced assessors – i.e. the College of Experts members – should be allowed to see the non-anonymised sections.

9) Currently, the ARC includes a requirement that one must act as an assessor for ARC grants if one wins an ARC grant. While I think the ARC should definitely ask, and even insist on this, it also has the power to withdraw funding from those who refuse, even if they refuse as a protest against ARC behaviour or as a matter of conscience. The ARC has done this in the past, and I think it's quite wrong. I do not think the ARC should have this power. It is very clearly against academic freedom and should not be allowed.

10) The RMS system should have automated functions to check the eligibility of applications. That many eligibility issues are only caught late in the process and revealed to the applicant only when outcomes are released is almost criminal. I am told that submission systems in the Netherlands and Denmark assess eligibility automatically (and manually for some aspects) before the application is submitted. Applicants are encouraged to "pre-submit" their application for eligibility and compliance checking _by the funding agency_ (not their university research offices) as often as possible, so that eligibility issues are caught early and resolved by the final application date. Alternatively, eligibility checks could be done by the ARC early in the process, with the possibility of applicants amending applications if necessary to comply with the rules in, say, the first month after submission. But there's got to be a better way than the current system!


The ARC Act should be changed to:

1) Force the ARC to publish the expected outcome announcement date _before a scheme opens_. Even though the ARC has said it will give a 2-week window on its website for all schemes, that is not currently true (as of 2022-12-07), especially for the Discovery schemes (e.g. Discovery Projects 2024 is already open but has an announcement date of "Fourth Quarter 2023", and the "ARC Scheme calendar 2022-23 PDF" only shows dates through June 2023).

2) Outcome announcement dates should be a single day, not a 2-week window. If this is somehow impossible or undesirable to specify in the Act, then the Act should specify the maximum window size; otherwise the ARC will simply grow this window over time for its own convenience.

3) To assist with scheduling suggestions (1) and (2) above, the Minister should have to approve grants within 1 week of receiving recommendations from the ARC. As there should be no Ministerial vetoes allowed, there is no reason for this step to take any longer than a week.

4) Force the announcement of ARC grant outcomes to be done by the ARC and not the Minister. A politician should not be able to "take credit" for grants awarded via the ARC's peer-review process, or, as we saw a few years ago, get local MPs or senators on their own political side to "hand out" the funding, with a photo opportunity, to individual universities. The award of ARC grants should be seen as absolutely apolitical, and the outcomes for all grants in any given scheme must be announced simultaneously.

5) Limit the processing time for grants (between grant application deadlines and announcement dates) to 6 months. Really, processing should be faster than that, but having this in legislation will give researchers real certainty about how long they might have to wait for decisions to be made about their futures.

6) Force the ARC to release much more (appropriately anonymised) data. Currently the ARC releases so little data about grant applications and outcomes that the research community struggles to make any sensible analysis of it to inform future improvements. For example, I see no compelling reason why the ARC cannot release data on the number of applications in each FOR code, with the gender and academic level of all CIs included (and including non-binary gender(s)). This should be done _immediately_ when the outcomes are announced. Currently, researchers have to guess at the numbers of applications in different fields, and FOR-code-level success rate data is sometimes not released for years and contains very little information of any use. Perhaps this, and other transparency measures, should be contained in a "Transparency" section of the Act?

Q8. With respect to ERA and EI:

(a) Do you believe there is a need for a highly rigorous, retrospective excellence and impact assessment exercise, particularly in the absence of a link to funding?

(b) What other evaluation measures or approaches (e.g. data driven approaches) could be deployed to inform research standards and future academic capability that are relevant to all disciplines, without increasing the administrative burden?

(c) Should the ARC Act be amended to reference a research quality, engagement and impact assessment function, however conducted?

(d) If so, should that reference include the function of developing new methods in research assessment and keeping up with best practice and global insights?

a. No.

b. I'm not sure.

c. I don't see a good reason for doing that. It locks the ARC into performing this function, when I do not see the ARC as having that function (or, currently, the expertise or capacity for it).

d. Not applicable given my answer to c.

Q9. With respect to the ARC’s capability to evaluate research excellence and impact:

(a) How can the ARC best use its expertise and capability in evaluating the outcomes and benefits of research to demonstrate the ongoing value and excellence of Australian research in different disciplines and/or in response to perceived problems?

(b) What elements would be important so that such a capability could inform potential collaborators and end-users, share best practice, and identify national gaps and opportunities?

(c) Would a data-driven methodology assist in fulfilling this purpose?

a. I don't think the ARC really has much expertise or capability in this regard. I don't think they should be charged with demonstrating the value and excellence of Australian research. That should be up to researchers and universities, not an administrative Government agency.

b. I don't know.

c. Clearly, but I've got no idea what that would be.

Q10. Having regard to the Review’s Terms of Reference, the ARC Act itself, the function, structure and operation of the ARC, and the current and potential role of the ARC in fostering excellent Australian research of global significance, do you have any other comments or suggestions?

I think the most profound and important problem the ARC faces is its real disconnection from researchers. The social licence the ARC should have can only be provided by researchers. Even if the Government, university executives and research offices, and the public think the ARC has social licence, the ARC will fail if it does not listen to and work with researchers. It's this disconnection, I think, that has led to so many of the ARC's problems in the last 5 to 10 years. If more operational funding is required for the ARC to engage much more deeply with researchers themselves – not just peak-body representatives, not just DVCRs and VCs, not just research offices, but everyday practising researchers – then it will be money extremely well spent.

Submission received

07 December 2022

Publishing statement

Yes, I would like my submission to be published but my and/or the organisation's details kept anonymous. Your submission will need to meet government accessibility requirements.