Confidentiality obligations of assessors
MEDIA STATEMENT
30 June 2023
Like many organisations, the Australian Research Council (ARC) is considering a range of issues arising from the use of generative artificial intelligence (AI) tools, such as ChatGPT, which use algorithms to create new content and may present confidentiality and security challenges for research and for grant program administration.
While we undertake this work, we would like to remind all peer reviewers of their obligations to ensure the confidentiality of information received as part of National Competitive Grants Program processes.
The Australian Code for the Responsible Conduct of Research (2018) sets out that individuals are to participate in peer review in a way that is fair, rigorous and timely, and that maintains the confidentiality of the content. If there are concerns about how confidentiality has been managed during a review, the ARC has a robust process for addressing them. Further information can be found in the ARC Research Integrity Policy.
Releasing material that is not your own outside of the closed Research Management System, including into generative AI tools, may constitute a breach of confidentiality. As such, the ARC advises that peer reviewers should not use generative AI tools as part of their assessment activities.
The ARC will be updating its guidance in this area in the near future.