AI literacy for researchers

How does AI relate to core academic values?

Code of Conduct for Research Integrity

Chapter by: Steven Trooster
3 min.

The Netherlands Code of Conduct for Research Integrity spells out 44 standards for good research practices. A selection of them is especially relevant for the use of (AI) technologies in research. We explicate these here and formulate best practices, organised according to phases of the research process.

All research phases

Standard 1. Comply with relevant national and international laws and legal frameworks.

Good practice: The researcher is knowledgeable of the relevant regulations and only uses AI in ways that comply with them. Of particular interest are privacy and copyright: information should only be uploaded into AI systems if privacy and copyright are respected, and researchers should ensure that any AI systems they use comply with relevant standards. If a researcher lacks the knowledge to assess such aspects, they are advised to consult the information and support structure of Radboud University (see section 2 for resources).

Standard 2. Only accept research tasks that fall within one’s domain of expertise.

Good practice: The researcher possesses sufficient expertise to use AI in research (see section 1). This includes sufficient knowledge of the possibilities and limitations of AI in the specific research context.

Standard 4. Where possible and desirable, follow the principles of open science (e.g. open data, open source, open access, reproducibility).

Good practice: Where possible and desirable, the researcher uses AI technologies that meet relevant open science standards.

Standard 6. Ensure that sources are verifiable.

Good practice: The researcher uses AI technologies that allow verification of original sources.

Standard 7. Refrain from using technologies that hinder compliance with the principles and standards for scientific integrity.

Good practice: The researcher uses only AI technologies that afford compliance with standards for scientific integrity.

Standard 11. Make potential concerns about compliance with the NCCRI open for discussion.

Good practice: The researcher contributes to a culture in which concerns about research integrity can be discussed openly and without repercussions. This includes questions about AI and research integrity.

Design phase

Standard 12. Take the current state of scientific knowledge into account.

Good practice: If AI is used as a part of a research methodology, the researcher takes the current state of knowledge on this methodology and the AI technology into account.

Standard 14. Only make claims about potential results that can be substantiated.

Good practice: In the design phase, the researcher makes realistic claims about potential results that AI use in the research is expected to produce.

Execution phase

Standard 20. Use only technologies whose functionality is known and scientifically validated.4

Good practice: When AI is used as part of a scientific methodology, the researcher makes sure its functionality is known and scientifically validated.

Standard 21. Don’t fabricate data and results and present them as if they were real.

Good practice: If a researcher uses AI to generate synthetic5 data or text, the AI output must be clearly identified as such. (Also see standards 4 and 6.)

Standard 25. Carefully document (raw) data used in the research.

Good practice: A researcher who uses AI technologies that statistically model (raw) data makes sure that the (raw) data are documented. See also standard 4.

Reporting and dissemination

Standard 26. Be transparent about the methods and procedures followed and record them where this is relevant. The argumentation should be clear and the steps in the research should be verifiable and, if applicable, replicable.

4 An exception is when the goal of the research is to develop, test, or validate (new) technologies. In such cases, technologies can of course be the subject matter of research. But keep in mind that other standards may still apply.
5 Synthetic data are artificially generated data. Synthetic data, unlike real data, are not produced by real-world events.

Good practice: The researcher is transparent about the use of AI technologies as part of the research methodology and records this where relevant.

Standard 28. Draw only conclusions that can be justified and make limitations anduncertainties explicit.

Good practice: The researcher who uses AI in their methodology draws only substantiated conclusions and honestly reports any limitations and uncertainties.

Standard 30. Do not plagiarise.

Good practice: The researcher does not use AI technologies in ways that put them at risk of plagiarising.

Evaluation and review

Standard 41. Don't share information you received for evaluation or review with others without explicit permission, not even via software or other indirect means.

Good practice: The researcher treats manuscripts and proposals under review confidentially. Such materials must not be uploaded into AI applications that may leak information.

Communication

Standard 43. In public communication, be honest and clear about the limits of one's expertise.

Good practice: The researcher is honest and clear about the limits of their expertise on AI.