
AI use in Research

A guide that addresses general issues related to the use of artificial intelligence (AI) in research.

Updated: December 2025.

The breakthrough in artificial intelligence and the rapid development of services present both possibilities and challenges. We must be mindful that AI has its limitations in comprehending deep human concepts that are often closely linked to the arts, such as understanding, consciousness, mortality, free will, meaning, empathy, intuition, creativity, culture, aesthetics, ethics, and philosophy. While AI mirrors our reflections in intelligent ways, we must always reflect independently and critically on our use of it.

We also want to note that AI tools related to music production and composition are the subject of ongoing discussion, and these guidelines are subject to future updates.

AI guideline for research

This is a general guide that addresses common issues related to the use of AI in research.

  • The guidelines are not exhaustive. Every research project is unique in its ethical considerations, and this guide serves only as a general recommendation.
  • The guidelines apply to, but are not limited to, academic staff and external personnel working on behalf of NMH. They will undergo revisions in dialogue with staff and faculty.
  • The guide does not take a position on AI tools related to music production and composition. There are plans for NMH guidelines on AI tools regarding music production and artistic research.
  • The guide does not include guidelines concerning AI tools used by practitioners within the field of the 'Norwegian model of artistic research' (kunstnerisk utviklingsarbeid), where creating and performing art are at the core. The Norwegian artistic research school (Nasjonal forskerskole i kunstnerisk utviklingsarbeid) is collaborating with partner institutions to develop AI guidelines, which are expected to be accessible in autumn 2025.

What is Artificial Intelligence?

At NMH, several tools are already in use, especially as aids for writing and feedback on drafts.

Generative Artificial Intelligence is a subfield of Artificial Intelligence that refers to models that can produce text, images, videos, sound, or other data given suitable input.

To make this possible, these models are trained on vast amounts of data. During training, the model identifies and learns relationships within the data it is trained on. Using this acquired understanding, the trained model can produce new data in response to new input.
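To make the ideas of "learning relationships from training data" and "producing new data given new input" concrete, the toy sketch below trains a tiny character-level Markov chain on a short invented corpus and then generates new text from it. This is only an illustration of the general principle: modern generative AI systems use neural networks trained on vastly larger datasets, and the corpus, seed, and parameter values here are invented for the example.

    # Toy example only: a tiny character-level Markov chain generator.
    # It "trains" by recording which character tends to follow each short context,
    # then "generates" new text by sampling from those recorded continuations.
    import random
    from collections import defaultdict

    def train(text, order=3):
        # Map each context of `order` characters to the characters seen after it.
        model = defaultdict(list)
        for i in range(len(text) - order):
            context = text[i:i + order]
            model[context].append(text[i + order])
        return model

    def generate(model, seed, length=80, order=3):
        # Extend the seed one character at a time, sampling from learned continuations.
        out = seed
        for _ in range(length):
            choices = model.get(out[-order:])
            if not choices:  # context never seen during training
                break
            out += random.choice(choices)
        return out

    if __name__ == "__main__":
        corpus = ("the use of artificial intelligence in research "
                  "presents both possibilities and challenges. ") * 20
        model = train(corpus)
        print(generate(model, seed="the"))

The output is a short, rather repetitive string stitched together from patterns found in the training text; the poor quality of such a tiny model is also a reminder of why output from far larger models still requires critical, independent evaluation.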

Responsible use of AI models

Recommended AI tools

Examples of other AI tools

Note that NMH may not have institutional access to all services listed below.

Examples of AI use

Citing AI Tools

AI systems are neither authors nor co-authors. Software must be cited whenever it has an impact on the research outcome. We advise NMH researchers to be mindful of this when presenting and publishing their work.

AI Use in Academic Publishing

Knowing the general guidelines on the use of AI is helpful when working with a journal, whether as an author, reviewer, or editor. If the journal or publisher has updated guidelines on the use of AI, consult those documents, as they are likely to reflect the specific concerns of that journal or publisher.

Only humans can assume authorship of academic articles. AI tools cannot take responsibility for research output. Therefore, the traditional academic and ethical responsibilities in research and reporting lie with the human author(s).

The publisher Taylor & Francis, with whom NMH currently has a publishing agreement, sets out what authorship means in its AI policy: AI Policy - Taylor & Francis.

Some journals require authors to disclose the use of AI in a separate form. Some also request that substantial usage of AI in any phase of research be disclosed in the manuscript. This way, AI usage can be taken into consideration in the peer-review process.

Dos and don'ts: practical guidelines

Below are practical recommendations for the responsible use of AI tools in research and publishing.

Do:

  • follow the journal’s guidelines, or request advice from editors if none can be found
  • critically evaluate any output of AI tools that you use in all phases of research and reporting, and investigate the data use policies of the tools you wish to use
  • be transparent and document how AI tools were used in all phases of research and reporting
  • as a reviewer for journals: follow publisher policies, such as those of Taylor and Francis, that prohibit entering text written by others into any form of generative AI tool for review purposes
  • as an editor for journals: note that some journals suggest that using AI tools to translate abstracts of already published articles is an acceptable use
  • submit the NMH AI-declaration form if needed

Don’t:

  • do not submit your own or other authors' unpublished work to AI tools that do not meet strict requirements on data privacy and confidentiality
  • as academic staff, do not use AI tools to make academic assessments when reviewing student papers

Recommended resources