How to disclose Microsoft Copilot use in academic writing
A practical guide for researchers who use Microsoft Copilot and need to disclose that use clearly in papers, theses, and journal submissions.
Microsoft Copilot is easy to use and easy to forget
That is the problem.
Copilot sits inside tools that many researchers already use. You open Word, PowerPoint, Outlook, Teams, Excel, or Edge, ask for help, and move on. Weeks later, when you prepare a submission, nobody remembers what came from you and what came from the tool.
That is how weak disclosures happen.
If Microsoft Copilot helped you draft text, rewrite prose, summarize notes, extract themes, suggest code, or build slides tied to your research, you should record that use. Current journal guidance backs that up. The ICMJE says that authors should disclose AI-assisted technology use at submission, describe it in the cover letter and in the submitted work when appropriate, and keep humans responsible for the final content. It also says that AI tools cannot be authors. (icmje.org)
Elsevier allows generative AI in manuscript preparation before submission, but asks authors to disclose that use in a separate section placed before the references and to follow the journal's Guide for Authors. Elsevier also draws a line between broader AI assistance and basic grammar, spelling, or reference-checking tools. (elsevier.com)
This guide shows you how to disclose Microsoft Copilot use in a way that editors can follow and coauthors can approve.
If you want a clean record instead of a vague sentence in the acknowledgments, generate an AI Usage Card while the work is still fresh.
What counts as Microsoft Copilot use
Many authors think of Copilot as "just editing help." Sometimes that is true. Sometimes it is doing far more.
You should document Copilot use when it helped with tasks like these:
- drafting or rewriting manuscript text
- summarizing articles, notes, transcripts, or meeting records
- generating outlines, tables, or slide content
- suggesting code, formulas, or analysis steps
- extracting themes from qualitative material
- preparing figures or image prompts
- drafting grant, thesis, abstract, or cover-letter text tied to the paper
Elsevier's policy helps you draw the line. If you used Copilot like a spellchecker or grammar checker, a formal declaration may not be needed. If you used it to generate, transform, summarize, or reorganize content, you should disclose it. (elsevier.com)
If you are unsure, use a simple test: did Copilot change the intellectual presentation of the work? If yes, write it down.
For a broader overview of disclosure duties across tools, read Do I Need to Disclose AI Usage in My Paper?.
Why tool-specific disclosure matters
"AI was used" tells an editor almost nothing.
A useful disclosure names the tool, the task, the input material, and the human review that followed. The ICMJE says that authors should carefully review and edit AI-generated content because it can be wrong, incomplete, or biased. It also says that humans must ensure originality, attribution, and the absence of plagiarism in text and images. (icmje.org)
Tool-specific disclosure matters for another reason. "Microsoft Copilot" is not one single thing. You might have used Microsoft 365 Copilot in Word, Microsoft 365 Copilot Chat, Copilot in PowerPoint, Copilot in Edge, or GitHub Copilot. Those tools do different jobs and may pull from different sources.
Microsoft says that Microsoft 365 Copilot Chat for work or school accounts provides enterprise data protection, that prompts and responses are not used to train foundation models, and that chat data is logged. Microsoft also says that Copilot can use work content such as files, emails, chats, meetings, and sites through Microsoft Graph when your account has permission to access that content. (support.microsoft.com)
That means your disclosure should be as specific as you can make it. "Microsoft Copilot" is acceptable. "Microsoft 365 Copilot in Word with an institutional account" is better. "GitHub Copilot for code suggestions in Python data-cleaning scripts" is better still.
The four questions every Copilot disclosure should answer
Keep the statement short. Make it answer four questions.
1. What tool did you use?
Name the product as precisely as you can.
Examples:
- Microsoft 365 Copilot in Word
- Microsoft 365 Copilot Chat
- Microsoft Copilot in PowerPoint
- Copilot in Edge
- GitHub Copilot
Do not guess. Check the app, your account page, or your institutional license if you are unsure.
2. What did you use it for?
State the task in plain language.
Good examples:
- rewriting paragraphs for clarity
- summarizing interview notes into draft memos
- suggesting a slide outline for a lab presentation
- generating starter code for data cleaning
- extracting action items from Teams meeting transcripts
Bad example:
- used for productivity
That tells nobody anything.
3. What material did you give it?
This part often decides whether a use case feels routine or risky.
Record whether you gave Copilot:
- published sources only
- your own draft text
- de-identified research material
- code
- confidential meeting transcripts
- peer review material
- sensitive or restricted data
Microsoft says that work or school accounts can give Copilot access to organizational content that your permissions allow. (support.microsoft.com) If you pasted unpublished text, participant material, or institutional records into the tool, keep that in your internal record even if your published statement stays brief.
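If your manuscript lives in a LaTeX repository, one lightweight option for that internal record is a comment block at the top of the source file. The sketch below is a hypothetical illustration, not required wording:
% Internal AI-use record. Not part of the submitted manuscript; all details are illustrative.
% 2025-02-03, first author, Microsoft 365 Copilot Chat (institutional account).
% Input: de-identified interview summaries; no restricted or identifiable material shared.
% Output: draft thematic summary, rewritten by hand before it entered the methods draft.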
This matters even more in peer review. Nature Methods said on February 11, 2026, that uploading manuscripts into generative AI tools during peer review is not allowed because it can compromise confidentiality. ICMJE also says that journals should prohibit uploading manuscripts to AI technologies when confidentiality cannot be assured, unless authors permit it. (nature.com)
4. What human review did you do?
This is the sentence that shows accountability.
State that you reviewed, edited, and verified all AI-assisted output before using it. If Copilot suggested code, say that you tested and validated it. If it summarized notes, say that you checked the summary against the original record. If it rewrote text, say that you revised the final wording yourself.
ICMJE puts that burden on human authors, not on the tool. (icmje.org)
A practical disclosure template for Microsoft Copilot
Use this when Copilot helped with writing support but did not shape the research itself:
\section*{AI use disclosure}
The authors used Microsoft 365 Copilot in Word during manuscript preparation to
suggest edits for clarity, shorten selected paragraphs, and generate outline options
for the introduction. The authors reviewed and revised all output and take full
responsibility for the accuracy, originality, citations, and final wording of the manuscript.
Use this when Copilot also touched research materials:
\section*{AI use disclosure}
The authors used Microsoft 365 Copilot Chat with an institutional account to
summarize de-identified project notes and draft a preliminary outline for the methods
section. The tool was not used to make final analytic decisions. The authors checked
all summaries against the original records, rewrote the final text, and take full
responsibility for the manuscript.
Use this when Copilot suggested code or formulas that affected analysis workflows:
\subsection*{Use of AI tools}
During data processing and manuscript preparation, the research team used GitHub
Copilot and Microsoft Copilot for limited assistance with code suggestions, note
summarization, and language editing. The team reviewed all AI-assisted outputs,
tested code before use, and did not treat AI-generated content as evidence or as a
source of interpretation.
Use this shorter version for an acknowledgments-style statement when the journal asks for one:
\section*{Acknowledgments}
The authors used Microsoft 365 Copilot in Word for limited language editing and
outline generation during manuscript preparation. The authors reviewed and revised
all suggested content and take responsibility for the final manuscript.
If you want more wording options, see AI Usage Cards examples and templates and How to disclose ChatGPT usage in academic papers. Then create a reusable record at ai-cards.org.
Where to put the disclosure
The right location depends on what Copilot did.
If Copilot helped with manuscript writing, many journals accept disclosure in an acknowledgments section, a dedicated AI declaration, or the cover letter. ICMJE says that authors should describe AI use in both the cover letter and the submitted work when appropriate. (icmje.org) Elsevier says that reviewers should find the disclosure in a separate section before the references. (elsevier.com)
If Copilot affected the research process, put the disclosure in the methods section too.
That includes cases where Copilot helped summarize notes, suggest code used in analysis, classify text, or generate outputs that shaped your workflow. Readers should not have to hunt for that information in the back matter.
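If you write in LaTeX, a minimal sketch of that placement could look like the skeleton below. The section heading, the methods wording, and the file names are illustrative; follow your journal's Guide for Authors for the final form.
\section{Methods}
% Note research-facing Copilot use in the relevant subsection, for example:
% "Interview notes were first summarized with Microsoft 365 Copilot Chat;
%  both authors checked each summary against the original records."
\section*{Declaration of generative AI use}
% Full disclosure statement goes here, immediately before the references.
\bibliographystyle{unsrt}
\bibliography{references}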
If you submit to a specific venue, check the venue rules before you settle on final wording. Our guide to AI Disclosure Policies by Major Journals can help you start.
What not to do
Do not list Microsoft Copilot as an author. ICMJE and Nature policy reject that. (icmje.org)
Do not write a foggy statement like "AI tools were used for editorial assistance." That sounds like you are hiding the real task.
Do not claim that AI "verified" facts, "validated" findings, or "ensured" accuracy. You did that work, or you did not.
Do not bury research-related AI use in a cover letter alone. If the tool shaped the study workflow or the manuscript's methods, put that in the paper.
Do not trust memory. By submission time, your team will forget what Copilot touched.
A simple logging rule for labs and coauthor teams
Set one rule now: log AI use while the work is happening.
A lightweight internal log should capture the fields below; one filled-in entry is sketched after the list:
- date
- user
- tool name
- product context, such as Word, Chat, PowerPoint, Edge, or GitHub Copilot
- task
- input type
- whether the output was used, revised, or discarded
- human review steps
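If your team keeps the log next to the manuscript source, a LaTeX description list is one workable format. Every value in the sketch below is hypothetical:
% One illustrative log entry; all values are placeholders.
\begin{description}
  \item[Date] 2025-03-14
  \item[User] First author
  \item[Tool] Microsoft 365 Copilot
  \item[Product context] Word, institutional account
  \item[Task] Shortened two paragraphs in the discussion
  \item[Input type] Authors' own draft text only
  \item[Outcome] Suggestions revised, then accepted in part
  \item[Human review] Final wording rewritten and checked against the cited sources
\end{description}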
That log makes submission easier. It also stops awkward last-minute questions from coauthors.
You can keep this log in a shared spreadsheet, a lab notebook, or an AI Usage Card. I prefer the card because it turns a messy process into a record you can reuse across a paper, thesis chapter, grant report, or conference submission.
Copilot, privacy, and institutional accounts
Researchers often ask a second question: if I used a university account, do I still need to disclose the tool?
Yes.
Your account type changes the privacy analysis, not the disclosure duty.
Microsoft says that Microsoft 365 Copilot Chat with a work or school account provides enterprise data protection and that prompts and responses are not used to train foundation models. It also says that prompts, triggered Bing search queries, and responses are logged. (support.microsoft.com)
That may lower some privacy risk compared with consumer tools. It does not remove the need to tell editors and readers that Copilot helped produce the work.
If your institution gave you access to Copilot, you should also check local policy on student records, research data, and confidential material before you upload anything sensitive. Your disclosure statement can stay short, but your internal documentation should note the account type and the kind of material you shared.
A better disclosure than a one-line acknowledgment
A one-line acknowledgment may satisfy a form. It rarely gives your coauthors or readers enough context.
An AI Usage Card gives you a cleaner record. You can document all of the following; a written-out sketch follows the list:
- the tool
- the task
- the inputs
- the outputs
- the human checks
- the limits you set on AI use
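Written out in the style of the templates above, those six items might read like the sketch below. This is only an illustration of the content, not the generator's output format, and every project detail is hypothetical.
\subsection*{AI usage record (illustrative)}
\begin{itemize}
  \item Tool: Microsoft 365 Copilot in Word and Copilot Chat, institutional account
  \item Task: language editing of the discussion; summarization of de-identified field notes
  \item Inputs: the authors' own draft text and de-identified notes; no restricted material
  \item Outputs: suggested edits and draft summaries, all revised before use
  \item Human checks: each summary compared against the original notes; final wording written by the authors
  \item Limits: not used for analytic decisions, citation generation, or any peer review material
\end{itemize}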
That record helps when one journal wants an acknowledgments note, another wants a methods statement, and your department wants a thesis declaration. You write it once, then adapt it.
If you are writing a thesis, pair this article with How to Disclose AI Usage in Your Thesis. If you work in a field with text-heavy methods, read AI Disclosure for Qualitative Research and AI Disclosure for Social Science Research.
The safest rule
If Microsoft Copilot changed your words, your structure, your summaries, your code, or your presentation of the research, disclose it.
You do not need a dramatic confession. You need a record that another person can read and understand.
Generate that record now with the AI Usage Card generator. Fill in the tool, the task, the inputs, and the review steps. Then copy the result into your manuscript, thesis, or submission materials.
Generate Your AI Usage Report
Create a standardized AI Usage Card for your research paper in minutes. Free and open source.
Create Your AI Usage Card