
Note: A Microsoft 365 Copilot license is required to use Copilot Notebooks, which is currently rolling out. Accounts must also have a SharePoint or OneDrive license (service plan) to create notebooks. Copilot Notebooks isn't available for personal Microsoft accounts, such as those used with Microsoft 365 Personal and Family subscriptions. Learn more about Microsoft 365 Copilot licensing and Microsoft 365 Copilot plans.

Why do my Loop shared pages and search results open in Copilot Pages?

Copilot Pages and Loop's My workspace share the same storage. Pages shared from Loop workspaces (like My workspace or other shared spaces) open in Copilot Pages when you follow a share link or open a search result. If you prefer, you can still get to those pages from Loop.

What’s the difference between Copilot Notebooks and OneNote?

Copilot Notebooks are AI-powered spaces designed to gather references and selectively ground Copilot on them, so you can use Copilot to answer questions or create context-aware content.

OneNote notebooks focus on collecting and organizing notes, ink, and other information that you can share with others. They serve as long-term knowledge repositories. 

To learn more about how Copilot Notebooks and OneNote compare, see Compare Microsoft 365 Copilot Notebooks and Microsoft OneNote.

Does Copilot Notebooks use my data to train the large language model?

Copilot Notebooks doesn't use user data to train the large language model.

Where is Copilot-generated data stored for Copilot Notebooks?

Copilot generates Copilot Pages (.page files) and podcasts, both of which are stored inside the notebook in which they’re created.

Where can I learn more about privacy with Copilot?

Microsoft 365 Copilot is built on Microsoft's comprehensive approach to security, compliance, and privacy.

For more information, refer to the following: 

  1. If you're using Microsoft 365 Copilot in your organization (with your work or school account), see Data, Privacy, and Security for Microsoft 365 Copilot.

  2. If you're using Microsoft 365 apps at home as part of Copilot Pro (with your personal Microsoft account), see Copilot Pro: Microsoft 365 apps and your privacy.

Can I trust that the answers are always accurate?

Generative AI features strive to provide accurate and informative responses based on the data available. However, answers may not always be accurate because they're generated from patterns and probabilities in language data. Use your own judgment and double-check the facts before making decisions or taking action based on the responses.

While these features include mitigations to avoid surfacing offensive content or potentially harmful topics in results, you may still see unexpected results. We're constantly working to improve our technology to proactively address issues in line with our responsible AI principles.

What should I do if I see inaccurate, harmful, or inappropriate content?  

Copilot includes filters to block offensive language in prompts and to avoid generating suggestions in sensitive contexts. We continue to improve the filter system to more intelligently detect and remove offensive outputs. If you see offensive output, please submit feedback using the thumbs-up/thumbs-down icons so that we can improve our safeguards. Microsoft takes this challenge very seriously, and we are committed to addressing it.

How is Copilot Notebooks evaluated?

Microsoft 365 Copilot Notebooks was evaluated through extensive manual and automated testing on top of Microsoft internal usage and public data. Additional evaluation was performed over custom datasets of offensive and malicious prompts (user questions) and responses. Copilot Notebooks is also continuously evaluated through online user feedback. For more information, see the Transparency Note for Microsoft 365 Copilot.

What operational factors and settings allow for effective and responsible use of Copilot?

Copilot Notebooks has been reviewed by our Responsible AI (RAI) team. We follow RAI principles and have implemented:  

  1. A responsible AI handling pipeline to mitigate risks like harmful or inappropriate content.

  2. In-product user feedback that lets users report offensive content back to Microsoft.

For more information, please review the Microsoft Responsible AI Transparency Report and our commitment to the ISO/IEC 42001:2023 Artificial Intelligence Management System standard.

More ways to work with Copilot Notebooks

Get started with Microsoft 365 Copilot Notebooks

How Microsoft 365 Copilot Notebooks works

Add reference files to your Microsoft 365 Copilot Notebooks

Get answers and insights about your notebook
