Frequently asked questions about Microsoft 365 Copilot Pages
Note: Microsoft 365 Copilot Pages is available to all Entra ID (work or school) accounts with SharePoint or OneDrive storage, including those without a Microsoft 365 Copilot license. Learn more about Microsoft 365 Copilot licensing and Microsoft 365 Copilot plans. Copilot Pages is not yet available to personal Microsoft accounts, like Microsoft 365 Personal and Family subscribers.
When I create a Copilot page, where is it saved?
A Copilot Page is saved as a .page file in a new user-owned SharePoint Embedded container. Learn more about Copilot Pages storage.
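For administrators who want to inspect this storage programmatically, SharePoint Embedded containers can be enumerated through the Microsoft Graph fileStorage containers API. The sketch below is a minimal illustration under assumptions, not a supported sample: it presumes you already have a Graph access token with the appropriate FileStorageContainer permission, and the container type ID shown is a hypothetical placeholder rather than the actual ID used by Copilot Pages in your tenant.

```typescript
// Minimal sketch (not an official sample): list SharePoint Embedded storage
// containers through the Microsoft Graph fileStorage containers API.
// Assumptions: ACCESS_TOKEN is a valid Graph token with FileStorageContainer
// access, and CONTAINER_TYPE_ID is a hypothetical placeholder for the
// container type you want to inspect.
const ACCESS_TOKEN = "<access-token>";
const CONTAINER_TYPE_ID = "00000000-0000-0000-0000-000000000000"; // placeholder GUID

async function listContainers(): Promise<void> {
  // The containers list endpoint requires filtering by container type.
  const url =
    "https://graph.microsoft.com/v1.0/storage/fileStorage/containers" +
    `?$filter=containerTypeId eq ${CONTAINER_TYPE_ID}`;
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
  });
  if (!response.ok) {
    throw new Error(`Graph request failed with status ${response.status}`);
  }
  const data = await response.json();
  // Each container exposes properties such as id, displayName, and createdDateTime.
  for (const container of data.value ?? []) {
    console.log(container.id, container.displayName);
  }
}

listContainers().catch(console.error);
```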
How can oversharing be prevented with Copilot Pages?
There are a few ways to control oversharing in your organization. Learn more in the Summary of governance, lifecycle, and compliance capabilities for Copilot Pages and Loop experiences.
What’s the difference between Copilot Pages, Microsoft Loop, and Microsoft 365 Copilot Notebooks?
Copilot Pages and Microsoft Loop
Copilot Pages are .page files and share the same capabilities as Loop pages. When you share a page link with others from chat, recipients open the Copilot page in the Loop app.
Copilot Pages and Copilot Notebooks
Copilot Pages is a collaborative, editable canvas in Copilot Chat, letting users turn AI responses into shareable pages for team collaboration.
Copilot Notebooks is an AI-powered notebook that centralizes all relevant resources such as Copilot Chat, files, pages, meeting notes, and links. It provides focused answers tailored to your content and keeps references updated as your project evolves.
To learn more, see Compare Microsoft Loop, Copilot Pages, and Copilot Notebooks.
Where can I learn more about privacy with Copilot?
Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.
For more information about privacy, see the following resources:
- If you're using Microsoft 365 Copilot in your organization (with your work or school account), see Data, Privacy, and Security for Microsoft 365 Copilot.
- If you're using Copilot in Microsoft 365 apps for home (with your personal Microsoft account), see Copilot in Microsoft 365 apps for home: your data and privacy.
Can I trust that the answers are always accurate?
Generative AI features strive to provide accurate and informative responses, based on the data available. However, answers may not always be accurate, as they are generated based on patterns and probabilities in language data. Use your own judgment and double-check the facts before making decisions or taking action based on the responses.
While these features have mitigations in place to avoid sharing unexpected offensive content in results and prevent displaying potentially harmful topics, you may still see unexpected results. We're constantly working to improve our technology to proactively address issues in line with our responsible AI principles.
What should I do if I see inaccurate, harmful, or inappropriate content?
Copilot includes filters to block offensive language in prompts and to avoid synthesizing suggestions in sensitive contexts. We continue to work on improving the filter system to more intelligently detect and remove offensive outputs. If you see offensive outputs, please submit feedback using the thumbs-up/thumbs-down icons so that we can improve our safeguards. Microsoft takes this challenge very seriously and we are committed to addressing it.
How is Copilot Pages evaluated?
Copilot Pages was evaluated through extensive manual and automatic testing on top of Microsoft internal usage and public data. Additional evaluation was performed over custom datasets of offensive and malicious prompts (user questions) and responses. In addition, Copilot Pages is continuously evaluated with user online feedback. To learn more, see the Transparency Note for Microsoft 365 Copilot.
What operational factors and settings allow for effective and responsible use of Copilot?
Copilot Pages has been reviewed by our Responsible AI (RAI) team. We follow RAI principles and have implemented:
- A responsible AI handling pipeline to mitigate risks such as harmful or inappropriate content.
- In-product user feedback that lets users report offensive content back to Microsoft.
For more information, please review the Microsoft Responsible AI Transparency Report and our commitment to ISO/IEC 42001:2023 Artificial Intelligence Management System Standards.
Where can I provide feedback for Copilot Pages?
In Copilot Pages, select Settings and more, and then choose the feedback option.
More ways to work with Copilot Pages
Get started with Microsoft 365 Copilot Pages
Add Microsoft 365 Copilot Chat responses to multiple Copilot Pages
Draft content with Microsoft 365 Copilot Chat and Copilot Pages