Note: This article refers to privacy and safety topics related to Microsoft Copilot. It does not apply to the use of Microsoft 365 Copilot when signed in with Entra ID. For more information, read about Enterprise Data Protection.
Conversation History
Are my Copilot conversations saved?
By default, the conversations you have with Copilot are saved, and you can view and access past conversations. For example, if you’ve created a detailed trip itinerary with Copilot, the conversation you have will be saved. You can return to and reference that conversation by accessing your conversation history in the Copilot app.
How will my Copilot conversations be used?
Your privacy and trust are paramount to us. Microsoft will only use your conversations for the limited purposes explained in the Microsoft Privacy Statement: to troubleshoot problems, diagnose bugs, prevent abuse, to monitor, analyze, and improve performance, and to provide Copilot to you.
We give you the power to decide if you want us to use your conversations for other purposes. You control whether we can use your conversations to:
- Personalize your experience with Copilot and provide you with a more tailored and useful experience that meets your needs. You can disable personalization at any time.
- Train our generative AI models to create a better experience for you and others. You can opt out of use of your conversations for model training at any time.
How long are my Copilot conversations saved for?
By default, we store conversations for 18 months. You can delete individual conversations or your entire conversation history at any time.
Can I delete my past conversations?
Yes, you can delete past conversations in Copilot.
You can delete individual conversations within your conversation history or delete your entire conversation history.
Will my Copilot conversations become visible to other users?
No, nothing you say to Copilot will be made public. Your conversations and data will never be shared with other users.
How will my privacy be protected when I share a file with Copilot?
If you share a file with Copilot (for example, uploading a document and asking Copilot to summarize it), the file will be stored securely for up to 30 days and then automatically deleted. Regardless of your privacy settings, we do not train our Copilot generative models on your uploaded files. Any conversations you have about the file are treated like any other conversation: they are subject to your choices about model training and personalization, and you can delete them at any time.
How should I engage with Copilot on confidential or sensitive topics?
You shouldn’t provide any confidential or sensitive personal data that you would not want Microsoft to use for purposes explained in this FAQ and in the Microsoft Privacy Statement. This includes data that might reveal, for example, your race, religion, sexual orientation, or health status.
At any time, you can delete a prior Copilot conversation from your conversation history. You can also control whether Copilot uses your conversations for model training or to personalize your experience.
What if I have questions about privacy or data protection?
To learn more about Microsoft’s commitment to privacy, visit the Microsoft Privacy Statement and Privacy at Microsoft.
Personalization & Memory
How does Copilot personalize my experience?
If personalization is enabled for you, Copilot will start remembering key details from your conversations and will use those memories in future conversations to provide you with a more tailored and relevant experience.
What does Copilot remember about me?
If personalization is enabled, Copilot remembers key details you share, such as your name, interests, and goals. Copilot does not remember demographic or other sensitive data.
You can edit or delete what Copilot remembers or turn off personalization entirely at any time.
How can I see, change, or delete what Copilot remembers about me?
You can control what Copilot remembers about you at any time.
Is personalization on by default?
If personalization is available to you, it will be on by default. Personalization is not currently available for users in Brazil, China (excluding Hong Kong), Israel, Nigeria, South Korea, and Vietnam, or for users who are not signed in to Copilot via their Microsoft Account or other third-party authentication.
When you first begin using Copilot you will see a notice at the start of your first few conversations, and periodically thereafter, if personalization is enabled for you. To check or change your personalization settings, click the link in that notice to opt out, or disable personalization in your Copilot settings.
How can I turn off personalization?
You can turn off personalization at any time. Read about how to control personalization in Copilot.
Please note that these options are only available to users who are signed in, as personalization is disabled for unauthenticated users.
If you turn off personalization, Copilot will forget its memories of your conversations. You can still view your past conversations; however, your future Copilot experience will no longer be personalized. If you later turn personalization on again, Copilot will begin remembering details that you’ve shared with it and will begin personalizing your experience again.
Does the Copilot personalization setting control personalized advertising?
The personalization setting in Copilot does not control whether you receive personalized ads, which is a separate choice you can make. You can change your personalized ads setting, which applies to Copilot and other applicable Microsoft services, at any time.
In certain countries, we may also present you with a prompt that will ask you to decide whether to receive personalized ads in Copilot. If you have chosen not to receive personalized ads, through either the personalized ads setting or the Copilot-specific prompt, we will not deliver personalized ads to you in Copilot.
If your settings allow us to deliver personalized ads in Copilot, and if Copilot’s personalization setting is enabled, we will use your Copilot conversation history to help further personalize the ads you already receive. Note that regardless of your settings, Copilot does not serve personalized advertising to authenticated users under the age of 18.
If I turn off personalization, does that delete my conversation history?
You can turn off personalization within Copilot settings at any time. Turning off personalization won’t delete your conversation history, but Copilot will forget its memories of your conversations and will stop personalizing your experience.
Does Copilot personalize interactions based on sensitive topics?
Regardless of your user settings, Copilot is designed to never personalize interactions with you based on certain sensitive topics like your personal attributes (e.g. age, gender, race/ethnicity, sexual orientation), health-related information, and political affiliation and preferences. This protects your privacy and prevents the use of potentially sensitive information. We have measures in place to filter out content from your past conversations that may be considered sensitive, even if personalization is turned on in your settings.
However, please use caution when sharing information with Copilot that you consider to be sensitive. At any time you can control what Copilot remembers about you.
Model Training
What is “model training”?
Generative AI refers to a category of AI models that analyze data, find patterns and use these patterns to generate or create a new output, such as text, photo, video, code, data, and more. “Training” a generative AI model means providing it with information to help it learn to make predictions or decisions. Training is a broad concept that includes many different activities to help models provide more appropriate results.
These models use training data to learn general relationships in language, not to memorize specific conversations. They do not store or have access to the original training data. Instead, generative AI models are designed to generate new expressive works and content.
We also take additional steps to prevent these models from inadvertently reproducing their training data, such as conducting testing and building filters that screen out previously published or used material.
What external data does Microsoft use for training Copilot?
Microsoft uses publicly available data, mostly collected from industry-standard machine learning datasets and web crawls similar to those used by search engines. We exclude sources with paywalls, content that violates our policies, and sites that have used industry-standard methods to opt out of training. In addition, we do not train on data from domains listed in the Office of the United States Trade Representative (USTR) Notorious Markets for Counterfeiting and Piracy list.
Does Microsoft use my data to train AI models?
Except for certain categories of users or users who have opted out, Microsoft uses data from Bing, MSN, Copilot, and interactions with ads on Microsoft for AI training. This includes de-identified search and news data, interactions with ads, and your voice and text conversations with Copilot. This data will be used to improve Copilot and our other products and services to create a better user experience for you and others.
By using real-world consumer data to help train our underlying generative AI models, we can improve Copilot and offer a more reliable and relevant experience. The more diversity in conversations our AI models are exposed to, the better they will understand and serve important regional languages, geographies, cultural references, and trending topics of interest to you and other users.
You can also control whether Copilot uses your conversations for model training.
What data is excluded from model training?
We do not train Copilot on data from the following types of users:
- Users signed into Copilot with an organizational Entra ID account. You can learn more about Enterprise Data Protection.
- Users of Copilot within Microsoft 365 apps with Personal or Family subscriptions.
- Users who are not signed into Copilot (either using a Microsoft Account or other third-party authentication).
- Users under the age of 18 who are signed in to Copilot.
- Users who have opted out of model training.
- Users in Brazil, China (excluding Hong Kong), Israel, Nigeria, South Korea, and Vietnam. This means that Copilot will be available in some of those markets, but no user data will be used for generative AI model training in those locations until further notice.
We also limit the types of data we use for training. We do not train AI models on:
- Personal account data like your Microsoft account profile data or email contents.
- The contents of files you upload to Copilot, although any conversations you have with Copilot about the file will be handled just like any other conversation (and subject to your choice about whether to permit model training on your conversations).
- Identifying information in uploaded images. If any images are included in your Copilot conversations, we take steps to de-identify them, such as removing metadata or other personal data and blurring images of faces.
- Information that may identify you, like names, phone numbers, device or account identifiers, sensitive personal data, physical addresses, and email addresses.
How does Microsoft protect my data when training AI models?
Your personal interactions with our services are kept private and are not disclosed without your permission. We remove information that may identify you, like names, phone numbers, device or account identifiers, sensitive personal data, physical addresses, and email addresses, before training AI models.
Your data remains private when using our services. We will protect your personal data as explained in the Microsoft Privacy Statement and in compliance with privacy laws around the world.
How can I control whether my data is used for model training?
If you are logged into Copilot with a Microsoft Account or other third-party authentication, you can control whether your conversations are used for training the generative AI models used in Copilot. Opting out will exclude your past, present, and future conversations from being used for training these AI models, unless you choose to opt back in. If you opt out, that change will be reflected throughout our systems within 30 days.
If you are not logged into Copilot with a Microsoft Account or other third-party authentication, we do not train Copilot on your conversations.
Read how to control whether your conversations are used for model training.
We may eventually expand model training and opt-out controls to users in certain countries where we do not currently use conversation history for model training (see What data is excluded from model training?). But we will do so gradually, to ensure we get this right for consumers and to ensure we comply with local privacy laws around the world.
Can I opt out of model training and still have a personalized Copilot experience?
Yes, you can opt out of model training and still have personalization turned on. In this case, Copilot will remember key details from your conversations to give you a more personalized response, but Microsoft will not use your conversations for training Copilot’s generative AI models.
Does model training apply to Copilot Pro or Microsoft 365?
AI training applies to consumer Copilot offerings, including Copilot Pro. It excludes users of Copilot with organizational Entra ID accounts, as well as Microsoft 365 consumer users and Copilot conversations integrated within Microsoft 365 consumer apps such as Word, Excel, PowerPoint, or Outlook. Users of those products will not see this setting, and their conversations will not be used for training the generative AI models we offer in Copilot or other products.
We recognize that commercial customers have varying compliance requirements across industries and around the world. Microsoft will continue helping these organizations use tenant boundaries and other controls we provide to identify and manage data they own, separately. To learn more about how Microsoft handles model training for commercial customers, read about Enterprise Data Protection.
Are Copilot conversations human reviewed?
Some Copilot conversations are subject to both automated and human review for product improvement and digital safety purposes. We may also review conversations flagged as violating the Code of Conduct in the Terms of Use, which prohibits use of the Copilot service to create or share inappropriate content or material.
Can I opt out of having my Copilot conversations human reviewed?
Limited human review is required as part of the investigation process when a violation of the Code of Conduct is suspected. To ensure that our services are safe and secure for everyone, an opt-out of human review is not available.
Responsible AI
How is Microsoft approaching responsible AI for Copilot?
At Microsoft, we take our commitment to responsible AI seriously. Copilot is developed in accordance with our AI principles, which reflect our commitment to making sure AI systems are developed responsibly and in ways that warrant people’s trust. We’ve designed the Copilot user experience to keep humans at the center and developed safeguards to minimize errors and avoid misuse, and we are continually improving the experience. To learn more about how to use Copilot responsibly, please review our Terms of Use and Transparency Note.
How does Copilot use the internet to find its sources of information?
Copilot searches for relevant content across the web and then summarizes the information it finds to generate a helpful response. It also cites its sources, so you’re able to see and explore links to the content it references.
Are Copilot’s responses always accurate?
Copilot aims to respond with reliable sources, but AI can make mistakes, and third-party content on the internet may not always be accurate or reliable. Copilot may misrepresent the information it finds, and you may see responses that sound convincing but are incomplete, inaccurate, or inappropriate. Use your judgment and double check facts before making decisions or taking action based on Copilot’s responses. Reviewing Copilot’s citations is a good place to start checking for accuracy.
To share site feedback or report a concern, select ‘Give feedback’ in Settings or use the flag icon below each response in the mobile app and Copilot web page. In the Copilot app, you can also long press the response and select ‘Report’. We will continue to review your feedback to provide a safe search experience for all.
What sort of safeguards does Copilot have to make sure content is safe?
Responsible AI is a journey, and we are continually evaluating and improving Copilot to make the experience even safer. We apply rigorous content filtering to the information we use to train Copilot, and we employ measures to evaluate Copilot’s responses for potential safety risks before the response is presented to you. We also have systems in place to detect and prevent abusive behavior.
Additionally, to improve Copilot’s safety and ability to detect risks, we partner with external research organizations to review and evaluate Copilot conversations. For example, external research organizations can help review Copilot conversation logs to understand the variety of queries used to seek extremist content, compare trends across the industry, and advise techniques for better finding and mitigating harm.
You can learn more about our approach to safety in our Transparency Note.
What should I do if I see unexpected or offensive content?
While we have designed the Copilot experience to avoid sharing unexpected offensive content or engaging with potentially harmful topics, you may still see unexpected results. We’re constantly working to improve our technology to prevent harmful content.
If you encounter harmful or inappropriate content, please provide feedback or report a concern by selecting ‘Give feedback’ in Settings or using the flag icon below each response in the mobile app and Copilot web page. In the Copilot mobile app, you can also long press the response and select ‘Report’. We will continue to review your feedback to provide a safe experience.
How does human review of Copilot conversations improve your experience?
Trained AI experts may review Copilot conversations to build, evaluate, and improve the accuracy and safety of our AI models. We use human feedback to ensure quality output to a user’s prompt, improving your experience. Your data remains private when using our services. We will protect your personal data as explained in the Microsoft Privacy Statement and in compliance with privacy laws around the world.