Applies To: Microsoft Office

Copilot in Stream is a feature designed to enhance your video-watching experience within Microsoft 365. It works with Teams meeting recordings played back in Stream, as well as any Stream video stored in OneDrive (work or school) or SharePoint. This feature allows you to:

  • Summarize Videos: Offers a concise overview of the video's main topics.

  • Extract Key Information: Highlights important points and generates meeting notes based on details discussed during a video meeting.

  • Create Action Items: Identifies tasks that need to be addressed.

  • Get Answers: Responds to queries based on video content.

  • Identify Outstanding Issues: Lists any questions raised in the video that remain unanswered.

  • Link Timestamps: Allows you to jump directly to specific sections of the video.

To access these features, select the Prompt Guide within the Stream Web App, choose the desired action (such as Summarize the video), and review the results Copilot generates.

Please note that Copilot in Stream uses video transcripts, so if the video you're watching doesn't have a transcript, you'll need to generate one first. Users with edit permissions for the video file can create transcripts directly in Video Settings. To learn more, see Welcome to Microsoft Copilot in Stream.

Select a heading below for more information.

  • Copilot requires at least 100 words in the video transcript to summarize the content. If the transcript is shorter than that, Copilot will not function.

  • Copilot is only able to read the first transcript generated for the video. In a case where multiple transcripts or translations have been added to the video, Copilot will default to the original transcript.

  • Copilot does not generate any new content or edit the video. It simply answers questions based on the existing video transcript.
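If you want to verify ahead of time that a transcript meets the 100-word minimum mentioned above, a quick local check is possible. The sketch below is a hypothetical helper (it is not part of Stream or any Microsoft API); it assumes the transcript was downloaded in WebVTT format and simply counts the spoken words, skipping headers, cue numbers, and timestamp lines.

```python
def transcript_word_count(vtt_text: str) -> int:
    """Count spoken words in a WebVTT transcript, ignoring the
    WEBVTT header, cue identifiers, and timestamp lines."""
    words = 0
    for line in vtt_text.splitlines():
        line = line.strip()
        # Skip blanks, the header, timestamp lines such as
        # "00:00:01.000 --> 00:00:04.000", and numeric cue IDs.
        if not line or line == "WEBVTT" or "-->" in line or line.isdigit():
            continue
        words += len(line.split())
    return words

def meets_copilot_minimum(vtt_text: str, minimum: int = 100) -> bool:
    """Check the 100-word minimum Copilot in Stream needs to summarize."""
    return transcript_word_count(vtt_text) >= minimum
```

For example, a short two-cue transcript with only a dozen words would return False from `meets_copilot_minimum`, signaling that Copilot would decline to summarize that video.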

Copilot in Stream cannot be customized, but users can choose what questions to ask based on their needs.

Copilot in Stream will respond best when you do the following: 

  • Stay on topic: Only ask questions related to the video's content, as Copilot relies on the transcript to provide answers.

  • Language support: While Copilot supports multiple languages, for best results, use English.

  • Transcript accuracy: Ensure the transcript's language matches the video. If not, generate a new one in the correct language under Video Settings.

Copilot was evaluated through extensive manual and automatic testing on top of Microsoft internal usage and public data. Additional evaluation was performed over custom datasets for offensive and malicious prompts (user questions, directions, or instructions) and responses. Copilot is also continuously evaluated through online user feedback.

Providing feedback via the thumbs-up and thumbs-down icons will aid in teaching Copilot which responses are and are not helpful to you as a user. We use this feedback to improve Copilot, just like we use customer feedback to improve other Microsoft 365 apps and services. We don't, however, use this feedback to train the foundation models used by Copilot. See Manage Microsoft feedback for your organization for more information.

Copilot supports a wide range of languages. However, please note that there may be some app-specific variations. For the latest information on language support across all Copilot apps, see Microsoft Copilot supported languages.

Copilot has been reviewed by our Responsible AI (RAI) team. We follow RAI principles and have implemented the following:

Responsible AI handling pipeline to mitigate risks such as harmful or inappropriate content, incitement or encouragement of harmful actions, stereotyping, demeaning, or over- and underrepresentation of social groups, prompt injection ("jailbreak"), and cross-domain prompt injection attacks.

In-product user feedback where users can report offensive content back to Microsoft.

These safeguards are designed to promote effective and ethical use of Copilot's capabilities.

Microsoft has been on a responsible AI journey since 2017, when we defined our principles and approach to ensuring this technology is used in a way that is driven by ethical principles that put people first. Read more about our responsible AI journey, the ethical principles that guide us, and the tooling and capabilities we've created to assure that we develop AI technology responsibly.

The content generated by Copilot is based on a wide array of language data and is designed to be coherent and contextually relevant. However, it may not always be entirely original, as it draws from common language patterns and existing information. It is a tool meant to assist and provide a foundation for content creation, but the final content should always be reviewed and verified for originality and accuracy.

Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.

For more information about privacy, see the following resources:

Microsoft 365 Copilot generates information and explains it in a fluent, grammatically correct way, but the content it generates can be inaccurate or inappropriate. It can't understand meaning or evaluate accuracy, so be sure to read over what it writes, and use your judgment.

As with any AI-generated content, it's a great start, but we know that the decisions you make have a big impact. This makes it critical for you to review, edit, and verify anything Microsoft 365 Copilot creates for you.

Can you delete your Copilot interaction history? Yes. Navigate to myaccount.microsoft.com, go to Settings & Privacy, select the Privacy tab, and then Copilot interaction history. There will be a button to clear your history. For details on this feature, see Delete your Microsoft Copilot interaction history. Copilot data will also be removed across all other Microsoft 365 applications.

Copilot includes filters to block offensive language in the prompts and to avoid synthesizing suggestions in sensitive contexts. We continue to work on improving the filter system to more intelligently detect and remove offensive outputs. If you see offensive outputs, please submit feedback using the thumbs-up/thumbs-down icons so that we can improve our safeguards. Microsoft takes this challenge very seriously and we are committed to addressing it.
