How to avoid the risks of exposing sensitive information with Copilot

Microsoft 365 Copilot is a powerful tool that can help users write faster and better by providing suggestions and insights based on the context and content of their documents. Copilot uses large language models grounded in the data a user already has access to, such as files, emails, chats, and web pages, to generate relevant, coherent text that can improve productivity and creativity.

However, Copilot also poses some challenges and risks for businesses that lack appropriate governance for their data, especially when it comes to sensitive information. Without proper data classification, protection, and monitoring, Copilot can potentially expose confidential or regulated data to unauthorized users or leak it to external parties. This can result in legal, reputational, and financial damage, as well as loss of trust and compliance.

What are sensitivity labels and why are they important?

Sensitivity labels are a feature of Microsoft 365 that allow companies to classify and protect their data based on its level of sensitivity. Sensitivity labels can be applied to documents, emails, and other types of data, either manually by users or automatically by policies. Sensitivity labels can also enforce encryption, access restrictions, watermarks, and other protection actions on the data, regardless of where it is stored or shared.
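To make the idea concrete, here is a minimal conceptual sketch in Python of how a sensitivity label pairs a classification level with protection actions and an access restriction. The class, level names, and check below are illustrative assumptions, not the Microsoft 365 API.

```python
# Conceptual sketch (not the Microsoft 365 API): a sensitivity label as an
# ordered classification level with protection actions attached.
from dataclasses import dataclass, field

# Hypothetical label order, lowest to highest sensitivity.
LEVELS = ["Public", "General", "Confidential", "Highly Confidential"]

@dataclass
class SensitivityLabel:
    name: str
    encrypt: bool = False         # enforce encryption on the item
    watermark: bool = False       # stamp a visual marking on the item
    allowed_groups: set = field(default_factory=set)  # access restriction

    @property
    def rank(self) -> int:
        return LEVELS.index(self.name)

def can_access(user_groups: set, label: SensitivityLabel) -> bool:
    """A user may open a labeled item only if the label imposes no group
    restriction, or the user belongs to at least one allowed group."""
    return not label.allowed_groups or bool(user_groups & label.allowed_groups)

confidential = SensitivityLabel("Confidential", encrypt=True,
                                allowed_groups={"finance"})
print(can_access({"finance", "staff"}, confidential))  # True
print(can_access({"staff"}, confidential))             # False
```

The key point the sketch captures is that the protection travels with the label, not with the storage location: wherever the item goes, the same check applies.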

Sensitivity labels are important for several reasons. First, they help organizations comply with data protection regulations, such as GDPR, HIPAA, or PCI DSS, by ensuring that sensitive data is handled appropriately and securely. Second, they help prevent data loss or leakage, by preventing unauthorized access, copying, or sharing of sensitive data. Third, they help maintain data quality and integrity, by preventing accidental or malicious modification or deletion of sensitive data.

By applying sensitivity labels to their data, companies can benefit from Microsoft 365 Copilot without compromising their data security or privacy. Copilot honors sensitivity labels: content that a label encrypts or restricts is not summarized or quoted for users who lack the necessary usage rights, and responses inherit the label of the content they draw on. Tenant data is also not used to train the underlying models. This way, organizations can leverage Copilot's capabilities while minimizing the risks of data exposure or misuse.

What are the risks of not having sensitivity labels applied before enabling Copilot?

If companies do not apply sensitivity labels to their data before enabling Copilot, they may face several risks and challenges. Some of these are:

Copilot may index and surface sensitive data that is not protected by sensitivity labels, such as personal information, financial data, health records, trade secrets, or intellectual property. As a result, Copilot may generate text containing sensitive information that the user or the recipient is not authorized to access or share. For example, Copilot might insert a customer's name, address, or credit card number into an email, or a company's confidential strategy or financial report into a document. This can violate data protection regulations, breach confidentiality agreements, or expose competitive advantages.
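Finding such data before Copilot is enabled is largely a pattern-matching problem. The sketch below shows one common pattern, detecting likely credit card numbers with a regular expression plus the Luhn checksum; it is the kind of condition a sensitive-information-type rule might use, not Purview's actual detection engine.

```python
# Illustrative sketch: flag likely credit card numbers in text using a
# regex plus the Luhn checksum. This mimics the kind of sensitive-info
# pattern an auto-labeling rule might match; it is not Purview's engine.
import re

def luhn_valid(digits: str) -> bool:
    """Standard Luhn checksum: double every second digit from the right."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# 13-16 digits, optionally separated by spaces or hyphens.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def find_card_numbers(text: str) -> list:
    hits = []
    for m in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", m.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits

print(find_card_numbers("Card: 4111 1111 1111 1111"))  # ['4111111111111111']
print(find_card_numbers("Order #12345"))               # []
```

A scan like this over a document corpus yields the items that most urgently need a label before Copilot can reach them.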

Traditional SharePoint permissions alone may not be enough to keep content out of Copilot's reach. SharePoint permissions are based on the location of the data and the role of the user, not on the content itself, and in practice they are often broader than anyone intends: sites shared with "Everyone", sharing links that were never cleaned up, inherited access nobody reviews. Copilot works from everything a user can technically open, so it may surface information the user was never meant to see. For example, Copilot might quote a project status, a budget, or a feedback item intended only for a specific team or manager, or a sensitive issue known only to a few people.

Copilot may enable users to discover sensitive information that they could technically open in SharePoint but never knew existed, through prompt engineering. Prompt engineering is the technique of crafting specific queries or prompts to elicit specific responses from Copilot, such as asking for a summary of a document, a list of key points, or a question-and-answer session. If Copilot has indexed sensitive data that is not protected by sensitivity labels, it may reveal that information in its responses even if the user never visited the folder in SharePoint. Overly broad permissions that once went unnoticed become trivially easy to exploit, allowing users to surface sensitive information they were never supposed to see.
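The mechanics of this risk can be sketched as a retrieval step that grounds an assistant's answer. The names below (`Document`, `search_grounding`) are hypothetical, and the policy of excluding all labeled items from grounding is a simplification, but the sketch shows why permission trimming alone does not stop oversharing: the user keeps whatever broad access already exists.

```python
# Minimal sketch of the oversharing problem in a grounding/retrieval step.
# Document and search_grounding are illustrative names, not a real API; the
# "exclude every labeled item" policy is a deliberate simplification.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Document:
    title: str
    site: str
    readers: set           # principals with read access (often overshared)
    label: Optional[str]   # sensitivity label name, if any

def search_grounding(user: str, query: str, corpus: list) -> list:
    """Return titles the assistant may quote: permission-trimmed, with
    labeled content kept out of the prompt regardless of phrasing."""
    q = query.lower()
    return [d.title for d in corpus
            if user in d.readers       # honor item-level permissions
            and d.label is None        # simplified: labeled items excluded
            and q in d.title.lower()]

corpus = [
    # Broadly shared site: "bob" has access even though he never visited it.
    Document("Budget 2025 draft", "finance", {"alice", "bob"}, None),
    # Labeled item: excluded from grounding even though bob can open it.
    Document("Budget reorg plan", "hr", {"alice", "bob"}, "Confidential"),
]
print(search_grounding("bob", "budget", corpus))  # ['Budget 2025 draft']
```

The unlabeled, overshared draft is exactly what a crafted prompt can surface; the labeled item stays out no matter how the query is phrased.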

How can Microsoft Purview and overall data governance help?

Microsoft Purview is a unified data governance service that helps companies discover, catalog, map, and classify their data across Microsoft 365 and other sources. Purview can help organizations apply sensitivity labels and other data protection policies to their data, as well as monitor and audit their data usage and compliance.

Purview can help avoid the risks of not having sensitivity labels applied before enabling Copilot, by providing the following benefits:

Purview can help discover and catalog data across Microsoft 365 and other sources, such as Azure, Power BI, SQL Server, or third-party cloud services. Purview can also help users understand the lineage, relationships, and quality of their data, as well as the business terms and definitions associated with their data. This can help identify and prioritize the data that needs to be classified and protected by sensitivity labels, as well as the data that can be safely used by Copilot.

Purview can help map and classify data based on its level of sensitivity, using predefined or custom sensitivity labels. Purview can also help apply sensitivity labels and other data protection policies to data, either manually or automatically, using rules, conditions, or machine learning. This helps ensure that data is consistently and accurately labeled and protected, regardless of where it is stored or shared, and that Copilot respects the sensitivity labels and does not expose or misuse sensitive data.
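Rule-based auto-labeling of this kind can be pictured as a small decision function: each rule pairs a condition on the content with a label, and the highest-sensitivity match wins. The rules and label names below are illustrative examples, not Purview defaults.

```python
# Hedged sketch of rule-based auto-labeling: each rule maps a content
# condition to a label with a sensitivity rank; the highest match wins.
# The rule set and label names are examples, not Purview defaults.
import re

RULES = [  # (label, rank, condition)
    ("Highly Confidential", 3, re.compile(r"\b\d{3}-\d{2}-\d{4}\b")),  # SSN-like
    ("Confidential", 2, re.compile(r"salary|acquisition", re.I)),
    ("General", 1, re.compile(r".", re.S)),  # fallback: everything else
]

def auto_label(text: str) -> str:
    """Return the highest-ranked label whose condition matches the text."""
    best = max((r for r in RULES if r[2].search(text)),
               key=lambda r: r[1])
    return best[0]

print(auto_label("Employee SSN 123-45-6789"))  # Highly Confidential
print(auto_label("Quarterly salary review"))   # Confidential
print(auto_label("Lunch menu for Friday"))     # General
```

The "highest match wins" design matters: a document mentioning both a salary and an SSN-like number should land on the stricter label, never the looser one.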

Purview can help monitor and audit data usage and compliance, by providing insights and reports on how data is accessed, shared, and protected across Microsoft 365 and other sources. Purview can also help detect and respond to data incidents, such as breaches, leaks, or violations, by providing alerts and notifications as well as remediation actions and recommendations. This helps maintain visibility and control over data and ensures that Copilot is used in a responsible and compliant manner.

Overall data governance is an essential prerequisite for implementing Microsoft 365 Copilot: it maximizes the benefits and minimizes the risks of using Copilot. By applying sensitivity labels and other data protection policies, companies ensure that Copilot respects data security and privacy and does not generate text containing sensitive information that the user or recipient is not authorized to access or share. With Microsoft Purview and other data governance tools, organizations can discover, catalog, map, classify, monitor, and audit their data across Microsoft 365 and other sources, keeping it consistently and accurately labeled and protected.

Conclusion

Microsoft 365 Copilot is a powerful tool that can help write faster and better by providing suggestions and insights based on the context and content of company data. However, Copilot also poses some challenges and risks for businesses that lack appropriate governance for their data, especially when it comes to sensitive information. Without proper data classification, protection, and monitoring, Copilot can potentially expose confidential or regulated data to unauthorized users or leak it to external parties.

To avoid these risks, companies need to apply sensitivity labels and other data protection policies to their data before enabling Copilot. Sensitivity labels classify and protect data based on its level of sensitivity, and Copilot honors them, so it will not generate text containing sensitive information that the user or recipient is not authorized to access or share. Companies should also use Microsoft Purview and other data governance tools to discover, catalog, map, classify, monitor, and audit data across Microsoft 365 and other sources, keeping it consistently and accurately labeled and protected.

By following these best practices, organizations can benefit from Microsoft 365 Copilot without compromising data security or privacy. Copilot can help improve productivity and creativity, while data governance can help maintain compliance and trust.

Should your business require assistance with integrating Copilot into your systems, please feel free to contact us.

Virtuas
