Introduction
As organizations increasingly rely on AI tools like Copilot to streamline operations and enhance productivity, ensuring that these tools are used responsibly and securely becomes paramount.
Copilot plays a growing role in decision-making, so it is essential to audit and monitor its interactions and keep the outcomes readily available for investigating business processes. This is vital both for legal investigations and for business improvement initiatives.
This blog explores the importance of auditing Copilot, the implementation methods, and how BusinessGPT AI Firewall can provide enhanced auditing and risk management capabilities.
Why Auditing Copilot Matters
Auditing Copilot is essential for several reasons:
Compliance: Organizations must comply with various regulations that require monitoring and documenting interactions involving sensitive information. Auditing helps ensure that Copilot usage aligns with these legal and regulatory requirements.
Security: Auditing helps identify and mitigate potential security threats by monitoring interactions for inappropriate or malicious content.
Risk Management: By keeping track of how Copilot is used, organizations can identify potential risks and take proactive measures to address them.
Accountability: Auditing provides a clear record of interactions, helping to hold users accountable for their actions and ensuring transparency in how AI tools are utilized.
Methods for Auditing Copilot
Microsoft offers two methods for auditing Copilot: Communication Compliance and eDiscovery Premium with audit tools.
In our experience, Communication Compliance did not achieve the expected results.
This blog covers both approaches: eDiscovery Premium, used from the compliance admin center, and Communication Compliance.
Note: These options require a Microsoft E5 license.
Method 1 – eDiscovery Premium (using the compliance admin center)
Method 2 – Communication Compliance
Method 1 – eDiscovery Premium
eDiscovery Premium offers a more robust approach to auditing Copilot interactions through case creation and data collection. Here’s a step-by-step guide:
Step 1: Create a Case
- Name the Case: Assign a unique name to the case.
- Default Settings: Proceed with the default settings unless specific adjustments are required.
- Add Members: Add relevant members to the case.
- Submit: Finalize the case creation process by clicking “Submit.”
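If you prefer to script this step instead of clicking through the portal, case creation is also exposed through the Microsoft Graph eDiscovery (Premium) API. The snippet below is a minimal sketch, assuming an app registration with the appropriate eDiscovery Graph permission (e.g., eDiscovery.ReadWrite.All) and an access token you have already acquired; the token value and case name are placeholders, and adding members is still easiest to finish in the compliance admin center.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<access token with eDiscovery permissions>"  # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

def create_ediscovery_case(name: str, description: str = "") -> dict:
    """Create an eDiscovery (Premium) case and return the case object, including its id."""
    body = {"displayName": name, "description": description}
    resp = requests.post(f"{GRAPH}/security/cases/ediscoveryCases", headers=HEADERS, json=body)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    case = create_ediscovery_case("Copilot interaction audit", "Auditing Copilot activity")
    print("Created case:", case["id"])
```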
Step 2: Create a New Collection
- Access the Case: Navigate to the case you previously created.
- Go to Collections Tab: Locate and select the “Collections” tab.
- Create New Collection: Click on “Create New Collection.”
- Name the Collection: Provide a relevant name to identify the collection’s purpose.
- Select Custodial Data Sources: Include all custodial data sources available for the collection, and additionally select Microsoft 365 Groups.
- Define the Query: In the last step, define the query to filter for Copilot activity.
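The same collection can be created programmatically as an eDiscovery search. This is a sketch only: the access token is a placeholder, and the KQL used to isolate Copilot activity is an assumption (Copilot interactions are typically stored as mailbox items with a Copilot-specific ItemClass), so verify the exact query against Microsoft's current documentation before relying on it.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access token>", "Content-Type": "application/json"}

# Assumption: Copilot interactions surface as items whose ItemClass starts with
# IPM.SKYPETEAMS.MESSAGE.COPILOT; confirm the exact value in Microsoft's docs.
COPILOT_QUERY = "ItemClass:IPM.SKYPETEAMS.MESSAGE.COPILOT*"

def create_collection(case_id: str, name: str) -> dict:
    """Create a collection (ediscoverySearch) scoped to all case custodians."""
    body = {
        "displayName": name,
        "contentQuery": COPILOT_QUERY,
        "dataSourceScopes": "allCaseCustodians",
    }
    resp = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases/{case_id}/searches",
        headers=HEADERS,
        json=body,
    )
    resp.raise_for_status()
    return resp.json()
```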
Step 3: Add Review Set
A review set is a set of documents in a case where you can analyze, query, view, tag, and export data.
- Access the Review Set Tab: Navigate to the “Review Set” tab within the case interface.
- Add Review Set: Click on “Add Review Set.”
- Name the Review Set: Provide a descriptive name for the new review set.
- Open the Review Set: Once created, click on the review set to open it.
- Commit Collection: Navigate to your previously created collection, and click “Commit Collection.” The system will add items from the collection to the review set.
- Access Data: Once the commit process is complete, navigate to the review set to find the data ready for review. The status will change from “Adding to review set” to “Committed.”
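These review set steps can also be scripted against the Graph eDiscovery API. The sketch below, under the same placeholder-token assumption as above, creates the review set and then calls the addToReviewSet action with the id of the collection (search) created earlier; committing is a long-running operation, so the status transition from "Adding to review set" to "Committed" still applies.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access token>", "Content-Type": "application/json"}

def create_review_set(case_id: str, name: str) -> dict:
    """Create a review set inside the case."""
    resp = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases/{case_id}/reviewSets",
        headers=HEADERS,
        json={"displayName": name},
    )
    resp.raise_for_status()
    return resp.json()

def commit_collection(case_id: str, review_set_id: str, search_id: str) -> None:
    """Commit the collection: add the results of an existing search to the review set.
    The API accepts the request and processes it asynchronously."""
    resp = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases/{case_id}/reviewSets/{review_set_id}/addToReviewSet",
        headers=HEADERS,
        json={"search": {"id": search_id}},
    )
    resp.raise_for_status()
```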
You can filter the review set using queries on fields such as date and user. Above the review table, a filter option lets you build queries and search for specific data.
Below is an example of a filter query that returns data between June 1, 2024, and June 2, 2024.
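The same kind of filter can also be created programmatically as a review set query through Graph. This is a hedged sketch: the access token is a placeholder and the Date range expression is an approximation of the portal filter, so check it against the documented review set query syntax.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access token>", "Content-Type": "application/json"}

# Assumption: the review set query language accepts a Date range expression like this one.
DATE_FILTER = "(Date >= 2024-06-01 AND Date <= 2024-06-02)"

def create_review_set_query(case_id: str, review_set_id: str, name: str, query: str) -> dict:
    """Save a named query (filter) on the review set."""
    resp = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases/{case_id}/reviewSets/{review_set_id}/queries",
        headers=HEADERS,
        json={"displayName": name, "contentQuery": query},
    )
    resp.raise_for_status()
    return resp.json()
```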
Method 2 – Communication Compliance
Communication Compliance in Microsoft 365 is designed to detect and analyze interactions to identify inappropriate messages and potential policy violations. Here is how to set up a Communication Compliance policy for Copilot:
Policy Creation: Create a policy named: “Copilot for Microsoft 365 Interaction” for testing purposes.
Assignment: Apply the policy to the relevant user and assign a reviewer.
Detection Configuration: Configure the policy to detect messages across multiple Microsoft 365 locations, including Exchange, Teams, and Copilot.
Sensitive Data Detection: Specify the policy to scan for sensitive information such as Israeli bank account numbers, Israeli national IDs, and IBANs.
However, this method may not always capture all Copilot-related content.
Extras – Securing Copilot
Classifiers & Labels
Step 1: Identify Sensitive Information
- Utilize over 300 different sensitive info types, such as bank accounts, ID numbers, and credit card info, to discover data in your environment. You can create sensitive info types manually using specific keywords (e.g., cache23).
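To make the keyword idea concrete, the snippet below is a purely illustrative sketch (not a Microsoft API) of what a keyword-based sensitive info type effectively matches: it walks a folder and flags any text file containing the chosen keyword, using cache23 from the example above.

```python
from pathlib import Path

KEYWORD = "cache23"  # the custom keyword from the example above

def find_files_with_keyword(root: str, keyword: str = KEYWORD) -> list[Path]:
    """Return text files under `root` that contain the keyword (case-insensitive)."""
    hits = []
    for path in Path(root).rglob("*.txt"):
        try:
            if keyword.lower() in path.read_text(errors="ignore").lower():
                hits.append(path)
        except OSError:
            continue  # skip unreadable files
    return hits

if __name__ == "__main__":
    for f in find_files_with_keyword("./documents"):
        print("Contains sensitive keyword:", f)
```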
Step 2: Create a Label
- Configure encryption within the label so that only specified people have access. Auto-labeling can be configured to apply the label automatically when a new file containing the keyword (e.g., cache23) is created.
Data Loss Prevention (DLP)
- Create a policy targeting Exchange Email, SharePoint, and OneDrive.
- Define rules to prevent the sharing of files or emails containing the keyword or label (e.g., cache23) outside the organization.
Data Lifecycle Management
It is important to manage the lifecycle of data involved in Copilot interactions. Retain necessary data for as long as needed and delete outdated or irrelevant data to prevent Copilot from providing outdated information.
Create a Retention Policy:
- Apply it to the entire organization.
- Specify locations such as Teams Chats and Copilot interactions.
- Choose the retention period and decide whether to retain content, delete it, or both.
Proper data lifecycle management ensures that critical questions and responses with Copilot are not kept in user mailboxes longer than necessary, maintaining data relevancy and security.
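If you want to check from a script what retention configuration already exists, retention labels can be listed through the Microsoft Graph records management API. This is a minimal sketch under the assumption that the tenant grants the RecordsManagement.Read.All permission and the endpoint below is available to you; retention policies scoped to Teams chats and Copilot interactions are still created in the compliance admin center as described above.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access token with RecordsManagement.Read.All>"}

def list_retention_labels() -> list[dict]:
    """List the tenant's retention labels (display name and retention settings)."""
    resp = requests.get(f"{GRAPH}/security/labels/retentionLabels", headers=HEADERS)
    resp.raise_for_status()
    return resp.json().get("value", [])

if __name__ == "__main__":
    for label in list_retention_labels():
        print(label.get("displayName"))
```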
Combined with the auditing methods above, these measures provide detailed logs and allow for a more granular analysis of Copilot interactions.
Enhancing Auditing with BusinessGPT AI Firewall
While Communication Compliance and eDiscovery Premium offer valuable tools for auditing Copilot, BusinessGPT AI Firewall takes it further by adding more granular auditing, risk management, and security functionality. Here’s how:
- Compliance: Maintain compliance with regulations such as the EU AI Act and the NIST AI RMF.
- AI monitoring: Audit, map, and measure your AI usage.
- Data taxonomy: Identify usage (what action the actor is performing) and topics (what the activity is about).
- Shadow AI: Understand what your users are using AI for.
- Advanced risk rules: Use natural language to define rules governing AI usage in your company.
- AI risk management: Identify and mitigate potential risks associated with AI misuse and hallucinations.
- AI governance policies: Enforce usage policies per group or user to ensure responsible AI usage.
- Input/output validation: Block toxic, harmful, or inappropriate content.
- Business safeguarding: Protect against negative business outcomes resulting from AI use.
Conclusion
Auditing Copilot interactions is vital for compliance, security, and risk management. While existing methods like Communication Compliance and eDiscovery Premium offer solid foundations, BusinessGPT AI Firewall provides enhanced capabilities for more effective auditing and risk management. By integrating these tools, organizations can ensure that their use of AI remains secure, compliant, and accountable.