
Unlocking Secure AI: How Conditional Access Protects Microsoft Copilot Services

Generative Artificial Intelligence (AI) services like Microsoft Security Copilot and Microsoft 365 Copilot offer tremendous value when used appropriately in an organizational context. However, protecting these services from misuse is crucial. Microsoft Entra Conditional Access policies provide an effective way to enforce access controls and safeguard these AI services.

Why Conditional Access for Generative AI Services?

Applying Conditional Access policies to these Generative AI services can be achieved using existing policies targeting:

  • All resources for all users.
  • Risky users or risky sign-ins.
  • Users with insider risks.

This blog demonstrates how to target specific Generative AI services, such as Microsoft Security Copilot and Microsoft 365 Copilot, for tailored policy enforcement.

Step 1: Create Targetable Service Principals Using PowerShell

In this test case, I will demonstrate how to control Microsoft Security Copilot access using an Entra ID Conditional Access (CA) policy. First, let's check the status of Microsoft Security Copilot and the corresponding Service Principal in my tenant.

Security Copilot Status

The status below indicates that Security Copilot is currently active and operational.
Microsoft Security Copilot - My Sessions

Now let's check the service principal status under Entra ID Enterprise Applications.
Entra ID Enterprise Applications

As you can see, there are no Service Principals available for Security Copilot at the moment. Next, let’s verify this through the Conditional Access Policy creation wizard.

Entra ID CA Policy Creation Wizard

To individually target these Generative AI services, you must create service principals, making them available in the Conditional Access app picker.

Below are the steps to add these service principals using the Microsoft Graph PowerShell SDK:

Connect with the appropriate scopes to create service principals

Connect-MgGraph -Scopes "Application.ReadWrite.All"

Connect Microsoft Graph PowerShell
Create service principal for Security Copilot (Microsoft Security Copilot)

New-MgServicePrincipal -AppId bb5ffd56-39eb-458c-a53a-775ba21277da

Security Copilot App Created

Next, use the following command to register the service principal for the Enterprise Copilot Platform (Microsoft 365 Copilot):

New-MgServicePrincipal -AppId fb8d773d-7ef8-4ec0-a117-179f88add510
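After running both commands, you can optionally confirm that the service principals now exist in the tenant. The snippet below is a quick verification sketch using the same AppIds (it assumes you are still connected with Connect-MgGraph):

```powershell
# Confirm the newly created service principals are visible in the tenant.
# These are the well-known first-party application IDs used above.
$copilotAppIds = @(
    "bb5ffd56-39eb-458c-a53a-775ba21277da",  # Microsoft Security Copilot
    "fb8d773d-7ef8-4ec0-a117-179f88add510"   # Enterprise Copilot Platform (Microsoft 365 Copilot)
)

foreach ($appId in $copilotAppIds) {
    Get-MgServicePrincipal -Filter "appId eq '$appId'" |
        Select-Object DisplayName, AppId, Id
}
```

If either command returns nothing, the corresponding service principal was not created and will not appear in the Conditional Access app picker.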

Step 2: Create Conditional Access Policies

To adopt these services securely, Conditional Access policies should enforce requirements such as:

  • Phishing-resistant MFA for all users of Generative AI services.
  • Access only from compliant devices when Insider risk is moderate.
  • Blocking access when Insider risk is elevated.

Example: Require Phishing-Resistant MFA

Sign in to the Microsoft Entra admin center as a Conditional Access Administrator.
Navigate to Protection > Conditional Access > Policies.
Select New policy.
Name the policy (e.g., "Phishing-Resistant MFA-for-Security-Copilot-Access").
Under Assignments > Users or workload identities, include All users and exclude Emergency access accounts.
Under Target resources > Resources (formerly cloud apps) > Include > Select resources, choose:

Security Copilot bb5ffd56-39eb-458c-a53a-775ba21277da (Microsoft Security Copilot)

CA Policy Target Resource
Under Access controls > Grant, select:

Grant access.
Require authentication strength and choose Phishing-resistant MFA from the list.

Set the policy to Report-only mode initially.
Select Create.
After validation in report-only mode, switch the Enable policy toggle to On.


CA Policy Grant Control
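For teams that prefer scripting, the portal steps above can also be expressed with the Microsoft Graph PowerShell SDK. Treat this as an illustrative sketch rather than a production script: the break-glass exclusion is a placeholder object ID you must replace, and the authentication strength ID shown is the built-in Phishing-resistant MFA strength.

```powershell
# Sketch: create the Security Copilot phishing-resistant MFA policy via
# Microsoft Graph PowerShell instead of the portal wizard.
Connect-MgGraph -Scopes "Policy.ReadWrite.ConditionalAccess"

$params = @{
    DisplayName = "Phishing-Resistant MFA-for-Security-Copilot-Access"
    State       = "enabledForReportingButNotEnforced"   # report-only first
    Conditions  = @{
        Applications = @{
            # Microsoft Security Copilot first-party app ID
            IncludeApplications = @("bb5ffd56-39eb-458c-a53a-775ba21277da")
        }
        Users = @{
            IncludeUsers = @("All")
            # Placeholder: substitute your emergency access account's object ID
            ExcludeUsers = @("<break-glass-account-object-id>")
        }
    }
    GrantControls = @{
        Operator = "AND"
        # Built-in "Phishing-resistant MFA" authentication strength
        AuthenticationStrength = @{ Id = "00000000-0000-0000-0000-000000000004" }
    }
}

New-MgIdentityConditionalAccessPolicy -BodyParameter $params
```

As in the portal walkthrough, the policy is created in report-only mode; change State to "enabled" only after validating the results in the sign-in logs.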

Best Practices: User Exclusions

While configuring Conditional Access policies, consider excluding the following accounts to prevent unintended lockouts:

  • Emergency access or break-glass accounts: These accounts are critical for recovering access in case of misconfigurations.
  • Service accounts and service principals: These accounts are used for non-interactive tasks and should be targeted with Conditional Access for workload identities instead of user-based policies.

For more information, refer to Manage emergency access accounts in Microsoft Entra ID.

Security Copilot Access Testing

Now, let's test Security Copilot access by navigating to the Security Copilot interface as a user who has not registered any phishing-resistant MFA methods, and observe the experience.

Sign in to Security Copilot using the specified user account.
Microsoft Security Copilot Login Page

The user will encounter their regular sign-in experience. In this case, I am using passwordless authentication with the Authenticator app, as no passkeys are registered for this account.
Entra ID User Sign-in

After signing in, our phishing-resistant MFA Conditional Access policy will be triggered, prompting the user to set up a phishing-resistant MFA method, such as a passkey, to access Security Copilot.
More Information Required Prompt

After clicking "Next," a screen will appear prompting the user to set up a passkey.
Passkey Setup Prompt

We need to proceed with the passkey setup prompt. For more details about the passkey setup process, refer to my previous blog, A Step-by-Step Guide to Configuring FIDO2 Security Keys.
Passkey Setup

The passkey setup is complete, and we need to assign a name to the passkey for reference purposes only.
Passkey Name setup

Now, when accessing Security Copilot, you will be prompted to select the passkey and complete the verification to log in.
Passkey Verification

We have successfully accessed the Security Copilot dashboard by meeting the phishing-resistant MFA requirement.
Security Copilot Dashboard

If you already have a phishing-resistant MFA configured on your account, you can use it directly from the login page instead of proceeding with password or passwordless options through the Authenticator app.
Entra ID Sign-in Page

If you use a password or other MFA methods that are not phishing-resistant, you will be prompted to complete passkey sign-in after successfully signing in. To avoid multiple prompts, it’s recommended to use a passkey as a passwordless option from the start.
Passkey Verification Prompt

Note: These Conditional Access policies apply to the Security Copilot standalone experience but do not affect embedded experiences.

Conclusion

By implementing Conditional Access policies for generative AI services, you can enforce advanced security measures such as phishing-resistant MFA, ensuring that only trusted authentication methods are used. These policies allow you to limit access to compliant devices based on risk levels, adding an extra layer of protection. Additionally, they help safeguard your organization from potential misuse of AI services by restricting access to authorized users and devices. Tailoring these policies to your organization’s unique needs maximizes the value of tools like Microsoft Security Copilot and Microsoft 365 Copilot, all while protecting your data and users in an increasingly complex threat landscape.
