
Augment Microsoft Sentinel Incident Investigation with Microsoft Copilot for Security and Logic Apps


Fragmented security stacks, excessive alerts, and understaffing are some of the biggest challenges security teams face today. However, you can overcome these obstacles with Copilot for Security, a generative AI assistant that can boost the efficiency of security professionals by up to 22%, according to a randomized controlled trial with experienced security professionals that Microsoft published in January 2024.

In this article, we will show you how to augment Microsoft Sentinel incident investigation with Microsoft Copilot for Security and Logic Apps.

Introduction

As we have all seen in the news, Microsoft announced Microsoft Copilot for Security and made it generally available on April 1st, 2024, with new capabilities. Over the past year, we have been evaluating Copilot for Security in preview and are happy to see it publicly available.

Copilot for Security

Microsoft Copilot for Security is the first security product to enable defenders to move at the machine speed and scale of AI. It combines the most advanced large language models (LLMs) from OpenAI with large-scale data and threat intelligence, including more than 65 trillion daily security signals. Security Copilot also delivers an enterprise-grade security and privacy-compliant experience as it runs on Azure’s hyperscale infrastructure.

When a security professional prompts Security Copilot, it uses the power of a security-specific model to deploy skills and queries that maximize the latest language model capabilities, which is unique to security use cases. The cyber-trained model adds a learning system to create and tune new skills. This helps Security Copilot catch what other approaches might miss and augment an analyst’s work. In a typical incident, this boost improves the quality of detection, investigation, speed of response, and ability to enhance security posture.

Security Copilot is a system that integrates with Microsoft Security products like Microsoft Sentinel, Microsoft Defender XDR, Microsoft Entra, Microsoft Purview, Microsoft Intune, Microsoft Defender Threat Intelligence, and Microsoft Defender External Attack Surface Management, and will soon expand to include third-party products. It is not just a large language model but rather a learning system that enables organizations to investigate and defend at machine speed.

Microsoft Copilot for Security also includes a Logic Apps connector that allows you to call into Copilot from a Logic Apps workflow, which you can integrate into your automation workflows. You can automate and submit a natural language prompt to create a new Copilot for Security investigation, such as Microsoft Sentinel Incident Investigation (the purpose of this article). After completion, the evaluation result will be returned to your workflow, where you can export/automate further.

Prerequisites

To follow this article, you need to have the following:

1) Azure subscription — If you don’t have an Azure subscription, you can create one here for free.

2) Log Analytics workspace — To create a new workspace, follow the instructions to create a Log Analytics workspace. A Log Analytics workspace is required for Microsoft Sentinel to ingest data. You must also have read and write permissions on this workspace (at least Contributor rights).

3) Microsoft Sentinel — Follow the instructions to enable Microsoft Sentinel at no additional cost on an Azure Monitor Log Analytics workspace for the first 31 days. Once Microsoft Sentinel is enabled on your workspace, every GB of data ingested into the workspace can be retained at no charge (free) for the first 90 days.

4) Microsoft Copilot for Security enabled (more on this below) — Copilot for Security capacity is billed monthly via a new Security Compute Unit (SCU) at the rate of “$4 per SCU per hour”. Microsoft recommends provisioning 3 SCUs ($4 X 3 = $12 per hour) to start exploring Copilot for Security. Doing quick math, 730h X $4 (1 SCU, the minimum) comes to $2,920/month. There is no free trial for Copilot for Security.

5) To create Copilot for Security capacity, you must be an Azure Owner or Contributor, at least at the resource group level. You must also have a Global Administrator or Security Administrator role to set up the default environment.

  • Contributors can access Copilot, but Copilot responses will vary based on existing user permissions to Microsoft Security products. After setup, Owners can manage access from the role assignment page. Using security groups instead of individual users to assign Copilot for Security roles is highly recommended, as it reduces administrative complexity. Learn more about Copilot for Security access.

Important! — For those who might not be aware, when you provision Copilot for Security for the first time, the group “everyone” is automatically added to the Contributor role, as shown in the figure below. This needs to be changed immediately, since the Contributor role provides broad access within Copilot for Security. Microsoft decided to enable the embedded experience for everyone, but this can catch administrators off guard if they don’t realize what the Contributor role grants.

Check the support documentation for the permissions the Contributor role grants in Copilot for Security (Understand authentication in Microsoft Copilot for Security), and see the screenshot below for which groups are assigned to the Contributor and Owner roles by default in Copilot for Security.

Copilot for Security – Role assignment

6) Microsoft Copilot for Security Logic App (more on this below).

  • User Authentication — At the time of writing, the Copilot for Security Logic App connector only supports delegated permissions via the OAuth Authorization Code flow. When designing the logic app, the user who establishes a connection to the connector must have access to Microsoft Copilot for Security. Managed Identities are not supported yet.

7) Configure Playbook permissions and create Microsoft Sentinel Automation Rule (more on this below).
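
As a sanity check on the pricing in prerequisite 4, the SCU cost math can be sketched in a few lines. The $4/SCU/hour rate is taken from the text above; always verify current pricing on Microsoft's pricing page before budgeting. Note that a 730-hour month yields the $2,920 figure above, while the portal's estimate (shown later in the setup wizard) uses a 720-hour month:

```python
# Estimate monthly Copilot for Security cost from hourly SCU pricing.
# Rate taken from the article ($4 per SCU per hour); confirm against
# Microsoft's current pricing before relying on these numbers.

RATE_PER_SCU_HOUR = 4  # USD


def monthly_cost(scu_count: int, hours_per_month: int = 730) -> int:
    """Return the estimated monthly cost in USD for a given SCU count."""
    return scu_count * RATE_PER_SCU_HOUR * hours_per_month


print(monthly_cost(1))       # 2920 (730-hour month, 1 SCU minimum)
print(monthly_cost(1, 720))  # 2880 (720-hour month, as in the portal estimate)
print(monthly_cost(3))       # 8760 (Microsoft's recommended 3 SCUs)
```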

Assuming you have all the prerequisites in place, take the following steps:

Set up Copilot for Security Capacity

You can enable Microsoft Copilot for Security in the Azure portal or by visiting the Microsoft Copilot for Security portal directly.

1) Sign in to Copilot for Security with your account and select Get Started.

Welcome to Microsoft Copilot for Security

2) Next, you need to set up your security capacity. As shown in the figure below, select the Azure subscription, associate capacity to an existing resource group or create a new one, add a name to the capacity, and select the prompt evaluation location (Australia, UK, Europe, or United States).

Set up security capacity

3) Next, specify the number of Security Compute Units (SCUs). SCUs are provisioned on an hourly basis; the minimum is 1 SCU, and the recommended is 3 SCUs. The estimated monthly cost is $2,880/month for 1 SCU, as shown in the figure below. Then, confirm that you acknowledge and agree to the terms and conditions, and select Continue.

Select the number of Security Compute Units (SCUs)

4) After you’ve created the capacity, it will take a few minutes to deploy the Azure resource on the back end. The data is always stored in your Azure home tenant geography, in this case, Europe. Select Continue.

Copilot for Security Azure home tenant geography

5) You can choose whether to share data gathered from your organization’s use of Microsoft Copilot for Security (including user prompts, the security information accessed, and Copilot’s responses) with Microsoft. You can change these settings at any time later. Select Continue.

Once the compute capacity is created, you can use Microsoft Copilot for Security.

Microsoft Copilot for Security Promptbooks

Creating Azure Logic App for Copilot

Once the Microsoft Copilot for Security instance is up and running, the second step is to create an Azure Logic App for Copilot for Security.

Take the following steps to create a Logic App workflow to investigate Microsoft Sentinel high-severity incidents with Copilot:

1) Sign in to the Azure portal with your Azure account.

2) In the Azure search box, enter logic apps and select Logic apps.

3) On the Logic apps page, select Add.

Add logic app

4) On the Create Logic App pane, on the Basics tab, provide the following basic information about your logic app:

  • Subscription
  • Resource Group
  • Logic App name

5) Before making selections, under Plan type, select Consumption so that you view only the settings that apply to the Consumption plan-based logic app type. The Consumption plan is best for entry-level scenarios, and you pay only for what your workflow runs. The Copilot for Security Logic Apps connector is available in both “Standard” and “Consumption” plan types.

6) Continue by selecting the desired Azure region, and set Enable log analytics to No.

7) When you’re done, your settings look similar to the following figure.

Create logic app

8) When you’re ready, select Review + Create.

9) On the validation page, confirm all the information you provided and select Create.

10) Select Go to resource after your logic app is deployed. Alternatively, you can find and select your logic app resource by typing its name in the Azure search box.

11) Select the Identity page and toggle the Status switch to On to enable the system-assigned managed identity, as shown in the figure below. We’ll use the logic app’s registration in Microsoft Entra ID to grant it access to Microsoft Sentinel.

Enable system-assigned managed identity

12) Select the Logic app designer and then click Add a trigger.

Add a trigger

13) This example uses the Microsoft Sentinel Incident trigger. In the Add a Trigger search box, enter Microsoft Sentinel. Select the Microsoft Sentinel incident trigger from the Triggers list, as shown in the figure below. You can also use Microsoft Sentinel entity and Microsoft Sentinel alert triggers.

Microsoft Sentinel Incident trigger

14) Then select the Authentication type as ‘Managed Identity,’ give a connection name, and create a new connection using a system-assigned managed identity, as shown in the figure below.

Microsoft Sentinel incident trigger (create a new connection)

15) Next, insert a new step after the Microsoft Sentinel incident trigger, and choose Add an action. In the Add an action search box, enter Condition. Select the Condition action from the Actions list, as shown in the figure below.

Add condition step

16) Then, on the Parameters tab, provide the values to compare and select the operator to use. In this example, we are interested in Microsoft Sentinel incidents with an Incident Severity equal to High. Select the first value field, pick ‘Incident Severity‘ from the Dynamic Content, and enter ‘High‘ next to the is equal to operator, as shown in the figure below. You can create new items with additional conditions to suit your needs.

Incident Severity condition

17) Next, insert a new step under the True branch of the condition action, and choose Add an action. In the Add an action search box, enter Microsoft Copilot for Security. Select the Submit a Copilot for Security prompt action from the Actions list, as shown in the figure below.

Add Microsoft Copilot for Security action

18) In the Create Connection window to Submit a Copilot for Security prompt, select the Authentication type as ‘OAuth,’ and sign in with an account to create a connection to Microsoft Copilot for Security. As noted in the prerequisites section, the Copilot for Security Logic App connector only supports delegated permissions via the OAuth Authorization Code flow; it does not support ‘Managed Identity‘ yet.

19) Once the connection to Copilot for Security is created, on the Parameters tab, enter the prompt to be evaluated by Security Copilot. In the Prompt Content field, you can enter, for example, ‘Summarize Sentinel incident‘, then select the data from the previous step and search for ‘Incident Sentinel ID‘ in the Dynamic Content, as shown in the figure below. You can also rename the prompt step as needed.

Enter the prompt to be evaluated by the security copilot (Incident Sentinel ID)

20) You can also add additional Copilot for Security prompts to the same step as needed (e.g., ‘Tell me about the entities associated with that incident.‘).

Add additional Copilot for Security prompts

Once you’ve completed all the steps, click Save. The final Logic App workflow will look like this.

Copilot for Security Logic App workflow

Note: You can be more creative and add additional step(s) to the Logic App to get the response back from Copilot for Security, then add it as a comment to the incident in Microsoft Sentinel, and so on.
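
For orientation, the shape of the saved Consumption workflow can be sketched in workflow-definition JSON. The fragment below is hand-written for illustration, not a designer export; the connection names, connector paths, and the severity/ID expressions are assumptions and will differ in detail from what the designer actually generates:

```json
{
  "triggers": {
    "Microsoft_Sentinel_incident": {
      "type": "ApiConnectionWebhook",
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['azuresentinel']['connectionId']" } },
        "path": "/incident-creation"
      }
    }
  },
  "actions": {
    "Condition": {
      "type": "If",
      "expression": {
        "equals": [
          "@triggerBody()?['object']?['properties']?['severity']",
          "High"
        ]
      },
      "actions": {
        "Submit_a_Copilot_for_Security_prompt": {
          "type": "ApiConnection",
          "inputs": {
            "host": { "connection": { "name": "@parameters('$connections')['securitycopilot']['connectionId']" } },
            "method": "post",
            "body": {
              "PromptContent": "Summarize Sentinel incident @{triggerBody()?['object']?['name']}"
            }
          }
        }
      }
    }
  }
}
```

The key point is the structure: a webhook trigger from Sentinel, an If action guarding on severity, and the Copilot prompt action nested inside the True branch.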

Configuring Playbook Permissions

Next, we must configure playbook permissions and assign the Microsoft Sentinel Automation Contributor role to the resource group where the Logic App was created.

The Microsoft Sentinel Automation Contributor is not a user role but a role that needs to be assigned to a Microsoft Sentinel identity so that an automation rule can run a playbook as an action. This role is assigned to Microsoft Azure resource groups, where playbooks reside. For example, if we have five resource groups containing playbooks and want to attach them as an action, we need to assign this permission to all five Azure resource groups.

To assign the Microsoft Sentinel Automation Contributor permission, we need to go to the Microsoft Sentinel instance, go to Settings, then Settings again, and then, under Playbook permissions, click on Configure permissions.

Then, in the next window under (Manage permissions), choose the resource groups you want to assign the Microsoft Sentinel Automation Contributor role and click Apply, as shown in the figure below.

Configure Playbook Permissions

Once permission is applied at the resource group level, we can attach playbooks from that resource group to an automation rule as an action (more on this in the next step).
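
If you manage permissions as code, the same assignment can also be sketched as an ARM resource. This is an illustrative fragment, not an exported template: the principalId placeholder must be replaced with the object id of the Azure Security Insights (Microsoft Sentinel) service principal in your tenant, and you should verify the role definition GUID for Microsoft Sentinel Automation Contributor against the Azure built-in roles documentation:

```json
{
  "type": "Microsoft.Authorization/roleAssignments",
  "apiVersion": "2022-04-01",
  "name": "[guid(resourceGroup().id, 'SentinelAutomationContributor')]",
  "properties": {
    "roleDefinitionId": "[subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'f4c81013-99ee-4d62-a7ee-b3f1f648599a')]",
    "principalId": "<object-id-of-the-Azure-Security-Insights-service-principal>",
    "principalType": "ServicePrincipal"
  }
}
```

Deploying this at each resource group that contains playbooks is equivalent to clicking Configure permissions in the Sentinel Settings page.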

Creating Microsoft Sentinel Automation Rule

The final part is to create a Microsoft Sentinel Automation Rule that triggers the Logic App, and we do that with Automation rules. Automation rules allow you to manage all incident handling automation centrally. They streamline automation use in Microsoft Sentinel and enable you to simplify complex workflows for your incident orchestration processes.

To run the Copilot for Security playbook automatically on incident creation, we need to add it to the automation rule by using the “When an incident is created” trigger and Run the playbook as the action.

To create a new automation rule, switch to the Microsoft Sentinel instance, go to the Automation page, click ‘+ Create,’ and select ‘Automation rule.’ Give the automation rule a name, set the trigger to “When an incident is created,” and add a Condition with Severity equal to High, because we want to submit a Copilot for Security prompt for high-severity incidents only. Then, choose the playbook under Actions, and finally, click Apply, as shown in the figure below.

Create a new automation rule

Please note that you can also run the playbook manually on demand without creating an automation rule.
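
Automation rules are ordinary ARM resources, so the rule above can also be captured in a template. The sketch below mirrors the UI choices (trigger on incident creation, severity High, run the playbook); the resource ids are placeholders, and the property names follow the Microsoft.SecurityInsights/automationRules schema but should be checked against the current API version before use:

```json
{
  "type": "Microsoft.SecurityInsights/automationRules",
  "apiVersion": "2023-02-01",
  "name": "Run-Copilot-Playbook-High-Severity",
  "properties": {
    "displayName": "Run Copilot playbook on high-severity incidents",
    "order": 1,
    "triggeringLogic": {
      "isEnabled": true,
      "triggersOn": "Incidents",
      "triggersWhen": "Created",
      "conditions": [
        {
          "conditionType": "Property",
          "conditionProperties": {
            "propertyName": "IncidentSeverity",
            "operator": "Equals",
            "propertyValues": [ "High" ]
          }
        }
      ]
    },
    "actions": [
      {
        "order": 1,
        "actionType": "RunPlaybook",
        "actionConfiguration": {
          "logicAppResourceId": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Logic/workflows/<playbook-name>",
          "tenantId": "<tenant-id>"
        }
      }
    ]
  }
}
```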

Testing Copilot for Security Logic App

The final step is to verify and test the Copilot for Security Logic App. You can run the playbook manually on any incident with High severity by right-clicking the incident, selecting Run playbook, and choosing the playbook we created in the previous step, as shown in the figure below, or wait for a new incident where the automation rule automatically kicks in.

Run playbook on incident

If we switch to the Playbook Run History page, we see that the Logic App ran successfully and was completed in about 1 minute.

Playbook Run History

Let’s look at the Microsoft Copilot for Security portal under My Sessions. We can see how Copilot for Security automatically summarized the incident perfectly and listed all associated entities. This is just a sample of the Microsoft Sentinel incident for testing purposes.

Copilot for Security session prompt output

If we scroll to the bottom of the output prompt, we can see under References a deep link to the ‘Incident Page‘ in Microsoft Sentinel, where you can navigate directly to the specific incident in question. This is awesome!

Copilot for Security session prompt output with Deep Link to Incident Page

It’s worth mentioning that the sessions or prompts you submit to Copilot for Security, manually or automatically as we’ve seen in this article, can be deleted from the Microsoft Copilot for Security portal, as shown in the figure below. So, if you want to keep track of the submitted prompts and outputs in Copilot for Security, you need to export them to a robust compliance platform, such as LogLocker, which is built on top of blockchain storage with tamper-proof immutability.

Delete Copilot for Security sessions

That’s it. Happy Microsoft Sentinel Auto Incident Investigation with Microsoft Copilot for Security and Logic Apps!

In Summary

This article has explored how Copilot for Security, a generative AI assistant, can significantly enhance the efficiency of security professionals. By integrating Copilot for Security with Microsoft Sentinel incidents and Logic Apps, security teams can automate and streamline incident investigation processes and improve overall security posture. With Copilot for Security’s ability to automate incident handling and provide actionable insights, security teams can effectively tackle the challenges posed by fragmented security stacks, excessive alerts, and resource constraints.

Copilot for Security follows Microsoft’s responsible AI principles and protects your Customer Data with some of the industry’s most comprehensive compliance and security controls. Read about Security Copilot privacy and data security.

__
Thank you for reading my blog.

If you have any questions or feedback, please leave a comment.

-Charbel Nemnom-

About the Author
Charbel Nemnom
Charbel Nemnom is a Senior Cloud Architect with 21+ years of IT experience. As a Swiss Certified Information Security Manager (ISM), CCSP, CISM, Microsoft MVP, and MCT, he excels in optimizing mission-critical enterprise systems. His extensive practical knowledge spans complex system design, network architecture, business continuity, and cloud security, establishing him as an authoritative and trustworthy expert in the field. Charbel frequently writes about Cloud, Cybersecurity, and IT Certifications.