
Query and Send Results To a New Table in Log Analytics Workspace


Have you encountered a scenario where you want to run a KQL query in the Log Analytics workspace and send the results to a new custom table? This article will show how easy it is to query and send results to a new table within your workspace for further analytics.


A Log Analytics workspace is a dedicated environment that stores log data from Azure Monitor and other Azure services, including Microsoft Sentinel and Microsoft Defender for Cloud. Each workspace has its data repository and configuration, but it can combine data from multiple services.

You might see the term Microsoft Sentinel workspace used in different places. This workspace is the same as a Log Analytics workspace, but it’s enabled for Microsoft Sentinel. The process of querying and sending results to a new searchable table applies whether you have a Microsoft Sentinel-enabled or a standalone workspace.

You might encounter a scenario where you want to query a large dataset that could run for hours and send the results into a new table. Search jobs in Azure Monitor help you query and fetch records using parallel processing and asynchronous searching on your large dataset.

When you run a search job, the results are sent to a new table created in the same workspace as the source data. The results (new) table becomes available when the search job starts running, but it may take some time to appear.

The search job results table is a type of Analytics table that can be accessed for log queries and other Azure Monitor features that utilize tables in a workspace. This table follows the retention value set for the workspace. However, you can modify this value after the table has been created. For example, you can increase or decrease the data retention for that new table up to 12 years, or change the table plan from Analytics to Basic logs, including Archive.
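The retention change above can also be made programmatically. The sketch below builds the PATCH body used by the Tables REST API (`PATCH .../tables/{tableName}?api-version=...`); the field names follow the documented `retentionInDays`/`totalRetentionInDays` pattern, but the api-version and exact limits for your table plan are assumptions you should verify against the current API reference.

```python
# Hedged sketch: building the PATCH body to adjust retention on the new
# results table. 4383 days (~12 years) is the maximum total retention the
# article mentions; values and api-version are illustrative.

def retention_patch_body(interactive_days: int, total_days: int) -> dict:
    """Build the body for PATCH .../tables/{name}?api-version=2021-12-01-preview."""
    if total_days > 4383:
        raise ValueError("total retention may not exceed ~12 years (4383 days)")
    return {
        "properties": {
            "retentionInDays": interactive_days,      # interactive retention
            "totalRetentionInDays": total_days,       # total incl. long-term
        }
    }

# Example: keep 30 days interactive, 12 years total.
body = retention_patch_body(30, 4383)
print(body)
```

Sending the body with an authenticated PATCH call (for example via `az rest`) is left out here, since authentication details depend on your environment.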

Change data retention settings on a single table

Related: Reduce and Optimize Costs in Microsoft Sentinel.

Search jobs don’t impact the workspace’s performance or availability, and the results are stored in a new table with a “_SRCH” suffix appended to its name.


To follow this article, you need to have the following:

1) Azure subscription — If you don’t have an Azure subscription, you can create one for free.

2) Log Analytics workspace — To create a new workspace, follow the instructions to create a Log Analytics workspace. You must have read and write permissions on this workspace; you need at least Contributor rights. Specifically, to run a search job, you can create a custom role with the “Microsoft.OperationalInsights/workspaces/tables/write” and “Microsoft.OperationalInsights/workspaces/searchJobs/write” permissions on the Log Analytics workspace.

Related: See how to create a new custom role with the least privileges.

3) Following this guide requires at least one table with ingested data in your workspace.

4) Search jobs incur additional charges of around $0.00675 per GB of data scanned, as published on the Azure Monitor pricing page.

For example, if you have a table that contains 500 GB of data per day and you want to search it for a period of 30 days, you will be charged for scanning 15,000 GB of data, which is known as Search job execution. Furthermore, if the search job identifies 1,000 records that match your search query, you will also be charged for ingesting these 1,000 records into the results table known as Search job results on top of Search job execution (data scanned). To optimize costs, write and optimize your query in interactive query mode before running the search job!
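The example above can be sketched as a small cost estimate. The $0.00675/GB rate is the figure quoted in this article; actual pricing varies by region and over time, so treat the numbers as illustrative only.

```python
# Hedged sketch: estimating the "Search job execution" charge from the
# article's example figures. The scan rate below is the article's quoted
# price, not an authoritative one; check the Azure Monitor pricing page.

SCAN_PRICE_PER_GB = 0.00675  # USD per GB scanned (illustrative)

def search_job_scan_cost(gb_per_day: float, days: int) -> float:
    """Cost of scanning `gb_per_day` GB of table data over `days` days."""
    scanned_gb = gb_per_day * days
    return scanned_gb * SCAN_PRICE_PER_GB

# The article's example: 500 GB/day over 30 days = 15,000 GB scanned.
scanned = 500 * 30
cost = search_job_scan_cost(500, 30)
print(f"Data scanned: {scanned} GB, execution cost: ${cost:.2f}")
```

Note this covers only the execution (scan) charge; records ingested into the results table are billed separately at regular ingestion prices.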

Assuming you have all the prerequisites in place, take the following steps:

Query and Send Results to a New Table

Let’s see how to run a search job to fetch records from large datasets into a new search results table in your workspace.

Please note that you can perform the same steps described in this article using the Azure portal, REST API, or the Azure CLI. For the remainder of this article, we will use the Azure portal.
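For readers who prefer the REST API route mentioned above, the sketch below builds the request that creates a search job (a PUT on the Tables endpoint with a `searchResults` block). The subscription, resource group, workspace, table name, query, and api-version are all placeholders or assumptions; verify the exact shape against the current Azure Monitor REST API reference before use.

```python
import json

# Hedged sketch: constructing (not sending) the REST call that starts a
# search job. All identifiers below are placeholders; the api-version and
# body shape follow the documented search-job pattern but may have changed.

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
workspace = "<workspace-name>"
table_name = "Syslog_SRCH"  # the results table name must end in _SRCH

url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription_id}"
    f"/resourcegroups/{resource_group}"
    "/providers/Microsoft.OperationalInsights"
    f"/workspaces/{workspace}/tables/{table_name}"
    "?api-version=2021-12-01-preview"
)

body = {
    "properties": {
        "searchResults": {
            "query": "Syslog | where SyslogMessage has 'suspicious'",
            "limit": 1000,  # optional cap on returned records
            "startSearchTime": "2023-01-01T00:00:00Z",
            "endSearchTime": "2023-01-31T00:00:00Z",
        }
    }
}

# Send with an authenticated PUT (e.g. `az rest` or the requests library
# with a bearer token); omitted here since auth depends on your setup.
print(url)
print(json.dumps(body, indent=2))
```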

Run a Search Job

Launch the Azure portal and search for your Log Analytics workspace:

1) From the Log Analytics workspace menu, select Logs.

2) Select the ellipsis menu (three dots) on the right-hand side of the screen and toggle Search job mode to on, as shown in the figure below.

Toggle Search job mode

3) Next, use the time picker to specify the search job date range, which can span up to one year. Then, type the search job KQL query and select the Search Job button.

4) Azure Monitor Logs prompts you to provide a name for the result set table and informs you that the search job is subject to billing. Enter a name for the new search job results table and select Run a search job, as shown in the figure below.

Run a Search Job

5) Next, Azure Monitor Logs runs the search job and creates a new table for search job results in your workspace. The search job can run for up to 24 hours; if it exceeds that limit, it will error out. When the new table is ready, select View ‘tablename_SRCH’ to view the table in Log Analytics, as shown in the figure below.

View Search Results Table

6) You will see the search job results as they begin flowing into the newly created search job results table.

Get Search Job

Now, to get the search job status and details for the newly created table, take the following steps:

1) From the Log Analytics workspace menu, select Logs.

2) From the Tables tab, select Search results to view all search job results tables, as shown in the figure below.

Get search job

3) The new search table is now ready with all the records that match the search query, as shown in the figure below. The search results table schema is based on the source table schema and the specified query. You can use the following 4 columns from the search results to help you track the source records:

  • _OriginalType:  Type value from the source table.
  • _OriginalItemId: _ItemID value from the source table.
  • _OriginalTimeGenerated: TimeGenerated value from the source table.
  • TimeGenerated: Time at which the search job ran.
Search job details
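The four tracking columns above make it easy to trace results back to their source records. The sketch below assembles such a KQL query as a string (the results table name is a placeholder); it only illustrates the column usage and does not execute against a workspace.

```python
# Hedged sketch: a KQL query (built as a Python string) that projects the
# tracking columns from a search results table. "Syslog_SRCH" is a
# placeholder; substitute your own results table name.

results_table = "Syslog_SRCH"

kql = f"""{results_table}
| project _OriginalType, _OriginalItemId, _OriginalTimeGenerated, TimeGenerated
| order by _OriginalTimeGenerated desc"""

TRACKING_COLUMNS = [
    "_OriginalType",           # Type value from the source table
    "_OriginalItemId",         # _ItemId value from the source table
    "_OriginalTimeGenerated",  # TimeGenerated from the source table
    "TimeGenerated",           # time at which the search job ran
]

print(kql)
```

You would paste the resulting query into the Logs blade (or send it via the Log Analytics query API) to inspect how each result maps back to its source record.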

4) The queries you run against the results (new) table will appear in log query auditing, but the initial search job will not. Query auditing is enabled with a diagnostic setting on the Log Analytics workspace. This allows you to send audit data to the current workspace or any other workspace in your subscription, to Azure Event Hubs for forwarding outside of Azure, or to Azure Storage for archiving.

Enable query auditing in the Log Analytics workspace

5) Once you finish your search, you can delete the new search table when you no longer need it to save on costs. This reduces workspace clutter and extra charges for data retention. You can delete the search results table by going to the Log Analytics workspace menu and selecting Tables. Then select the ‘tablename_SRCH’ table and click Delete, as shown in the figure below.
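Deletion can also be scripted. The sketch below only constructs the DELETE URL for the Tables REST API (the same operation the portal performs); all identifiers are placeholders and the api-version is an assumption to verify. The Azure CLI offers an equivalent `az monitor log-analytics workspace table delete` command.

```python
# Hedged sketch: building (not sending) the REST DELETE URL for a search
# results table. Placeholder names throughout; the guard below helps avoid
# accidentally targeting a non-search table.

def table_delete_url(sub: str, rg: str, ws: str, table: str) -> str:
    if not table.endswith("_SRCH"):
        raise ValueError("expected a search results table ending in _SRCH")
    return (
        "https://management.azure.com"
        f"/subscriptions/{sub}/resourcegroups/{rg}"
        "/providers/Microsoft.OperationalInsights"
        f"/workspaces/{ws}/tables/{table}"
        "?api-version=2021-12-01-preview"
    )

url = table_delete_url("<sub>", "<rg>", "<workspace>", "Syslog_SRCH")
print(url)
```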

Delete the search results table in Log Analytics

If you have a Microsoft Sentinel-enabled workspace, you can delete the search results table by going to the Microsoft Sentinel workspace, selecting the Search menu, and then selecting the Saved Searches tab. Last, select the search card and delete the ‘tablename_SRCH’ table, as shown in the figure below.

Delete the search results table in Microsoft Sentinel

That’s it, there you have it. Happy Log Analytics searching!

In Conclusion

This article has demonstrated the straightforward process of running KQL queries in a Log Analytics workspace and transferring the results to a new custom table. Following the outlined steps, you can efficiently execute search jobs, create new tables for search results, and access valuable data for further analysis within your Log Analytics and Sentinel environment.

Remember that these search jobs are not free. The cost of a search job is determined by the amount of data scanned during its execution (Search job execution), plus the amount of data found by the search job and added to the results table (Search job results), which is billed at the regular log data ingestion prices.

Thank you for reading my blog.

If you have any questions or feedback, please leave a comment.

-Charbel Nemnom-

About the Author:
Charbel Nemnom
Charbel Nemnom is a Senior Cloud Architect with 21+ years of IT experience. As a Swiss Certified Information Security Manager (ISM), CCSP, CISM, Microsoft MVP, and MCT, he excels in optimizing mission-critical enterprise systems. His extensive practical knowledge spans complex system design, network architecture, business continuity, and cloud security, establishing him as an authoritative and trustworthy expert in the field. Charbel frequently writes about Cloud, Cybersecurity, and IT Certifications.
