How to use Log Analytics log data exported to Storage Accounts



Introduction

Exporting logs from Sentinel or Log Analytics to an Azure Storage account blob provides benefits such as immutability and geographic redundancy for legal holds, as well as inexpensive long-term retention.

However, if an incident or legal case arises, you may need the data retained in that storage account blob to aid in the investigation.

[Image: team under investigation]

How do you retrieve and analyze that data? This blog will answer exactly that question (hint: an Azure Data Explorer cluster is involved). I will also briefly explain how the data gets into that blob in the first place.

Note: “ADX” is used throughout this blog post as shorthand for Azure Data Explorer.

How to export Log Analytics to a storage account blob

Option 1: Export workspace

The easiest way is to enable export in the Log Analytics workspace itself. To do this, go to Settings \ Data Export in the Log Analytics workspace blade and select “New Export Rule”. You can select which tables you want to export (multiple if needed) and the destination storage account.

[Screenshot: creating a new data export rule in the Log Analytics workspace]

As the picture above shows, the export occurs at ingestion time. This means you will have duplicate data, at least for the duration of the Log Analytics retention period. Exports run continuously at 5-minute intervals.

The resulting storage account has the following structure:

  • A container is created in the storage account for each exported table, named “am-” followed by the table name (for example, “am-azureactivity”).
  • Blobs are stored in 5-minute folders in the following path structure: WorkspaceResourceId=/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>/y=<year>/m=<month>/d=<day>/h=<hour>/m=00/PT05M.json (see the example below). Appends to a blob are limited to 50K writes; after that, additional blobs are added to the folder as PT05M_#.json, where “#” is the incremental blob number.
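
For example, a concrete blob path could look like this (the subscription ID, resource group, and workspace name are hypothetical):

    WorkspaceResourceId=/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/rg-logging/providers/microsoft.operationalinsights/workspaces/la-workspace/y=2024/m=10/d=07/h=09/m=00/PT05M.json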

Source: Export Log Analytics workspace data from Azure Monitor – Azure Monitor | Microsoft Learn

The export itself has a cost per GB, so take that into account. Pricing – Azure Monitor | Microsoft Azure

Option 2: Logic app

Alternatively, you can build a logic app or function app that exports exactly what you want (specific tables and columns, data first summarized into smaller sets, and so on), when you want it. This option is described here: Use Logic Apps to export data from your Log Analytics workspace to a storage account.
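
For example, the logic app’s query step could first summarize the data so that only a smaller set lands in the storage account. A minimal KQL sketch (the table choice, time window, and aggregation are illustrative assumptions):

    AzureActivity
    | where TimeGenerated > ago(1d)
    | summarize Operations = count() by OperationNameValue, bin(TimeGenerated, 1h)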

Unlike the previous option (export from the workspace), this example uses blob names defined by your logic app and results in a flat structure, with all the JSON files in the same folder.

[Screenshot: storage account container with the exported JSON files in a single flat folder]

How do I access the data now?

As you can see below, querying data from a storage account can be done in a variety of ways.

The externaldata() function in KQL

If you use the externaldata() KQL function in Log Analytics or Azure Data Explorer (ADX), your query will look similar to the one below. In this example, two different JSON files are read from the storage account container “loganalytics-data” (from option 2 above).

[Screenshot: externaldata() query reading two JSON files from the “loganalytics-data” container]
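
In text form, such a query looks roughly like this (a minimal sketch; the storage account name, file names, SAS tokens, and column schema are hypothetical):

    externaldata(TimeGenerated: datetime, OperationName: string, Level: string)
    [
        h@"https://mystorageaccount.blob.core.windows.net/loganalytics-data/log1.json?<sas-token>",
        h@"https://mystorageaccount.blob.core.windows.net/loganalytics-data/log2.json?<sas-token>"
    ]
    with (format="multijson")
    | summarize count() by OperationName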

You will need a URL and SAS token for each JSON file. You can obtain these by navigating to each file in the storage account container and selecting “Generate SAS” and then “Generate SAS token and URL”, as shown below (or use your preferred scripting language):

[Screenshot: generating a SAS token and URL for a blob in the storage account]

The pros and cons of this option are:

Pros:

  • There is no need to deploy additional components such as Azure Data Explorer clusters.
  • Blob content can be queried “on the fly”.

Cons:

  • A SAS token is required (Entra ID authentication would be preferable).
  • It involves manual sub-tasks: defining the fields and copying and pasting a URL containing the SAS token for each file.
  • If a lot of data (many JSON files) is queried, performance can deteriorate.

external_table in ADX

The next option is to leverage Azure Data Explorer (ADX) and create an “external table” in ADX that points to your storage account and the container holding the exported logs in JSON format. Deploying an Azure Data Explorer cluster is beyond the scope of this blog; more information can be found here: Create an Azure Data Explorer cluster and database.

Once deployed, your Azure Data Explorer cluster can be stopped when not needed, saving you money. Be sure to start it before testing the next steps (and stop it again when you’re done and no one needs it!).

[Screenshot: Azure Data Explorer cluster overview, with the cluster URI and a “Copy to Clipboard” option]

The data in the cluster can be accessed by navigating to the URI (see “Copy to Clipboard” in the image above). The detailed steps are described here: Create an external table in Azure Data Explorer using the Azure Data Explorer web UI wizard.

Here are some steps to get started:

To create an external table, go to Queries on the left side of the screen, then right-click on the database and select “Create External Table.”

[Screenshot: “Create External Table” option, shown by right-clicking the database in the ADX web UI]

Follow the wizard: give your external table a name in the destination step, then select a container in the source step by navigating to the storage subscription, storage account, and container. Here we have selected the container created in option 1 above (the container created by the export from Log Analytics).

[Screenshot: external table wizard, selecting the source container]

ADX then reads all the JSON files (it traverses the folder hierarchy, so you don’t have to browse them yourself) and uses one of them as a template to generate the schema for the external table to be created. If you are satisfied with the proposed schema, continue with the wizard. The result will be similar to this:

[Screenshot: schema proposed by the external table wizard]

(My data comes from Azure activity logs.)
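
Under the hood, creating an external table corresponds to a management command. A sketch of what such a command can look like (the table name, schema, and connection string are hypothetical; “;impersonate” tells ADX to use your own Entra ID identity rather than an account key):

    .create external table AzureActivityExport (TimeGenerated: datetime, OperationName: string, Level: string)
    kind = storage
    dataformat = multijson
    (
        h@'https://mystorageaccount.blob.core.windows.net/am-azureactivity;impersonate'
    )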

External tables in ADX can be queried using external_table("table name"), as sketched below.
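
A minimal query sketch, using the hypothetical external table name from above:

    external_table("AzureActivityExport")
    | where TimeGenerated > ago(30d)
    | summarize count() by OperationName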

The pros and cons of this option are:

Pros:

  • No ingestion into your Azure Data Explorer database is needed.
  • You can query blob content as if it were a native table.

Cons:

  • If a lot of data (many JSON files) is queried, performance can deteriorate.

Ingest with ADX

The final option covered here is to ingest the data into a table in ADX. The difference from an external table, which is read directly from the storage account, is that in this case the data will actually reside in a table in the ADX database. It’s explained here: Import data from Azure Storage.

Here are the steps to get started:

Right-click the database again and this time select “Import Data.”

[Screenshot: “Import Data” option, shown by right-clicking the database in the ADX web UI]

Follow the wizard: select Azure Storage, then connect to the container in your storage account via a URI or by selecting the Azure resources. You will need to create a new destination table, as shown in the example below (“DataFromBlobs”).

[Screenshot: ingestion wizard, creating the new destination table “DataFromBlobs”]

ADX is smart enough to inspect the JSON in the container hierarchy and generate a schema. You can change this by clicking ‘Edit Column’ or ‘Advanced’.

[Screenshot: schema inspection, with “Edit Column” and “Advanced” options]

The proposed schema is good enough for the purposes of this blog, so I’ll click Finish for now. ADX will now take on the task of ingesting the JSON files from your storage account. This may take some time, depending on the amount of data. At the end, you’ll see “Successfully ingested” next to each blob (JSON file).

You can now query the table directly, as shown below.

[Screenshot: querying the DataFromBlobs table in ADX]
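
A minimal query sketch against the new table (the OperationName column is an assumption, based on the Azure activity export used earlier):

    // Peek at a few ingested rows
    DataFromBlobs
    | take 10

    // Or aggregate over the whole table
    DataFromBlobs
    | summarize count() by OperationName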

The pros and cons of this option are:

Pros:

  • Once data is ingested, this is the fastest query experience in terms of performance.

Cons:

  • Ingestion has a cost, and the data now also resides in ADX, which brings additional security responsibilities.

Cleanup

To remove the data ingested in the last step, click the three dots next to the table name and select “Drop table”, or run a management command:

.drop table <table name> (for example, .drop table DataFromBlobs).

For external tables there is no right-click option to drop the table. Instead, run the following management command:

.drop external table <table name> (for example, .drop external table AzureActivityExport).

Don’t forget to stop your Azure Data Explorer cluster to save money.

Conclusion

In this blog post, we explored several options for accessing logs archived in Azure storage account containers through export from Log Analytics and Sentinel or through custom logic apps.

This is intended to address exceptional cases where archived data is required, such as historical context during an investigation.




