How to use Log Analytics log data exported to Storage Accounts

Introduction

Exporting logs from Microsoft Sentinel or Log Analytics to an Azure Storage account blob provides benefits such as immutability and geographic redundancy for legal holds, as well as inexpensive long-term retention. If an incident or legal case arises, the team running the investigation may need the data retained in that storage account blob. How do you retrieve and analyze that data? This blog answers that very question. Hint: it involves Azure Data Explorer clusters. I will also briefly explain how the data ends up in that blob in the first place.

Note: "ADX" is used throughout this blog post as shorthand for "Azure Data Explorer".

How to export Log Analytics data to a storage account blob

Option 1: Workspace export

The easiest way is to enable export in the Log Analytics workspace itself. In the Log Analytics workspace blade, go to Settings > Data Export and select "New export rule". You can select which tables (multiple if necessary) you want to export and the destination storage account.

The export occurs at ingestion time. This means you will have duplicate data, at least for the duration of the Log Analytics retention period. Exports run continuously at 5-minute intervals.

The resulting storage account has the following structure: a container is created in the storage account for each exported table, named with an "am-" prefix followed by the table name (for example "am-azureactivity"). Blobs are stored in 5-minute folders with the following path structure: WorkspaceResourceId=/subscriptions/subscription-id/resourcegroups//providers/microsoft.operationalinsights/workspaces//y=/m=/d=/h=/m=/PT05M.json. Appends to a blob are limited to 50K writes; beyond that, additional blobs are added to the folder as PT05M_#.json, where '#' is an incrementing blob number.
Source: Export Log Analytics workspace data from Azure Monitor – Azure Monitor | Microsoft Learn

The export itself has a cost per GB, so take that into account: Pricing – Azure Monitor | Microsoft Azure

Option 2: Logic app

Alternatively, you can build a logic app or function app that exports exactly what you want (specific tables and columns, data first summarized into smaller sets, and so on), when you want it. This option is described here: Use Logic Apps to export data from your Log Analytics workspace to a storage account. Unlike the previous option (workspace export), this approach uses a naming scheme defined by your logic app and results in a flat folder with all the JSON files in the same location.

How do I access the data now?

Querying data from a storage account can be done in a variety of ways, described below.

The externaldata() KQL function

If you use the externaldata() KQL function in Log Analytics or Azure Data Explorer (ADX), your query will look similar to the one below. In this example, two different JSON files are read from the storage account container "loganalytics-data" (option 2 above). You will need a URL and SAS token for each JSON file. You can obtain these by navigating to each file in your storage account container, selecting "Generate SAS", and then "Generate SAS Token and URL" (or by using your preferred scripting language).

The pros and cons of this option:

Pros:
- No need to deploy additional components such as an Azure Data Explorer cluster.
- Blob content can be queried "on the fly".

Cons:
- A SAS token is required (Entra ID authentication is always preferable).
- It involves busywork: defining fields and copying and pasting a URL containing the SAS token for each file.
- If a lot of data is queried (many JSON files), performance can deteriorate.

external_table in ADX

The next option is to leverage Azure Data Explorer (ADX) and create an "external table" in ADX.
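Before looking at external tables, here is a minimal sketch of the externaldata() pattern described above. The storage account name, file names, SAS tokens, and the column list are all placeholders; substitute the "Blob SAS URL" values generated for your own files, and match the columns to your exported table's schema. The format parameter is an assumption that the export produced JSON records; adjust it if your files differ.

```kusto
// Sketch only: replace each URL with the Blob SAS URL generated for
// your own JSON file, and adjust the columns to your table's schema.
externaldata (TimeGenerated: datetime, OperationName: string, Level: string)
[
    h@'https://<storageaccount>.blob.core.windows.net/loganalytics-data/export1.json?<sas-token>',
    h@'https://<storageaccount>.blob.core.windows.net/loganalytics-data/export2.json?<sas-token>'
]
with(format='multijson')
| where TimeGenerated > ago(365d)
| summarize count() by OperationName
```

Note that the full column list must be declared up front, which is exactly the "busywork" listed as a drawback of this option.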
An external table points to your storage account and the associated container that holds the exported logs in JSON format. Deploying an Azure Data Explorer cluster is beyond the scope of this blog; more information can be found here: Create an Azure Data Explorer cluster and database. Once provisioned, your Azure Data Explorer cluster can be stopped when not needed, saving you money. Be sure to start it before testing the next steps (and stop it again when you're done and no one needs it!). The data in the cluster is accessed by navigating to its URI (see "Copy to Clipboard" in the image above).

The detailed steps to create an external table using the Azure Data Explorer web UI wizard are as follows:

1. Go to Query on the left side of the screen, then right-click the database and select "Create external table".
2. Follow the wizard: give your external table a name in the Destination step, then in the Source step select the container by navigating to storage subscription, storage account, and container. Here we selected the container created in option 1 above (the container created by export from Log Analytics).
3. ADX then reads all the JSON files (it traverses the folder hierarchy, so you don't have to browse them yourself) and uses one of them as a template to generate the schema for the external table. If you are satisfied with the proposed schema, complete the wizard.

The result will look similar to this (my data comes from Azure Activity logs). External tables in ADX are queried using external_table('tablename').

The pros and cons of this option:

Pros:
- No ingestion into your Azure Data Explorer database is needed.
- You can query blob content as if it were a native table.

Cons:
- If a lot of data is queried (many JSON files), performance can deteriorate.

Ingest into ADX

The final option covered here is to ingest the data into a table in ADX.
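Before moving on to ingestion, here is a sketch of querying such an external table. The table name 'ArchivedAzureActivity' and the columns are hypothetical placeholders; use the name you gave the external table in the wizard and the columns from the schema ADX generated:

```kusto
// 'ArchivedAzureActivity' is a placeholder for the external table name
// chosen in the wizard; the columns come from the generated schema.
external_table('ArchivedAzureActivity')
| where TimeGenerated between (datetime(2024-01-01) .. datetime(2024-03-31))
| summarize Operations = count() by OperationName
| top 10 by Operations
```

Apart from the external_table() wrapper, the query reads exactly like one against a native table, which is the main appeal of this option.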
The difference from an external table, which is read from the storage account on every query, is that ingested data actually resides in a table in the ADX database. This is explained here: Import data from Azure Storage. The steps to get started:

1. Right-click the database again and this time select "Import data".
2. Follow the wizard to select Azure Storage, then connect to the container in your storage account via its URI or by selecting the Azure components.
3. Create a new table as shown in the example below ("DataFromBlobs"). ADX is smart enough to inspect the JSON files in the container hierarchy and generate a schema. You can change it via "Edit columns" or "Advanced"; it was good enough for the purposes of this blog, so I clicked Finish.
4. ADX now takes on the task of ingesting the JSON files from your storage account. This may take some time depending on the amount of data. At the end, you'll see "Successfully ingested" next to each blob (JSON file).

You can now query the table directly.

The pros and cons of this option:

Pros:
- Once the data is ingested, this is the fastest query experience in terms of performance.

Cons:
- Ingestion costs, and the resulting responsibility for securing the data in ADX.

Cleanup

To remove the data ingested in the last step, click the three dots next to the table name and select "Drop table", or run the control command .drop table tablename (for example .drop table DataFromBlobs). An external table has no right-click option to delete it; instead, run the control command .drop external table tablename (for example .drop external table externaldata).

Don't forget to stop your Azure Data Explorer cluster to save money.

Conclusion

In this blog post, we explored several options for accessing logs archived in Azure storage account containers, whether they got there through export from Log Analytics and Sentinel or through custom logic apps.
This is intended to address exceptional cases where access to archived data is required, such as providing historical context during an investigation.