Demystifying Log Ingestion API


In this blog, we will take a closer look at how to use the Log Ingestion API to collect application logs, transform them, and ingest them into a Log Analytics workspace. Before we get into the details, let's look at the different collection techniques based on the application log type.

  • When applications and services log information to text files instead of standard logging services such as Windows Event Log or Syslog, you can use custom text logs to collect those text files into a Log Analytics workspace.
  • When applications and services log information to JSON files instead of standard logging services such as Windows Event Log or Syslog, you can use custom JSON logs to collect those JSON files into a Log Analytics workspace.
  • When your application understands how to send data to an API, you can use the Log Ingestion API to send data to your Log Analytics workspace.

Custom text and JSON logs are beyond the scope of this blog; I may cover those techniques in a separate post later. Here we will focus on using the Log Ingestion API to stream data into a Log Analytics workspace and transform the data for optimal use.

Note: The purpose of this blog is to demonstrate how to collect logs using the Log Ingestion API. For additional detail, I will refer you to the public documentation throughout.

Log Ingestion API Overview

Azure Monitor's Log Ingestion API lets you send data to a Log Analytics workspace using either REST API calls or client libraries. The API can send data to supported Azure tables or to custom tables that you create.

Any application that can call a REST API can send data to the Log Ingestion API. This may be a custom application you create, or an application or agent that already understands how to send data to the API.

Note: Data sent from your application to the API must be in JSON format and match the structure expected by the data collection rule (DCR). The DCR does not necessarily have to match the structure of the target table, because it can include a transformation that converts the data to match the table's schema.
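To make this concrete, the sketch below shows a raw REST call in Python. The stream name, column names, API version, and all placeholder values are assumptions for illustration; check the public documentation for the specifics of your own DCR.

```python
# A minimal sketch of a raw REST call to the Log Ingestion API.
# The stream name, columns, and api-version below are assumptions.
import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<directory-tenant-id>",
    client_id="<application-client-id>",
    client_secret="<client-secret-value>",
)
# Request a token scoped to Azure Monitor ingestion.
token = credential.get_token("https://monitor.azure.com/.default").token

# The payload is a JSON array whose objects match the DCR's input stream.
payload = [
    {"Time": "2024-10-07T10:00:00Z", "Application": "DemoApp", "ResponseCode": 200},
]

url = (
    "https://<dce-endpoint-uri>/dataCollectionRules/<dcr-immutable-id>"
    "/streams/Custom-CustomLogDemo_CL?api-version=2023-01-01"
)
response = requests.post(url, json=payload, headers={"Authorization": f"Bearer {token}"})
response.raise_for_status()  # the service returns 204 No Content on success
```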

The diagram below shows the data flow: the application sends data to the data collection pipeline, transformations are applied there, and the transformed data is ingested into the Log Analytics workspace.

[Diagram: data flow from the application through the data collection pipeline into the Log Analytics workspace]

Advantages of the Log Ingestion API

Now that we've covered an overview of the Log Ingestion API, let's look at why you should consider it instead of the legacy HTTP Data Collector API.

  • The Log Ingestion API requires DCR-based custom tables and supports transformations, which let you modify the data before it is ingested into the target table, including filtering and other data manipulation.
  • Data can be sent to multiple destinations.
  • You can manage the target table schema, including column names and whether new columns are added to the target table when the source data schema changes.
  • The HTTP Data Collector API is deprecated and has been replaced by the Log Ingestion API, making the latter the future-proof choice.

Log Ingestion API Prerequisites

There are certain prerequisites to configure the Log Ingestion API (the sketch after this list shows how they map onto a client-library call):

  • App registration and secret: An app registration authenticates the API calls. The application must be granted permission on the DCR. The API call uses the application (client) ID, the directory (tenant) ID, and a client secret value.
  • Data collection endpoint (DCE): Configuration requires the DCE's endpoint URI. The endpoint URI is the address through which a data source sends collected data to Azure Monitor for processing and ingestion.
  • Table in the Log Analytics workspace: You must have a table in your Log Analytics workspace to send the data to. You can use one of the supported Azure tables or create a custom table using any of the available methods.
  • Data collection rule (DCR): Azure Monitor uses the data collection rule (DCR) to understand the structure of the incoming data and what to do with it.
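As a rough sketch, here is how these four prerequisites map onto the parameters of the Python client library (all placeholder values are assumptions):

```python
# Sketch: mapping the prerequisites onto the azure-monitor-ingestion client.
# Install with: pip install azure-identity azure-monitor-ingestion
from azure.identity import ClientSecretCredential
from azure.monitor.ingestion import LogsIngestionClient

# App registration and secret -> credential for authenticating API calls
credential = ClientSecretCredential(
    tenant_id="<directory-tenant-id>",
    client_id="<application-client-id>",
    client_secret="<client-secret-value>",
)

# Data collection endpoint -> the client's endpoint URI
client = LogsIngestionClient(endpoint="https://<dce-endpoint-uri>", credential=credential)

# The DCR immutable ID and the stream that feeds the target table are
# supplied on each upload call, as shown later in this walkthrough.
```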

Now that we've covered the 'what' and the 'why', let's move on to the 'how'.

Let's see how to send data to Azure Monitor using the Log Ingestion API. The detailed public documentation on this topic covers configuration through both the Azure portal and ARM templates.

I will be following the public documentation in this blog. Let's get started:

Register your app and create a secret

  • Go to Entra ID > App registrations > New registration, as shown below.

[Screenshot: creating a new app registration in Entra ID]

  • Make a note of the Application (client) ID and Directory (tenant) ID; we will need them at a later stage.
  • Now create a client secret for the application, which is similar to creating a password to use with a username. Select Certificates & secrets > New client secret. Give your secret a name that identifies its purpose and choose an expiration period.

[Screenshot: creating a new client secret]

  • Make a note of the secret value as it cannot be recovered once you navigate away from this page.

Create a data collection endpoint

  • Go to Data collection endpoints and create a new data collection endpoint as shown below.

[Screenshot: creating a new data collection endpoint]

  • Copy the endpoint URI from the DCE; you will need it in a later step.

[Screenshots: the data collection endpoint overview and its endpoint URI]
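For reference, the logs ingestion endpoint URI of a DCE generally has the following shape; the exact host name is generated when the DCE is created, so treat this as illustrative only:

```
https://<dce-name>.<region>.ingest.monitor.azure.com
```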

Create a new DCR-based custom table in your Log Analytics workspace

  • To avoid duplicating content in this blog, I recommend following the public documentation at this stage. The steps for creating DCR-based custom tables and data collection rules are documented very well there.
  • In my lab, the custom table is named "CustomLogDemo_CL" and is attached to the "Demo-DCR-UI" data collection rule.

[Screenshot: the CustomLogDemo_CL custom table attached to the Demo-DCR-UI data collection rule]

  • I also applied some parsing and filtering to get better visibility into the data.
  • Find the data collection rule created for this exercise and copy its immutable ID; you will need it in a later step. The immutable ID is a unique identifier for the data collection rule.

[Screenshot: copying the immutable ID of the data collection rule]

Assign permissions to DCR

  • The final step is to grant your application permission to use the DCR. Once granted, any application with the correct application (client) ID and client secret can send data to the new DCE and DCR.
  • Go to Data collection rules > IAM > Add role assignment > select Monitoring Metrics Publisher > select the application you created.

[Screenshot: assigning the Monitoring Metrics Publisher role to the application]

  • Select Review + assign to grant the permissions.
  • Follow the instructions in the public documentation to generate sample data.
  • Once the sample data is generated, follow the public documentation to send it to Azure Monitor (a client-library sketch follows this list).
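For completeness, below is a minimal sketch of sending sample data with the Python client library. The stream name Custom-CustomLogDemo_CL and the column names are assumptions based on this lab; adjust them to match your own DCR.

```python
# Sketch: uploading sample rows via the azure-monitor-ingestion library.
from azure.core.exceptions import HttpResponseError
from azure.identity import ClientSecretCredential
from azure.monitor.ingestion import LogsIngestionClient

credential = ClientSecretCredential(
    tenant_id="<directory-tenant-id>",
    client_id="<application-client-id>",
    client_secret="<client-secret-value>",
)
client = LogsIngestionClient(endpoint="https://<dce-endpoint-uri>", credential=credential)

# Sample rows; the columns are assumptions and must match the DCR's input stream.
sample_logs = [
    {"Time": "2024-10-07T10:00:00Z", "Application": "DemoApp", "ResponseCode": 200},
    {"Time": "2024-10-07T10:00:05Z", "Application": "DemoApp", "ResponseCode": 500},
]

try:
    client.upload(
        rule_id="<dcr-immutable-id>",           # copied from the DCR earlier
        stream_name="Custom-CustomLogDemo_CL",  # the DCR's input stream
        logs=sample_logs,
    )
except HttpResponseError as e:
    print(f"Upload failed: {e}")
```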

Let’s see if it really works

If all configurations are correct, your logs should appear in your Log Analytics workspace as shown below.

[Screenshots: the ingested logs appearing in the Log Analytics workspace]
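If you prefer to verify programmatically instead of through the portal, here is a quick sketch using the azure-monitor-query library; the workspace ID and table name are assumptions:

```python
# Sketch: verifying ingestion with a KQL query from Python.
# Install with: pip install azure-identity azure-monitor-query
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query="CustomLogDemo_CL | take 10",  # assumed table name from this lab
    timespan=timedelta(hours=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```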

Data transformation

Now that we've successfully sent data to a DCR-based custom table using the Log Ingestion API, let's implement some transformations.

For a better understanding of transformation techniques, you can refer to my earlier blog on the topic.

Next, I'll walk through the transformation I implemented on my custom table.

In my use case, I will split the data between two tables: CustomLogDemo_CL and Split_CustomLogDemo_CL. The Split_CustomLogDemo_CL table receives only the rows with a ResponseCode of 200, and the rest of the data goes to the CustomLogDemo_CL table.

The detailed steps are as follows:

  • Go to DCR > select the DCR created for this exercise > Automation > Export template > Deploy > Edit template.
  • I created two data flows for this task, following the template shown in the screenshot below; a rough sketch of the edited section follows this list.
    • One data flow sends all rows except those with response code 200 to the CustomLogDemo_CL table.
    • The other data flow sends the rows with response code 200 to the Split_CustomLogDemo_CL table.
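Because the exported DCR is an ARM (JSON) template, the edited dataFlows section might look roughly like the following sketch. The destination name is a placeholder, and the transformKql filters simply mirror the split described above:

```json
"dataFlows": [
  {
    "streams": [ "Custom-CustomLogDemo_CL" ],
    "destinations": [ "<workspace-destination-name>" ],
    "transformKql": "source | where ResponseCode != 200",
    "outputStream": "Custom-CustomLogDemo_CL"
  },
  {
    "streams": [ "Custom-CustomLogDemo_CL" ],
    "destinations": [ "<workspace-destination-name>" ],
    "transformKql": "source | where ResponseCode == 200",
    "outputStream": "Custom-Split_CustomLogDemo_CL"
  }
]
```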

[Screenshot: the exported DCR template with the two data flows]

Let’s verify that it works as expected.

  • Go to Log Analytics workspace > Logs and view both tables as shown below.

[Screenshots: query results for the CustomLogDemo_CL and Split_CustomLogDemo_CL tables]

As you can see, we have split the incoming data as desired. Transformations allow you to optimize your tables as needed.

I would like to thank my colleague Kaustubh Dwivedi (@Kausd) for sharing his expertise and co-authoring this blog with me.

