Azure Log Analytics is a powerful monitoring and analytics tool within the Azure platform, offering the ability to collect, analyze, and act on telemetry data from cloud and on-premises environments. Custom logs in Azure Log Analytics provide a flexible approach to capturing custom data that is not automatically collected by Azure Monitor agents. This capability is valuable for security operations analysts who might need to log custom events, performance data, or telemetry from various applications and services for analysis and compliance purposes.
Custom logs allow an organization to define a new data type in Azure Log Analytics, to which data from text files can be uploaded. For instance, you may have logs generated by a custom application or any non-standard logs you’d like to analyze alongside your standard Azure telemetry. These logs can then be used in queries, alert rules, and other Log Analytics features.
To create custom logs in Azure Log Analytics, follow three key steps: define the data source, create a log definition (the schema for the new custom table), and begin sending data to the log.
Once the custom log is defined, its data must be collected by the Azure Monitor agent. Collection is configured through a data collection rule that specifies what data to collect, which machines or applications to collect it from, and any filters or transformations to apply.
Custom logs can be queried just like any other data type in Log Analytics. Here’s an example of how you might write a Kusto Query Language (KQL) query for your custom log data:
CustomLog_CL
| where TimeGenerated > ago(1d)
| where LogLevel == "Error"
| summarize count() by bin(TimeGenerated, 1h), Component
| render timechart
The query above counts the error-level records in the CustomLog_CL custom log over the past day, grouped by hour and component, and displays the results as a timechart.
As with any data collection, ensure compliance with your data governance and security policies. In particular, any personally identifiable information (PII) stored in Log Analytics must be handled in accordance with data privacy laws and Azure policies.
Custom logs can be particularly beneficial in several scenarios, such as ingesting logs from custom applications, capturing data that the built-in logs do not collect, and analyzing data unique to your environment alongside standard Azure telemetry.
By leveraging custom logs in Azure Log Analytics, security operations analysts have a robust tool to extend their monitoring and analytics capabilities, ensuring they can effectively track and respond to potential security incidents in their organization’s IT environment.
Answer: B) False
Explanation: Azure Log Analytics can store logs from Azure services as well as custom logs that include data from other sources.
Answer: D) All of the above
Explanation: Custom logs can be created using the Azure Portal, PowerShell, the REST API, or Azure CLI.
Answer: D) A and B
Explanation: Azure Log Analytics supports creating custom logs using data in JSON and CSV formats.
Answer: B) False
Explanation: After a custom log is uploaded to Azure Log Analytics, it may take up to one hour for the data to be available for querying.
Answer: D) Azure Log Analytics Agent
Explanation: The Azure Log Analytics Agent collects telemetry from a variety of sources and forwards it to Azure Log Analytics.
Answer: B) False
Explanation: Storing PII data in Azure Log Analytics must be done in compliance with data privacy laws and Azure policies. Users should take care to handle such data appropriately.
Answer: C) Custom Log Name
Explanation: Before ingesting custom data, a Custom Log Name, which will act as the target table in Azure Log Analytics, must be defined.
Answer: A) API Key
Explanation: When using the HTTP Data Collector API to send logs to Azure Log Analytics, an API key (the workspace's shared key), along with the Workspace ID, is required for authentication in the request header.
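To make the authentication flow concrete, here is a minimal Python sketch of the documented signing scheme for the HTTP Data Collector API. The workspace ID, shared key, and log type shown are placeholders; a real call requires your workspace's actual ID and primary key.

```python
import base64
import hashlib
import hmac
import json
import urllib.request
from email.utils import formatdate


def build_signature(workspace_id: str, shared_key: str,
                    date_rfc1123: str, content_length: int) -> str:
    """Build the SharedKey Authorization header value for the Data Collector API."""
    # The string-to-sign format is fixed by the API:
    # METHOD \n content-length \n content-type \n x-ms-date header \n resource
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)  # the shared key is base64-encoded
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"


def post_custom_log(workspace_id: str, shared_key: str,
                    log_type: str, records: list) -> int:
    """Send a batch of records; the table appears in the workspace as <log_type>_CL."""
    body = json.dumps(records).encode("utf-8")
    date = formatdate(usegmt=True)  # RFC 1123 timestamp, required by the API
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(workspace_id, shared_key, date, len(body)),
        "Log-Type": log_type,
        "x-ms-date": date,
    }
    url = (f"https://{workspace_id}.ods.opinsights.azure.com"
           f"/api/logs?api-version=2016-04-01")
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 indicates the payload was accepted
```

A call such as `post_custom_log(ws_id, key, "CustomLog", [{"LogLevel": "Error", "Component": "Web"}])` would then create or append to the `CustomLog_CL` table.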
Answer: D) All of the above
Explanation: Azure Automation Accounts, Azure Logic Apps, and Azure Stream Analytics can all be used to automate the process of collecting and submitting custom logs to Log Analytics.
Answer: A) True
Explanation: After custom data has been ingested into Azure Log Analytics, the schema (custom log name, fields, etc.) is fixed and cannot be modified.
Azure Log Analytics is a service that allows you to collect and analyze log data from a variety of sources in Azure and other platforms.
A custom log is a log that you can create in Log Analytics to store data that is specific to your environment or application.
You might want to create a custom log to store data that is not already collected by one of the built-in logs in Log Analytics, or to store data that is unique to your environment or application.
To create a custom log, you need to define a data source, create a log definition, and then start sending data to the log. The exact steps will depend on the data source and the type of log you are creating.
A data source is a source of log data that you want to collect and analyze in Log Analytics. Data sources can include virtual machines, containers, applications, and more.
You can define a data source in Azure Log Analytics by creating a data collection rule that specifies the type of data you want to collect, the machines or applications you want to collect it from, and any filters or transformations you want to apply.
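As an abridged sketch of what such a data collection rule looks like, the JSON fragment below follows the shape of the DCR ARM schema for collecting a custom text log; the file path, stream name, and workspace resource ID are placeholders.

```json
{
  "properties": {
    "streamDeclarations": {
      "Custom-CustomLog_CL": {
        "columns": [
          { "name": "TimeGenerated", "type": "datetime" },
          { "name": "RawData", "type": "string" }
        ]
      }
    },
    "dataSources": {
      "logFiles": [
        {
          "name": "customLogFile",
          "streams": [ "Custom-CustomLog_CL" ],
          "filePatterns": [ "C:\\Logs\\myapp*.log" ],
          "format": "text"
        }
      ]
    },
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "<workspace-resource-id>",
          "name": "myWorkspace"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Custom-CustomLog_CL" ],
        "destinations": [ "myWorkspace" ]
      }
    ]
  }
}
```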
A log definition is a schema that defines the structure of a custom log in Azure Log Analytics. The log definition specifies the name and data type of each field in the log.
You can create a log definition in Azure Log Analytics by defining a JSON schema that describes the structure of the log. You can then use this schema to create the log in Log Analytics.
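A hedged Python sketch of this idea: define the schema as a field-to-type map, validate records against it, and serialize them as the JSON array the ingestion APIs accept. The field names mirror the CustomLog_CL query example; `DurationMs` is a hypothetical extra field added for illustration.

```python
import json
from datetime import datetime, timezone

# Hypothetical log definition: field name -> expected Python type.
LOG_SCHEMA = {
    "TimeGenerated": str,   # ISO 8601 timestamp
    "LogLevel": str,
    "Component": str,
    "DurationMs": float,    # illustrative numeric field
}


def validate_record(record: dict) -> dict:
    """Check a record against the schema before serializing it for ingestion."""
    for field, expected in LOG_SCHEMA.items():
        if field not in record:
            raise ValueError(f"missing field: {field}")
        if not isinstance(record[field], expected):
            raise TypeError(f"{field} should be {expected.__name__}")
    return record


record = validate_record({
    "TimeGenerated": datetime.now(timezone.utc).isoformat(),
    "LogLevel": "Error",
    "Component": "PaymentService",
    "DurationMs": 152.7,
})
payload = json.dumps([record])  # ingestion endpoints accept a JSON array of records
```

Validating before sending catches schema drift at the source rather than discovering malformed fields later at query time.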
Some potential benefits of creating a custom log in Azure Log Analytics include having more control over the data you collect and store, being able to store data that is unique to your environment or application, and being able to query and analyze this data using Log Analytics.
You can query and analyze data in a custom log using Log Analytics queries, which allow you to filter, group, and aggregate data in a variety of ways. You can also build dashboards and reports based on the data in your custom log.
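As a concrete illustration of this filtering, grouping, and aggregation, a query over the hypothetical CustomLog_CL table (reusing the column names from the earlier example) could look like this:

```kusto
CustomLog_CL
| where TimeGenerated > ago(7d)
| where Component == "AuthService"
| summarize ErrorCount = countif(LogLevel == "Error"), Total = count()
    by bin(TimeGenerated, 1h)
| render timechart
```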