Hot questions for Using Azure in logging
I recently discovered that there is a log4j extension for Application Insights. So, following the example online, I attempted to configure Application Insights and log4j to log items from my servlets living in an Azure-hosted Tomcat.
Well, the example seems very incomplete, as it never mentions the instrumentation key at all. Looking through the source, I see an example (test?) that uses
<param> within log4j.xml, but not much explanation of how to use or debug the actual logger.
Does anyone out there have any pointers on how to actually use/implement the ApplicationInsightsAppender for log4j?
Here's the source on GitHub: https://github.com/Microsoft/ApplicationInsights-Java
You don't have to configure the instrumentation key for the appender; it is picked up automatically if you have properly configured the AI SDK.
As mentioned in the first section of the log4j extension for Application Insights article, it is assumed that you have already configured Application Insights for Java and, more specifically, that you have set the instrumentation key in the AI configuration file.
Once you've done this, the instrumentation key will be taken from the configuration file.
The appenders do in fact have an API for setting the instrumentation key directly, though it is not documented.
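For reference, here is a minimal log4j.xml sketch of how the appender is typically wired up, based on the test configuration in the ApplicationInsights-Java repository; the appender class name and the optional instrumentationKey param should be verified against the SDK version you are actually using:

```xml
<!-- Sketch only: verify the appender class name against your SDK version. -->
<appender name="aiAppender"
          class="com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender">
  <!-- Optional: omit this param if the key is already set in ApplicationInsights.xml -->
  <param name="instrumentationKey" value="00000000-0000-0000-0000-000000000000"/>
</appender>

<root>
  <priority value="info"/>
  <appender-ref ref="aiAppender"/>
</root>
```

If the key is configured in ApplicationInsights.xml as the answer describes, the param can be left out entirely and the appender will pick the key up from there.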
I have created a Java MVC web app and deployed it on the Azure cloud. Now I am trying to capture my web application's logs into a text/CSV file and store that file in Azure Blob Storage. Can anyone tell me how to do this, and how to access Azure Blob Storage? I went through this article, but it was not of much help.
Please anyone help.
Note: in an on-premises application we can do the same using a properties file and the log4j jar.
I want to do the same in an Azure web app.
Based on my understanding, I think the simple way to satisfy your needs is to use a log4j appender that stores the logs in Azure Blob Storage. You only need to change the
log4j.properties file to enable the appender for Azure.
There are two unofficial projects on GitHub that implement a log4j appender for Azure Table Storage (not for Blob Storage).
They are as below.
- saksham/log4j-azure: https://github.com/saksham/log4j-azure
- JMayrbaeurl/azure-log4j: https://github.com/JMayrbaeurl/azure-log4j
You can refer to this code to implement your own appender for Blob Storage, for example using Append Blobs.
That said, I think using Azure Table Storage is the simplest way to handle logging, exactly as the authors of those projects concluded.
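As a rough sketch, enabling such an appender usually amounts to a few lines in log4j.properties. The appender class and property names below are illustrative placeholders only, and must be replaced with those of whichever appender project you actually choose:

```properties
# Hypothetical sketch - the appender class and its property names
# depend on the appender project you pick.
log4j.rootLogger=INFO, AZURE
log4j.appender.AZURE=com.example.log4j.AzureTableStorageAppender
log4j.appender.AZURE.connectionString=DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey
log4j.appender.AZURE.tableName=ApplicationLogs
log4j.appender.AZURE.layout=org.apache.log4j.PatternLayout
log4j.appender.AZURE.layout.ConversionPattern=%d{ISO8601} %-5p [%t] %c - %m%n
```

The point is that application code keeps calling the ordinary log4j API; only the configuration changes.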
We are trying to develop a spark java application in Azure HDInsight linux cluster. We have been able to submit the application through Livy and it is working fine.
The problem we are facing is related to logging. How can we use log4j here? If we use a RollingFileAppender, we have to supply the path of the output .log file, and in our case that means writing the log to blob storage. We suspect normal file logging is not going to work.
We have found some logging mechanisms provided by Azure itself through Application Insights, but I guess they are all made for web projects, and they are not working from our Java application.
Could you please help us implement application logging from a Spark Java application? In the future, people in the organization may also want to use tools like Splunk to work with the logs.
Looking forward to your help.
Based on my understanding, I think you want to write logs to blob storage, or to HDFS backed by Blob Storage.
So, for logging into a blob storage container, my suggestion is to try a third-party appender for
log4j, such as AzureLogAppender. Alternatively, for logging into table storage, you can use appenders for
log4j such as log4j-azure and azure-log4j.
If you want to write logs into HDFS on Azure, there are two ways as below.
I'm developing an application using Azure's Java SDK and Maven. This application sends data to an IoT Hub, along with some other functionality that is not important for the scope of this question.
I implemented my own logging inside the application using
log4j2, and I'm fine with that, since I can modify and change it however I want.
The problem arose when I checked this warning that was coming up in my application's console output:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Thanks to this SO question, I was able to make the correct move and add the dependency to my
pom.xml file like so:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.project.myProject</groupId>
  <artifactId>myProject</artifactId>
  <packaging>jar</packaging>
  <version>1.0.0</version>
  ...
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-jdk14</artifactId>
    <version>1.7.25</version>
  </dependency>
  ...
After this addition, Azure's SDK started printing a lot of information to the console that I don't really want to see. This should be the class that originates the logging. Below is some of the output that gets written to the console by itself.
...
Jun 07, 2018 8:09:18 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: IotHubConnectionString object is created successfully for iotHub.azure-devices.net, method name is <init>
Jun 07, 2018 8:09:19 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: DeviceClientConfig object is created successfully with IotHubName=iotHub.azure-devices.net, deviceID=device01, method name is <init>
Jun 07, 2018 8:09:20 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: DeviceIO object is created successfully, method name is <init>
Jun 07, 2018 8:09:20 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: Setting SASTokenExpiryTime as 2400 seconds, method name is setOption_SetSASTokenExpiryTime
...
I've already tried to disable the
Logger, but with no success (I followed this SO question).
I would like to know if someone has ever had this problem and, if so, how I can disable the logging features or else suppress the warning. Thanks a lot in advance!
There is a blog,
How to Configure SLF4J with Different Logger Implementations, which you can refer to in order to configure your
slf4j-jdk14 logger implementation, as below.
Using slf4j with JDK logger
The JDK actually comes with a logger package, and you can use this logger implementation in your pom.xml:

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-jdk14</artifactId>
    <version>1.7.5</version>
</dependency>
Now the configuration for JDK logging is a bit difficult to work with. Not only do you need a config file, such as src/main/resources/logging.properties, but you also need to add a system property -Djava.util.logging.config.file=logging.properties in order to have it picked up. Here is an example to get you started:

level=INFO
handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=FINEST
deng.level=FINEST
There are two ways to avoid outputting these
INFO logs to the console.
- Raise the logging level, for example to
SEVERE, so that lower-level logs are not output; you can refer to the Oracle Javadoc for the class
Level, as below.
The levels in descending order are:
SEVERE (highest value)
WARNING
INFO
CONFIG
FINE
FINER
FINEST (lowest value)
- Change the
ConsoleHandler; there are four other handlers which you can use instead, as below.
- ConsoleHandler: This Handler publishes log records to System.err.
- FileHandler: Simple file logging Handler.
- MemoryHandler: Handler that buffers requests in a circular buffer in memory.
- SocketHandler: Simple network logging Handler.
- StreamHandler: Stream based logging Handler.
For example, to output logs to a file:

handlers=java.util.logging.FileHandler
java.util.logging.FileHandler.level=INFO
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.FileHandler.limit=1024000
java.util.logging.FileHandler.count=10
java.util.logging.FileHandler.pattern=logs/mylog.log
java.util.logging.FileHandler.append=true
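If you prefer not to touch a config file at all, the same INFO suppression can also be done programmatically with java.util.logging. The logger name below is taken from the SDK package shown in the question's log output; this is a sketch of the general technique, not the SDK's official way of disabling its logger:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class QuietSdkLogging {
    // Hold a static reference so the level setting is not lost if the
    // logger is garbage-collected.
    private static final Logger SDK_LOGGER =
            Logger.getLogger("com.microsoft.azure.sdk.iot.device");

    public static void main(String[] args) {
        // Only WARNING and SEVERE records from this package pass through now.
        SDK_LOGGER.setLevel(Level.WARNING);

        System.out.println(SDK_LOGGER.isLoggable(Level.INFO));    // false
        System.out.println(SDK_LOGGER.isLoggable(Level.WARNING)); // true
    }
}
```

Because java.util.logging loggers form a hierarchy by name, setting the level on the package-level logger silences every logger created underneath it, including the SDK's CustomLogger output.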