In my previous blog, we installed the ELK stack on Windows 10, pushed messages from the console input to Elasticsearch, and finally viewed them in Kibana.
I will write a separate blog on why we need ELK.
In this blog, I'll show you how to push Spring Boot application logs directly to Elasticsearch using Logstash, so that we can analyze them in Kibana. If you don't know how to install ELK on Windows 10, you can refer to my previous blog and then start the Elasticsearch and Kibana servers.
Prerequisites
- Elasticsearch and Kibana running on your machine
- Basic knowledge of Spring Boot applications
If you don't want to start your application from scratch, you can download a Spring Boot application from my GitHub repository as well.
I am assuming that the Elasticsearch and Kibana servers are running on your machine and that you have a fair idea of how to start the Logstash server and what a Logstash conf file is.
So, to push Spring Boot logs continuously to Elasticsearch, we have to open a TCP port on the Logstash server. For that, we create a Logstash config file (say elklogstash.conf) under the ${LOGSTASH_HOME}/config directory, specifying in the input section which TCP port Logstash should listen on, and in the output section where to push the data once it is received.
For simplicity, I am skipping the filter section, as it is optional.
elklogstash.conf
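Since the file contents are not shown inline here, below is a minimal sketch of what elklogstash.conf can look like. The port (4560) and index name (elkbootlogs) match the ones used in this tutorial; the Elasticsearch host and the json_lines codec are assumptions you should adjust to your setup:

```
input {
  tcp {
    # Port the Spring Boot appender will connect to
    port => 4560
    # LogstashEncoder sends one JSON document per line
    codec => json_lines
  }
}
output {
  elasticsearch {
    # Assumed default local Elasticsearch endpoint
    hosts => ["http://localhost:9200"]
    index => "elkbootlogs"
  }
}
```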
Now start the Logstash server by passing the newly created conf file:
bin\logstash -f .\config\elklogstash.conf
Cool! Now the Logstash server is also up and running, and if you observe the log, you will see that it is listening on port 4560, as mentioned in the conf file. Configure the newly created index (elkbootlogs) in Kibana as we did during the ELK setup.
Now let's make some changes to the Spring Boot application so that it pushes all its logs to TCP port 4560.
For this tutorial, I am using the spring-logger project from my GitHub repository.
Add the below dependency to the pom.xml file. We need the Logstash encoder to encode messages.
<!-- Added for Logstash encoder -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.2</version>
</dependency>
Open the logback-spring.xml file under the resources folder and create a new appender (say elk). The task of this appender is to push logs to the destination TCP socket, and this appender must use LogstashEncoder.
<appender name="elk" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>localhost:4560</destination>
    <!-- encoder is required -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
</appender>
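Under the hood, LogstashTcpSocketAppender simply writes one JSON event per line over a TCP connection to the configured destination. If you are curious about the mechanism, here is a minimal plain-Java sketch of it; the class name, the JSON field set, and the local ServerSocket standing in for Logstash are all illustrative, not the real appender:

```java
import java.io.*;
import java.net.*;

public class TcpLogSketch {
    // Build a single JSON log event, one per line -- roughly the shape
    // LogstashEncoder emits (field names here are illustrative, not exhaustive).
    static String toJsonLine(String level, String message) {
        return "{\"@timestamp\":\"" + java.time.Instant.now() + "\","
             + "\"level\":\"" + level + "\","
             + "\"message\":\"" + message + "\"}";
    }

    public static void main(String[] args) throws Exception {
        // A local ServerSocket stands in for Logstash's tcp input.
        try (ServerSocket logstash = new ServerSocket(0)) {
            int port = logstash.getLocalPort();
            // The appender side: open a socket and write one event per line.
            try (Socket appender = new Socket("localhost", port);
                 PrintWriter out = new PrintWriter(appender.getOutputStream(), true)) {
                out.println(toJsonLine("INFO", "application started"));
                // The "Logstash" side reads the line it was sent.
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(logstash.accept().getInputStream()))) {
                    System.out.println("received: " + in.readLine());
                }
            }
        }
    }
}
```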
Add the new appender at the root level:
<!-- LOGGING everything at INFO level -->
<root level="info">
    <appender-ref ref="RollingFile" />
    <appender-ref ref="Console" />
    <appender-ref ref="elk" />
</root>
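Putting the two snippets together, the overall shape of logback-spring.xml after the change looks roughly like this (the Console and RollingFile appenders are the project's existing definitions, elided here):

```
<configuration>
    <!-- ... existing Console and RollingFile appender definitions ... -->

    <appender name="elk" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:4560</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>

    <root level="info">
        <appender-ref ref="RollingFile" />
        <appender-ref ref="Console" />
        <appender-ref ref="elk" />
    </root>
</configuration>
```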
Save all the files and start your application. We are now done with the setup, so it's time to check whether all the changes work.
Open Kibana in your browser (http://localhost:5601) and select your index under the Discover tab. You will see the logs populating in Kibana as well.
Congratulations! Our configuration is working absolutely fine and is pushing logs to Elasticsearch.
You can download the source code from here; the ELK code changes are under the elkstack branch.