@jefsterjr · Dec 09, 2021 · 2 min read · 3902 views · Originally posted on medium.com
Example of implementing an API using Spring Boot, Elasticsearch, Logstash, Kibana, and Docker.
In this article, we will cover an example of an application that uses Spring Boot + ELK + Docker. It is a simple application, intended only to demonstrate the concepts. I won't go into a detailed explanation of all the elements (if you're not familiar with this stack, I suggest doing a little research before following this tutorial).
Prerequisites
Hands-On
Spring Boot Application
I'll start with the application; I created a REST API with 4 endpoints. This is how the application's Dockerfile looks:
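The original Dockerfile isn't reproduced here; a minimal sketch for a pre-built Spring Boot jar might look like this (the base image and the jar name under target/ are assumptions):

```dockerfile
# Minimal image for a jar already built with `mvn package`
FROM openjdk:17-jdk-slim
WORKDIR /app
# Jar name is an assumption; adjust to your build output
COPY target/library-0.0.1-SNAPSHOT.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

This variant assumes you build the jar on your machine before running docker build.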
Alternatively, you can use a Dockerfile like this:
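A common alternative is a multi-stage build, sketched below, so the image builds the jar itself and no local JDK or Maven is needed (image tags are assumptions):

```dockerfile
# Stage 1: build the jar inside the image
FROM maven:3.8-openjdk-17 AS build
WORKDIR /build
COPY pom.xml .
COPY src ./src
RUN mvn -q package -DskipTests

# Stage 2: slim runtime image with only the jar
FROM openjdk:17-jdk-slim
WORKDIR /app
COPY --from=build /build/target/*.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

The trade-off is a slower first build in exchange for a fully reproducible one.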
As we are going to use several containers, we will use Docker Compose. At the root of your project, create the docker-compose.yml file. For now, it will look like this:
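A first version containing only the application service might be sketched like this (service and container names are assumptions):

```yaml
version: "3.7"
services:
  app:
    build: .                  # uses the Dockerfile at the project root
    container_name: library-app
    ports:
      - "8080:8080"           # expose the Spring Boot port
```

The Elasticsearch, Logstash, and Kibana services will be added to this same file in the sections below.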
To test, run from the root of the project: docker-compose up
Test by calling the endpoints via Postman, browser, or others of your choice.
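For reference, a controller exposing four endpoints like those being tested might look like this sketch (entity, service, and path names are illustrative assumptions, not the repository's actual code; it requires spring-boot-starter-web):

```java
// Illustrative controller with four endpoints (all names are assumptions)
@RestController
@RequestMapping("/books")
public class BookController {

    private final BookService service; // hypothetical service layer

    public BookController(BookService service) {
        this.service = service;
    }

    @GetMapping
    public List<Book> findAll() { return service.findAll(); }

    @GetMapping("/{id}")
    public Book findById(@PathVariable Long id) { return service.findById(id); }

    @PostMapping
    public Book create(@RequestBody Book book) { return service.save(book); }

    @DeleteMapping("/{id}")
    public void delete(@PathVariable Long id) { service.delete(id); }
}
```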
Elasticsearch
I won't go into details about Elasticsearch (I suggest you look into it if you don't know about it).
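The compose fragment isn't shown here; a single-node Elasticsearch service on the elk network might be declared like this sketch (the image version and memory limits are assumptions):

```yaml
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.15.2  # version is an assumption
    container_name: elasticsearch
    environment:
      - discovery.type=single-node       # no cluster discovery for local use
      - ES_JAVA_OPTS=-Xms512m -Xmx512m   # cap JVM heap memory
    ports:
      - "9200:9200"
    networks:
      - elk

networks:
  elk:
    driver: bridge
```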
Here we are setting Elasticsearch options such as the port, memory variables, the base directory in Docker, etc. We also created a network called elk and added our service to it. To test, run docker-compose up, open http://localhost:9200/, and you will see a result similar to this:
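The exact values vary per installation, but the response has roughly this shape (the fields shown are illustrative):

```json
{
  "name" : "elasticsearch",
  "cluster_name" : "docker-cluster",
  "version" : {
    "number" : "7.15.2"
  },
  "tagline" : "You Know, for Search"
}
```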
This shows us that Elasticsearch is working correctly.
Logstash
For Logstash, the process is a little different: first, let's create a folder called .logstash to store some settings. Inside it, we will create the logstash.conf file, which will have the following information:
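The file isn't reproduced here; a sketch consistent with the article (TCP input, Elasticsearch output, default credentials, and an index matching the library-logstash-* pattern used later in Kibana) might look like this — the port number is an assumption:

```conf
input {
  tcp {
    port  => 5000          # must match the destination in logback-spring.xml
    codec => json_lines    # one JSON log event per line
  }
}

output {
  elasticsearch {
    hosts    => ["elasticsearch:9200"]   # service name from docker-compose.yml
    user     => "elastic"                # default credentials
    password => "changeme"
    index    => "library-logstash-%{+YYYY.MM.dd}"
  }
}
```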
Here we define the operating mode, which can be TCP or file-based. In TCP mode, Logstash receives real-time data from the port specified in the logback-spring.xml file, inside the project's resources package.
logback-spring.xml file:
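The file itself isn't shown; a minimal sketch using the logstash-logback-encoder library (which must be added as a dependency; host and port are assumptions that must match logstash.conf) could be:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- Requires the logstash-logback-encoder dependency -->
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <!-- host:port must match the tcp input in logstash.conf -->
        <destination>localhost:5000</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
    </root>
</configuration>
```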
It also contains the output which, in our case, points to Elasticsearch, as described below. In the output section, we define the destination, the elastic user and password (here left at their default values), and the index. The index will serve to filter information from this application only in Kibana. In docker-compose.yml, the Logstash container will look like this:
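A sketch of the Logstash service, mounting the logstash.conf created above into the pipeline directory (image version and port are assumptions):

```yaml
  logstash:
    image: docker.elastic.co/logstash/logstash:7.15.2  # version is an assumption
    container_name: logstash
    volumes:
      - ./.logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - "5000:5000"          # TCP input receiving the application logs
    networks:
      - elk
    depends_on:
      - elasticsearch
```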
Kibana
Kibana has its simple configuration, just add to docker-compose.yml:
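A sketch of that service (image version is an assumption; it only needs to know where Elasticsearch lives):

```yaml
  kibana:
    image: docker.elastic.co/kibana/kibana:7.15.2  # version is an assumption
    container_name: kibana
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch
```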
After that, it is now possible to access Kibana in the browser at http://localhost:5601/. When using Kibana, you will need to add the index we created earlier in Logstash in order to see the information. To do this, open http://localhost:5601/app/management/kibana/indexPatterns, or navigate: Right menu / Stack Management / Index Patterns / Create index pattern. For the example application, the index pattern was: library-logstash-*.
Code available at: https://github.com/jefsterjr/library/tree/main