Monday, December 24, 2018

Jenkins & SonarQube

In this post we will use Jenkins to run code analysis on SonarQube. Below are the steps needed:
  • Download Jenkins from its official website. I downloaded the Jenkins WAR file from the available options and deployed it to an Apache Tomcat web server.


  • Run Tomcat and follow the installation instructions. You will be prompted to install commonly used Jenkins plugins; choose to install them. You will also need to set up an administrator login.
  • Next we need to install some more plugins. To do this, click Manage Jenkins and then Manage Plugins.

  • The Plugin Manager screen will be displayed. Make sure the following plugins have been installed: Git, SonarQube, and Email.



  • Next we will configure the installed plugins. To do that, click Manage Jenkins and then Configure System.

  • The Configuration screen will be displayed. First we need to configure SonarQube as shown below. Enter a value for Name; this can be any name to identify this SonarQube configuration. Enter our SonarQube URL in the Server URL field. Lastly, enter the Server Authentication Token. This token is available in our SonarQube user profile as shown below, and we can generate a new one if it doesn't exist.


  • Next we will configure email as shown below. Here we configure the outgoing SMTP server for sending email.

  • The next step is to configure tools. Here we will configure the JDK and SonarQube Scanner. To do that, click the Manage Jenkins menu and then Global Tool Configuration as shown below.

  • First we will configure the JDK as shown below. Here we tell Jenkins where to find our JDK home.

  • Next we need to configure SonarQube Scanner as shown below. The SonarQube Scanner is needed to perform the code analysis. SonarQube offers many types of scanners, for example the scanner for Maven, which we tried in a previous post. Here we use the generic scanner that works for various types of projects, not only Maven.

  • Now that we are all set, we will create a Jenkins project that performs the following tasks: pull source code from Git, compile and run the code, ask SonarQube to analyse the code, and send an email if anything fails. To do that, click the New Item menu and choose Freestyle Project as shown below.

  • The project configuration screen will open. Here we need to do, again, several configurations. First we need to specify our Git project information as shown below. Note that the project contains only one Java class that prints Hello Jenkins on the console. Jenkins will pull our source code from Git before building it.
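As an illustration, such a class could be as simple as the sketch below; the class name is an assumption, the post only says the class prints Hello Jenkins.

```java
// A minimal class of the kind described above; Jenkins compiles it with javac
// and runs it with java during the build. The class name HelloJenkins is assumed.
public class HelloJenkins {

    // the greeting is built in a method so it can be checked separately from main()
    static String message() {
        return "Hello Jenkins";
    }

    public static void main(String[] args) {
        System.out.println(message());
    }
}
```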


  • Next we will tell Jenkins to perform two tasks: execute some Java commands, and ask SonarQube to analyse the code. First, we will execute the javac command to compile a class and the java command to run it, as shown below.
  • The second task is to execute Sonar Scanner as shown below. Here we set some properties: sonar.projectKey identifies this project on SonarQube, sonar.sources lists the folders where our source code is located, and sonar.projectBaseDir is the base folder where SonarQube should look for the code to analyse.
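For this project the scanner properties could look like the fragment below; all three values are placeholders for illustration, not taken from the post.

```properties
# Example analysis properties for SonarQube Scanner; values are placeholders.
# sonar.projectKey identifies the project on SonarQube.
sonar.projectKey=hello-jenkins
# sonar.sources lists the folder(s) containing the source code.
sonar.sources=src
# sonar.projectBaseDir is the base folder SonarQube scans.
sonar.projectBaseDir=.
```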
  • The last step is to send an email when the execution fails, as shown below.
  • Now we can build the project. Click Build Now and the build will appear in the Build History. Clicking on a single build entry brings us to the build menu. Here we can click Console Output to see the execution console.


  • Below is the console output. Here we can see that Jenkins runs our javac and java commands. It also asks SonarQube to run an analysis. An email will only be sent if there is a build error.

  • In SonarQube we can also see that the project has been analysed as shown below.

Sunday, December 9, 2018

Spring Boot and Kafka

Let's now play around with Spring Boot and Kafka. We use Kafka as the messaging system. Here are the differences between ActiveMQ and Kafka.

ActiveMQ

In ActiveMQ we have topics and queues. A topic works like publish-subscribe, while a queue works like producer-consumer. In a topic, a message is published to many subscribers. After all subscribers, including durable subscribers, receive the message, the message is removed. In a queue, a message can only be consumed by one consumer. It works point to point. Once a message is consumed, it is removed; other consumers won't receive it.

Kafka

Kafka only supports topics, but a Kafka topic somewhat looks like a combination of an ActiveMQ topic and queue. Messages in Kafka are kept for a configured retention period, even after they are consumed. A topic in Kafka consists of many partitions. A message sent to a topic is saved in one of the partitions, and a consumer then reads from the partitions. The difference is Kafka's concept of a consumer group: if several consumers share the same group id, we can see them as a single consumer that has multiple instances. This is better explained with the images below.


This is a topic that has one partition. There is a consumer group named Group 1 that has one consumer.


This is a topic with one partition and one consumer group. The group contains two consumers. Messages in Partition 1 will be sent to only one of the two consumers. So will the other consumer be idle and do nothing? Yes. Kafka assigns each partition to at most one consumer in a group, so any consumers beyond the number of partitions simply stay idle.


This topic has two partitions and one consumer group containing two consumers. A possible assignment is that messages in Partition 1 are sent to Consumer 1 and messages in Partition 2 are sent to Consumer 2.


This topic has two partitions and two consumer groups, each containing two consumers. A possible assignment is that messages in Partition 1 are sent to Consumer 1 of both Group 1 and Group 2, and messages in Partition 2 are sent to Consumer 2 of both groups.


This topic now has three partitions. A possible assignment is that messages in Partition 1 and Partition 3 are sent to Consumer 1 of each group, and messages in Partition 2 are sent to Consumer 2 of each group.

Having this knowledge, we can conclude that in Kafka:
  • The number of consumers in a group should be less than or equal to the number of partitions in the topic; otherwise, the extra consumers will be idle.
  • A partition is consumed by exactly one consumer in a group.
  • The ordering of messages within a partition is guaranteed, since one partition is consumed by only one consumer in a group.
Start Coding

Now we will use Kafka in our Spring Boot project. First we need to modify our pom.xml as shown below.
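The key change is adding the spring-kafka dependency; a minimal sketch, assuming the version is managed by the Spring Boot parent POM:

```xml
<!-- spring-kafka brings in the Kafka client plus Spring's
     KafkaTemplate and @KafkaListener support -->
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
```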


Then we will create a configuration class. We do this because we want to send a Customer object to Kafka instead of a string. We annotate the configuration class with @Configuration to indicate that this is a Spring configuration class, and with @EnableKafka to enable Kafka messaging listeners.


Next we create private variables whose values are defined in application.properties. The group id is the consumer group id; we discussed above that in Kafka consumers can be grouped together and seen as a single consumer with many instances, each instance responsible for a partition. Then we have the bootstrap server, which is the Kafka server name and port. Lastly we have the trusted packages value. This is required so that the JSON deserializer on the consumer side knows which packages' classes are trusted to be deserialized from incoming messages.



Next we have to create the producer factory bean. Here we supply the bootstrap server and the key and value serializer classes.


Next we have to create the consumer factory bean. Here we supply the bootstrap server, the key and value deserializer classes, the consumer group id, and the trusted packages value.


The next step is to create a Kafka template bean as shown below. It takes the producer factory bean as a parameter. Also note that the key is a String and the value is a Customer.


The last configuration to set up is the Kafka listener container factory. It takes the consumer factory bean as a parameter.
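Putting the configuration steps above together, the class might look like the sketch below. It assumes the spring-kafka dependency is on the classpath, a Customer class exists in the project, and the property names (kafka.bootstrap-servers, kafka.group-id, kafka.trusted-packages) are our own choices.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
@EnableKafka  // enables detection of @KafkaListener methods
public class KafkaConfig {

    // values defined in application.properties
    @Value("${kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Value("${kafka.group-id}")
    private String groupId;

    @Value("${kafka.trusted-packages}")
    private String trustedPackages;

    // producer factory: bootstrap server plus key/value serializers
    @Bean
    public ProducerFactory<String, Customer> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    // consumer factory: bootstrap server, deserializers, group id, trusted packages
    @Bean
    public ConsumerFactory<String, Customer> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        props.put(JsonDeserializer.TRUSTED_PACKAGES, trustedPackages);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    // template used for sending; key is String, value is Customer
    @Bean
    public KafkaTemplate<String, Customer> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    // container factory used by @KafkaListener methods
    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Customer> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Customer> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
```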


The next things to see are the sender and receiver classes. The KafkaSender class is annotated with @Component to indicate that it is a Spring component. It holds the Kafka template bean and has a method named send() to send a message to the topic. Note that the message is a Customer object.


The KafkaReceiver class is responsible for receiving messages from the Kafka topic. It has a method annotated with @KafkaListener, where we define the topic name. Note that the message we receive is a Customer object.
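The two classes might look like the sketch below (shown in one snippet here; in the project each would live in its own file). The topic name mq.request matches the topic created with the Kafka commands later in this post; Customer is the project's domain class.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
class KafkaSender {

    @Autowired
    private KafkaTemplate<String, Customer> kafkaTemplate;

    // publish a Customer object as a message to the topic
    public void send(Customer customer) {
        kafkaTemplate.send("mq.request", customer);
    }
}

@Component
class KafkaReceiver {

    // invoked by the listener container whenever a message arrives on the topic
    @KafkaListener(topics = "mq.request")
    public void receive(Customer customer) {
        System.out.println("Received customer: " + customer);
    }
}
```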


To send a message to Kafka we inject a KafkaSender bean into CustomerController. Then, when a new Customer is added in CustomerController.addCustomer(), we send the new Customer data to Kafka as a message.
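A sketch of how the controller might wire this together; the request mapping path and the persistence step are assumptions, since only the Kafka call is described above.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CustomerController {

    @Autowired
    private KafkaSender kafkaSender;

    @PostMapping("/customers")  // assumed mapping
    public Customer addCustomer(@RequestBody Customer customer) {
        // ... persist the customer here ...
        kafkaSender.send(customer);  // publish the new customer to Kafka
        return customer;
    }
}
```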



To test it we need to run the Kafka server first. Apache provides a quickstart page for installing and setting up Kafka for the first time. The steps are:

  • Download and unzip the Kafka files.
  • Start the Zookeeper server using the command:
bin\windows\zookeeper-server-start.bat config\zookeeper.properties
 
  • Start the Kafka server using the command:
bin\windows\kafka-server-start.bat config\server.properties
 
  • Create a topic, providing the topic name and the number of partitions. In this example we create only one partition.
bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic mq.request
  • Check whether the topic has been created using the command:
bin\windows\kafka-topics.bat --list --zookeeper localhost:2181
 
  • Add more partitions using the command:
bin\windows\kafka-topics.bat --zookeeper localhost:2181 --alter --topic mq.request --partitions 3
 
  • Check whether the partitions have been added using the command:
bin\windows\kafka-topics.bat --describe --zookeeper localhost:2181 --topic mq.request

Now start our application and create a new customer. The following message will be printed on the console.


 
