Thursday, November 29, 2018

Spring Boot - Unit Test, Coverage Test, SonarQube

What is more interesting than writing unit tests and being sure that there are no bugs in our program? In this post we will create a unit test for our Spring Boot application. After that we will run a coverage test and use SonarQube to see what we missed.

First, modify our pom.xml by adding the Spring Boot starter test dependency.
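A minimal sketch of the dependency (the version is managed by the Spring Boot parent):

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>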


Create a package for testing and a test case class.


Next, modify the test case class as shown below. We annotate the class with @RunWith(MockitoJUnitRunner.class) so that JUnit runs our test case class, CustomerTestCase, with the MockitoJUnitRunner runner. We use MockitoJUnitRunner because we want to mock some objects that are normally managed by Spring. Mocking a class means creating a fake object of that class. An example is mocking a DAO class to unit test a service or controller. A DAO class is managed by Spring: it is created and destroyed by the Spring container. When unit testing a service without starting the Spring context, we cannot ask Spring to inject that DAO; instead, we create a mock object of that class.


Since we want to test our CustomerController, we declare a CustomerController field and annotate it with @InjectMocks. This means we want Mockito to create a real object of this type and inject the mocks into it. We also declare ICustomerDao and Customer fields and annotate them with @Mock, which means fake objects of ICustomerDao and Customer are created and injected into the CustomerController. Note that CustomerController has fields of type ICustomerDao and Customer.
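A sketch of the resulting class skeleton, assuming JUnit 4 with Mockito 2; the package and field names are assumptions, the class names come from this post:

    package com.example.test; // hypothetical package

    import org.junit.runner.RunWith;
    import org.mockito.InjectMocks;
    import org.mockito.Mock;
    import org.mockito.junit.MockitoJUnitRunner;

    @RunWith(MockitoJUnitRunner.class)
    public class CustomerTestCase {

        // the real object under test; Mockito injects the mocks below into it
        @InjectMocks
        private CustomerController customerController;

        // fake (mock) objects created by Mockito
        @Mock
        private ICustomerDao customerDao;

        @Mock
        private Customer customer;
    }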


We also have two methods annotated with @Before/@After. These methods run before/after each test method.
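Inside CustomerTestCase they could look like this (the imports are org.junit.Before and org.junit.After; the bodies below are only placeholders):

    @Before
    public void setUp() {
        // runs before each @Test method, e.g. prepare common test data
    }

    @After
    public void tearDown() {
        // runs after each @Test method, e.g. clean up
    }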


Lastly, we have two methods annotated with @Test. These are the real test case methods. Each must be a public void method. The testInsert() method calls the CustomerController.addCustomer() method. The testUpdate() method calls the same method, but before doing so it asks Mockito to stub customer.getId() so that it returns 1. This is very useful if we want to simulate a customer that already has an id.
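A sketch of the two test methods (the imports are org.junit.Test and org.mockito.Mockito; the Long return type of Customer.getId() is an assumption, so adjust the stubbed value to your own model):

    @Test
    public void testInsert() {
        // getId() returns null on the mock by default, so this exercises the new-customer path
        customerController.addCustomer();
    }

    @Test
    public void testUpdate() {
        // stub getId() so the controller sees a customer that already has an id
        Mockito.when(customer.getId()).thenReturn(1L);
        customerController.addCustomer();
    }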


However, when we run it, we get an error on this line of the CustomerController.addCustomer() method: FacesContext.getCurrentInstance().addMessage(null, facesMessage);. This happens because FacesContext.getCurrentInstance() returns null in a unit test. What we can do is surround that code with an if condition that checks for null, as shown below. This is, I think, the drawback of Mockito: somehow we have to modify our code to keep Mockito satisfied.
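In CustomerController.addCustomer() the guard could look like this:

    FacesContext context = FacesContext.getCurrentInstance();
    if (context != null) {
        // only add the message when running inside a real JSF request
        context.addMessage(null, facesMessage);
    }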


Running it produces the following result in Eclipse.


Next we will use the Eclipse code coverage tool. Eclipse Photon for Java EE Developers is equipped with a code coverage plugin. All we need to do is right-click our test case class and run it as a JUnit test with coverage, as shown below.


Once the test finishes, our test case class is highlighted green as shown below. It means its methods have been executed completely.


And our controller is highlighted like this: red means a line never gets executed, yellow means a line contains a branch (for example an if-else condition) that is only partly covered, and green means a line is fully executed.



The coverage percentage is shown below. It means that in total only 11.6% of our code is covered by the unit tests.


The next tool we will try is SonarQube. We can download it from its website; at the time of writing this post, the latest version available is 7.4. The website provides setup instructions to get SonarQube running, as shown below.
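In short, after extracting the downloaded zip we start the server and open the web console; the paths below are for SonarQube 7.4 and the default login is admin/admin:

    # Linux
    sonarqube-7.4/bin/linux-x86-64/sonar.sh console

    # Windows
    sonarqube-7.4\bin\windows-x86-64\StartSonar.bat

    # then browse to http://localhost:9000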


Next we will see how to analyze our project. Once we are logged in, click the Documentation sub menu as shown below.


Click the Analyzing Source Code entry in the left sub menu.


SonarQube provides several code scanners depending on the project. Since we use Maven as our build tool, we will choose the Scanner for Maven link.


Once we click the link, a new page is displayed. We need to follow the Initial Setup instructions on how to set up the settings.xml file in our Maven home folder. Here we add two sections, <pluginGroup> and <profile>.
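A sketch of settings.xml after following the Initial Setup instructions; the host url assumes a SonarQube instance running locally on the default port:

    <settings>
        <pluginGroups>
            <pluginGroup>org.sonarsource.scanner.maven</pluginGroup>
        </pluginGroups>
        <profiles>
            <profile>
                <id>sonar</id>
                <activation>
                    <activeByDefault>true</activeByDefault>
                </activation>
                <properties>
                    <sonar.host.url>http://localhost:9000</sonar.host.url>
                </properties>
            </profile>
        </profiles>
    </settings>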


Next we need to run the following Maven command in our project folder.
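With the settings above in place, the analysis is triggered with:

    mvn clean verify sonar:sonar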


Once the build succeeds, go to the SonarQube admin page and check that our project has been analyzed, as shown below.


Monday, November 12, 2018

Spring Boot and PrimeFaces

Let's now continue by adding PrimeFaces to our Spring Boot project. First we need to modify pom.xml to include the PrimeFaces dependencies and the themes repository, as shown below.
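A sketch of the additions to pom.xml; the version numbers are assumptions, so pick the ones you actually use:

    <dependency>
        <groupId>org.primefaces</groupId>
        <artifactId>primefaces</artifactId>
        <version>6.2</version>
    </dependency>
    <dependency>
        <groupId>org.primefaces.themes</groupId>
        <artifactId>all-themes</artifactId>
        <version>1.0.10</version>
    </dependency>

    <repositories>
        <repository>
            <id>prime-repo</id>
            <name>PrimeFaces Maven Repository</name>
            <url>http://repository.primefaces.org</url>
        </repository>
    </repositories>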


Next we modify our index.xhtml and add the following xmlns declarations.
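The namespace declarations on the root element of index.xhtml:

    <html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:h="http://java.sun.com/jsf/html"
          xmlns:f="http://java.sun.com/jsf/core"
          xmlns:p="http://primefaces.org/ui">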


Next we add a JSF form <h:form id="customer-form">. Inside the form we add a PrimeFaces panel <p:panel id="pnlCustomer" header="Add New Customer">, and inside the panel we add a panel grid component <h:panelGrid columns="2" cellpadding="4">, which divides its content into two columns. For each row in the panel grid, we add a label and an input component. The structure can be seen below.
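The skeleton, with the rows left out for now:

    <h:form id="customer-form">
        <p:panel id="pnlCustomer" header="Add New Customer">
            <h:panelGrid columns="2" cellpadding="4">
                <!-- one label and one input component per row -->
            </h:panelGrid>
        </p:panel>
    </h:form>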


Now we add a label and an input text to enter the customer's first name, as shown below. The value expression #{customerController.customer.firstName} means we read the value from, or save the value to, a bean named customerController that has a property named customer, which in turn has a property firstName. We will see the bean later.
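Inside the panel grid, the first row could look like this (whether the original used h:inputText or p:inputText is not certain; p:inputText is assumed here):

    <p:outputLabel for="firstName" value="First Name" />
    <p:inputText id="firstName" value="#{customerController.customer.firstName}" />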


Next we will see a calendar component, which we use to choose a date of birth. In addition to the value, we can define its maximum date, date pattern, etc., as shown below.
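A sketch of the calendar row; the dateOfBirth property on Customer is an assumption, while dateFormat and dateOfBirthMax are the controller properties mentioned later in this post:

    <p:outputLabel for="dateOfBirth" value="Date of Birth" />
    <p:calendar id="dateOfBirth"
                value="#{customerController.customer.dateOfBirth}"
                pattern="#{customerController.dateFormat}"
                maxdate="#{customerController.dateOfBirthMax}"
                navigator="true" />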


We also need a submit button to save the form. To do that we have a command button with an action listener. The listener is fired when the button is clicked. After the listener finishes running, a PrimeFaces component named "growl" is updated. This is where we display success/error messages to the user.
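A sketch of the button and the growl component, both placed inside the same form (the button label is arbitrary):

    <p:commandButton value="Save"
                     actionListener="#{customerController.addCustomer}"
                     update="growl" />

    <p:growl id="growl" showDetail="true" />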



The customer id can be kept in a hidden input as shown below.
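For example:

    <h:inputHidden id="id" value="#{customerController.customer.id}" />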


Before creating the customerController bean, we have to understand the concept of bean scopes in JSF. Below is a good explanation provided by someone on Stack Overflow.

As of JSF 2.x there are 4 Bean Scopes:
@SessionScoped
The session scope persists from the time that a session is established until session termination. A session terminates if the web application invokes the invalidate method on the HttpSession object, or if it times out.

@RequestScoped
The request scope is short-lived. It starts when an HTTP request is submitted and ends after the response is sent back to the client. If you place a managed bean into request scope, a new instance is created with each request. It is worth considering request scope if you are concerned about the cost of session scope storage.

@ApplicationScoped
The application scope persists for the entire duration of the web application. That scope is shared among all requests and all sessions. You place managed beans into the application scope if a single bean should be shared among all instances of a web application. The bean is constructed when it is first requested by any user of the application, and it stays alive until the web application is removed from the application server.

@ViewScoped
View scope was added in JSF 2.0. A bean in view scope persists while the same JSF page is redisplayed. (The JSF specification uses the term view for a JSF page.) As soon as the user navigates to a different page, the bean goes out of scope.

Now we will create a package and put all JSF bean classes there. The structure of our project is shown below.


Our customerController is shown below. We annotate it with @ViewScoped since we want the bean to exist as long as the xhtml page is open. @Named indicates that we want to create a bean named customerController.


We then add our ICustomerDao and annotate it with @Autowired. We also have a Customer object which is used in our form, plus dateOfBirthMax and dateFormat which are used by the calendar. We also create a method addCustomer(), which is referenced in the command button's action listener. The method is annotated with @Transactional so that any thrown Exception results in a database rollback. Here we check whether the customer's id is empty; if so, we save the customer. Otherwise it is not a new customer and an error message is shown. The success or error message is displayed in the growl PrimeFaces component.
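A sketch of the bean described in the last few paragraphs. It assumes javax.faces.view.ViewScoped (depending on how JSF is integrated with Spring, a different view scope implementation may be needed), a Long customer id, and a save() method on ICustomerDao; the package name and message texts are also assumptions:

    package com.example.controller; // hypothetical package

    import java.io.Serializable;
    import java.util.Date;

    import javax.faces.application.FacesMessage;
    import javax.faces.context.FacesContext;
    import javax.faces.view.ViewScoped;
    import javax.inject.Named;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.transaction.annotation.Transactional;

    @Named("customerController")
    @ViewScoped
    public class CustomerController implements Serializable {

        @Autowired
        private ICustomerDao customerDao;

        private Customer customer;    // bound to the form fields
        private Date dateOfBirthMax;  // used by the calendar component
        private String dateFormat;    // used by the calendar component

        @Transactional
        public void addCustomer() {
            FacesMessage facesMessage;
            if (customer.getId() == null) {
                // a new customer: save it and report success
                customerDao.save(customer);
                facesMessage = new FacesMessage(FacesMessage.SEVERITY_INFO,
                        "Success", "Customer saved");
            } else {
                // not a new customer: report an error
                facesMessage = new FacesMessage(FacesMessage.SEVERITY_ERROR,
                        "Error", "Customer already has an id");
            }
            // displayed by the growl component
            FacesContext.getCurrentInstance().addMessage(null, facesMessage);
        }

        // getters and setters omitted
    }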



Then we create a method and annotate it with @PostConstruct. This means the method will always be executed right after the customerController bean is created. Here we can do any initialization, as shown below.
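For example (the import is javax.annotation.PostConstruct; the initial values are assumptions):

    @PostConstruct
    public void init() {
        customer = new Customer();
        dateFormat = "dd/MM/yyyy";
        dateOfBirthMax = new Date(); // do not allow a date of birth in the future
    }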

In the Spring Boot application class, add the @ComponentScan annotation so that our controller's package is scanned by Spring.
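A sketch, assuming the application class is named Application and using a hypothetical base package name:

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.ComponentScan;

    @SpringBootApplication
    @ComponentScan(basePackages = { "com.example" }) // replace with your own base package
    public class Application {

        public static void main(String[] args) {
            SpringApplication.run(Application.class, args);
        }
    }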


Running this project gives you the following result.


There are many other components we can try, for example displaying data in a grid, changing themes, etc.

Sunday, November 11, 2018

Node.js and PostgreSQL

In this post we will create a REST service using node.js and a PostgreSQL database. The first thing to do is to install the PostgreSQL module (pg) using npm (Node Package Manager) if we don't have it yet, as shown below.
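For example (express and body-parser are also needed if they were not already installed for the earlier article):

    npm install pg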


Then we have to create an Eclipse project as shown in this article. Create two new folders, controllers and routes. Inside the controllers folder create a js file CustomerController.js, and create CustomerRoutes.js inside the routes folder. The project structure is shown below.


Now we need to modify server.js as shown below. Here we import the express and body-parser packages, and we also import our own CustomerRoutes.js. We parse the body of each request as JSON and register the routes from CustomerRoutes.js, and we return the string 'url not found' if a requested url doesn't exist. We will see an example later.
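A sketch of server.js under two assumptions: the server listens on port 3000 and CustomerRoutes.js exports a function that receives the express app:

    const express = require('express');
    const bodyParser = require('body-parser');
    const customerRoutes = require('./routes/CustomerRoutes');

    const app = express();
    const port = 3000; // assumed port

    // parse the body of each request as JSON and register our routes
    app.use(bodyParser.json());
    customerRoutes(app);

    // reached only when no route matched the requested url
    app.use(function (req, res) {
        res.status(404).send('url not found');
    });

    app.listen(port, function () {
        console.log('Server running on port ' + port);
    });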


Now we will look at CustomerRoutes.js, shown below. Here we import our controller CustomerController.js and define two urls, /customer and /customer/:id. The first url is bound to HTTP GET and POST. The second url is bound to HTTP GET (by id), PUT, and DELETE. We will see the logic in CustomerController.js.
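A sketch of CustomerRoutes.js; the controller function names are assumptions:

    const controller = require('../controllers/CustomerController');

    module.exports = function (app) {
        app.route('/customer')
            .get(controller.getAllCustomers)  // get all customers
            .post(controller.addCustomer);    // create a new customer

        app.route('/customer/:id')
            .get(controller.getCustomerById)    // get a customer by id
            .put(controller.updateCustomer)     // update a customer
            .delete(controller.deleteCustomer); // delete a customer
    };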


We will now look at the controller. Here we import the pg module and create a connection pool in CustomerController.js, as shown below.
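At the top of CustomerController.js; the connection settings are assumptions, so adjust them to your own database:

    const pg = require('pg');

    const pool = new pg.Pool({
        user: 'postgres',
        password: 'postgres',
        host: 'localhost',
        port: 5432,
        database: 'customerdb'
    });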


Next we will see the method that gets all data from the database and returns it as JSON. Here we connect to the database using the connection pool. If the connection fails we release it by calling the done() method and throw the error. If it succeeds we run a query to get all customer data, and we release the connection as soon as the query has run. If the query fails we just print the error to the console; if it succeeds we return the result as JSON on the response object.
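A sketch of the method, continuing in CustomerController.js; the table name customer and the exported function name are assumptions:

    exports.getAllCustomers = function (req, res) {
        pool.connect(function (err, client, done) {
            if (err) {
                done();
                throw err;
            }
            client.query('SELECT * FROM customer', function (err, result) {
                done(); // release the connection as soon as the query has run
                if (err) {
                    console.error(err);
                    return;
                }
                res.json(result.rows);
            });
        });
    };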


The next method is the one that gets a customer by its id. Again we connect to the database using the connection pool. If the connection fails we release it by calling the done() method and throw the error. If it succeeds we run a query that selects a customer by the id taken from the request parameter, and we release the connection as soon as the query has run. If the query fails we just print the error to the console; if it succeeds we return the result as JSON on the response object.
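Very similar, but the id comes from the request parameter and is passed as a query parameter:

    exports.getCustomerById = function (req, res) {
        pool.connect(function (err, client, done) {
            if (err) {
                done();
                throw err;
            }
            client.query('SELECT * FROM customer WHERE id = $1', [req.params.id],
                function (err, result) {
                    done(); // release the connection as soon as the query has run
                    if (err) {
                        console.error(err);
                        return;
                    }
                    res.json(result.rows);
                });
        });
    };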


The following method inserts data into PostgreSQL. First we get the customer's first and last name from the request body. We then get a database connection from the connection pool. If that fails we release it by calling the done() method and throw the error. If it succeeds we issue a BEGIN statement, an INSERT statement, and lastly a COMMIT statement. Actually, if we only have a single query we don't need the BEGIN and COMMIT statements; they are the way we do transactions in node.js with PostgreSQL. We could also issue a ROLLBACK statement if the query failed, but we don't need that now since we only have a single insert statement. We also have to release the connection whenever a statement fails or the COMMIT statement has been executed. If the query fails we just print the error to the console; if it succeeds we return the result as JSON on the response object.
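A sketch of the insert method; the column names first_name and last_name are assumptions:

    exports.addCustomer = function (req, res) {
        const firstName = req.body.firstName;
        const lastName = req.body.lastName;
        pool.connect(function (err, client, done) {
            if (err) {
                done();
                throw err;
            }
            client.query('BEGIN', function (err) {
                if (err) {
                    done();
                    console.error(err);
                    return;
                }
                client.query(
                    'INSERT INTO customer (first_name, last_name) VALUES ($1, $2)',
                    [firstName, lastName],
                    function (err, result) {
                        if (err) {
                            done();
                            console.error(err);
                            return;
                        }
                        client.query('COMMIT', function (err) {
                            done(); // release the connection after COMMIT
                            if (err) {
                                console.error(err);
                                return;
                            }
                            res.json({ inserted: result.rowCount });
                        });
                    });
            });
        });
    };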


The method to update a customer is shown below. First we get the customer id from the request parameter and the customer data from the request body. We then get a database connection from the connection pool. If that fails we release it by calling the done() method and throw the error. If it succeeds we issue a BEGIN statement, an UPDATE statement, and lastly a COMMIT statement. The same logic applies: with a single query the BEGIN and COMMIT statements are not strictly needed, and a ROLLBACK statement is not needed here either since we only have a single update statement. We also have to release the connection whenever a statement fails or the COMMIT statement has been executed. If the query fails we just print the error to the console; if it succeeds we return the result as JSON on the response object.
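A sketch of the update method, under the same column-name assumptions:

    exports.updateCustomer = function (req, res) {
        const firstName = req.body.firstName;
        const lastName = req.body.lastName;
        pool.connect(function (err, client, done) {
            if (err) {
                done();
                throw err;
            }
            client.query('BEGIN', function (err) {
                if (err) {
                    done();
                    console.error(err);
                    return;
                }
                client.query(
                    'UPDATE customer SET first_name = $1, last_name = $2 WHERE id = $3',
                    [firstName, lastName, req.params.id],
                    function (err, result) {
                        if (err) {
                            done();
                            console.error(err);
                            return;
                        }
                        client.query('COMMIT', function (err) {
                            done(); // release the connection after COMMIT
                            if (err) {
                                console.error(err);
                                return;
                            }
                            res.json({ updated: result.rowCount });
                        });
                    });
            });
        });
    };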


And lastly we will see the method that deletes a customer. First we get the customer id from the request parameter. We then get a database connection from the connection pool. If that fails we release it by calling the done() method and throw the error. If it succeeds we issue a BEGIN statement, a DELETE statement, and lastly a COMMIT statement. As mentioned, with a single query the BEGIN and COMMIT statements are not strictly needed, and neither is a ROLLBACK statement. We also have to release the connection whenever a statement fails or the COMMIT statement has been executed. If the query fails we just print the error to the console; if it succeeds we return the result as JSON on the response object.
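And a sketch of the delete method:

    exports.deleteCustomer = function (req, res) {
        pool.connect(function (err, client, done) {
            if (err) {
                done();
                throw err;
            }
            client.query('BEGIN', function (err) {
                if (err) {
                    done();
                    console.error(err);
                    return;
                }
                client.query('DELETE FROM customer WHERE id = $1', [req.params.id],
                    function (err, result) {
                        if (err) {
                            done();
                            console.error(err);
                            return;
                        }
                        client.query('COMMIT', function (err) {
                            done(); // release the connection after COMMIT
                            if (err) {
                                console.error(err);
                                return;
                            }
                            res.json({ deleted: result.rowCount });
                        });
                    });
            });
        });
    };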


Let's now run the project and see the result in Postman.


A request to get all customers is shown below.


Next we will create a new customer and then update it.



Now we will get the new customer and delete it.



Lastly, what if we request a wrong url? Below is the result.


 
