This optional exercise provides a basic introduction to source code analysis using SonarQube. If you choose to do it, you should use your own PC, not a School of Computing lab machine.
Download the code for this exercise into the top level of your
repository and unzip the file. This should create a directory
exercise7. Use Git to stage and commit this directory, then
push the commit up to gitlab.com.
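These steps can all be done from a terminal. A sketch, assuming your default branch is called main and using a placeholder path for your clone:

```shell
cd path/to/your/repository        # placeholder -- use your actual clone location
git add exercise7                 # stage the new directory
git commit -m "Add exercise7 starter code"
git push origin main              # branch may be 'master' in older repositories
```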
Visit the Try Out SonarQube page. Find the section on installing a local instance of SonarQube from a Zip file, and follow the instructions.
Note: DO NOT put the downloaded file in your repository, or unzip it there!
After issuing the command to run SonarQube, wait for the message ‘SonarQube is up’ to appear in the terminal window before you attempt to log in via a web browser. This might take a few minutes…
After logging in to SonarQube in your browser, click the Create New Project button. Specify ‘exercise7’ as the project key and display name, then click Set Up.
Specify ‘exercise7’ as the token name and click the Generate button. This will generate a unique token that will be used to link the results of code analysis with the project you have created in SonarQube.
Click the Continue button, then specify Java as the main language of the project and Gradle as the project build system.
At this point, SonarQube will display a Gradle command that can be used to perform code analysis and update SonarQube with the results. Leave the page open in your browser so you can copy this command in the next phase of the task.
Open a terminal window if necessary and cd to the exercise7 directory
in your repository.
Copy the Gradle command displayed by SonarQube (use the handy Copy button in the top-right corner of the panel). Paste the command into your terminal window and run it.
Gradle will run the unit tests, measure the code coverage provided by the tests, then perform further analysis of code quality. It will then upload the results to the SonarQube server.
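The exact command is version-specific and embeds your generated token, so always copy the one SonarQube displays rather than typing it from memory. As a rough illustration, it typically resembles:

```shell
# Illustrative only -- copy the actual command from the SonarQube page.
# Recent versions use the 'sonar' task and -Dsonar.token; older versions
# use 'sonarqube' and -Dsonar.login.
./gradlew sonar \
  -Dsonar.projectKey=exercise7 \
  -Dsonar.host.url=http://localhost:9000 \
  -Dsonar.token=<your-generated-token>
```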
When the Gradle task has completed, return to the browser. On the Overview tab you will now see a dashboard summarizing the results of code analysis.
Note the following features:
The quality gate status is ‘Passed’, coloured green. This indicates that the code passes the default thresholds on code quality set by SonarQube.
The code gets an ‘A’ rating for reliability, security and maintainability, but SonarQube has detected 5 code smells, which have incurred an estimated 52 minutes of technical debt – in other words, SonarQube reckons that it will take a developer nearly 1 hour to eliminate the code smells.
Code coverage by the unit tests is very good, but not 100%.
There are two blocks of code that SonarQube sees as duplicates of each other. Together, these blocks represent nearly 16% of the lines of code in the project.
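Duplication like this is usually fixed by extracting the repeated logic into a single method. A minimal sketch of the idea, using hypothetical method names rather than the actual exercise7 code:

```java
// Illustrative only: not the actual duplicated code from exercise7.
// Near-identical blocks like the two methods below are what SonarQube's
// duplication detector reports; the usual fix is to extract the repeated
// logic into one shared method.
public class DuplicationDemo {
    static double invoiceTotal(double[] amounts) {
        double total = 0;
        for (double a : amounts) total += a;
        return Math.round(total * 100.0) / 100.0;   // duplicated rounding logic...
    }

    static double refundTotal(double[] amounts) {
        double total = 0;
        for (double a : amounts) total += a;
        return Math.round(total * 100.0) / 100.0;   // ...appears here too
    }

    // Extracted helper: both methods above could delegate to this,
    // removing the duplicated block.
    static double roundedSum(double[] amounts) {
        double total = 0;
        for (double a : amounts) total += a;
        return Math.round(total * 100.0) / 100.0;
    }
}
```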
Click on the Measures tab. As an overview of the project, SonarQube plots code coverage versus estimated technical debt for all the classes. Each class is plotted as a circle whose radius is proportional to the number of lines of code. You can click on a circle for more details about a class. The ideal position for classes is the bottom-left corner of the graph.
In the list of categories on the left of the screen, expand the ‘Maintainability’ category and click on the Overview link to see a plot of technical debt versus lines of code. As before, each class is a circle. Circle colour indicates class maintainability, on a scale from green (‘A’-rated) to red (‘E’-rated). Circle radius is proportional to the number of code smells detected in the class. Small green circles at the bottom of the plot are best here.
In the list of categories on the left of the screen, expand the ‘Coverage’ category and click on the Overview link to see a plot of code coverage versus cyclomatic complexity. Small circles at the bottom of the plot, concentrated more towards the bottom-left corner, are best here.
Finally, expand the ‘Complexity’ category. This will show both the total cyclomatic complexity for the project and the total cognitive complexity. The distinction is important here. Increases in cyclomatic complexity need not be a serious concern unless they are accompanied by increases in cognitive complexity.
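The distinction can be seen in a small sketch (illustrative only, not code from exercise7): both methods below decide among the same four outcomes, so their cyclomatic complexity is identical, but under SonarSource's cognitive-complexity counting rules the nested version scores markedly higher, because each level of nesting adds extra increments.

```java
public class ComplexityDemo {
    // Flat guard clauses: cyclomatic complexity 4,
    // cognitive complexity about 3 (one increment per 'if', no nesting).
    static String classifyFlat(int age) {
        if (age < 0) return "invalid";
        if (age < 13) return "child";
        if (age < 20) return "teen";
        return "adult";
    }

    // Same behaviour with nesting: cyclomatic complexity still 4,
    // but cognitive complexity roughly 7, because nested 'if's and the
    // 'else' branch each add further increments.
    static String classifyNested(int age) {
        if (age >= 0) {
            if (age < 13) {
                return "child";
            } else {
                if (age < 20) return "teen";
                return "adult";
            }
        }
        return "invalid";
    }
}
```

Both versions return the same results for every input; only the shape of the control flow, and hence how hard the code is for a human to follow, differs.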
Click on the Issues tab to view the code smells in more detail:
The five smells are distributed across the classes of the project.
Expand the ‘Severity’ category on the left of the screen to see the severity filters. Use these filters to see which of the smells are regarded as minor issues and which are regarded as major issues.
If you are unsure about why SonarQube has identified these code smells, click on the ‘Why is this an issue?’ link.
Click on an issue to see its location in the code.
Start by investigating code coverage further. Although you can do this via SonarQube, you’ll be able to see coverage more clearly by viewing the report generated by the JaCoCo Gradle plugin.
Open build/reports/jacoco/test/html/index.html in a web browser and
navigate through the code via the links. You will find that one line of
the Car class is highlighted in red, indicating that it has not been
touched by the unit tests.
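As a hypothetical sketch (the real Car class from exercise7 is not reproduced here), an uncovered red line typically corresponds to a branch that no test input reaches:

```java
// Hypothetical illustration -- not the actual Car class from exercise7.
// A branch that no test exercises is shown red in the JaCoCo HTML report.
public class CoverageDemo {
    static class Car {
        private final int fuel;

        Car(int fuel) { this.fuel = fuel; }

        String status() {
            if (fuel == 0) {
                return "empty";   // red in JaCoCo if no test constructs Car(0)
            }
            return "ok";
        }
    }
}
```

A test that constructs a Car triggering the untested branch (here, fuel == 0) and checks the result would turn that line green in the report.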
Fix the issue by adding an appropriate unit test to the test class for
Car. Run the tests with Gradle (./gradlew test, or gradlew.bat test on
Windows). They should all pass.
Reopen the JaCoCo report in your browser. Coverage should now be 100%. Commit the code change to your repository.
Now return to SonarQube and examine the minor issue regarding failure
to use the
isEmpty() method. Fix this issue and make sure that
the unit tests still pass. Then commit the code change to your repository.
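This smell corresponds to SonarQube rule java:S1155 (use isEmpty() rather than comparing size() with zero). A minimal before/after sketch, using hypothetical names rather than the actual exercise7 code:

```java
import java.util.List;

// Hypothetical illustration of SonarQube rule java:S1155.
public class IsEmptyDemo {
    // Before: the pattern SonarQube flags.
    static boolean hasNoRentalsBefore(List<String> rentals) {
        return rentals.size() == 0;   // flagged: use isEmpty() instead
    }

    // After: the fix SonarQube suggests -- clearer, and on some
    // collection types cheaper than computing the full size.
    static boolean hasNoRentalsAfter(List<String> rentals) {
        return rentals.isEmpty();
    }
}
```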
Use Gradle to rerun the code analysis for SonarQube.
Examine the minor issue regarding use of an abstract class. Implement
the change suggested by SonarQube (i.e., convert the abstract class to
an interface, with getFrequentRenterPoints() as a default method).
Make sure that the unit tests still pass. Then commit the code change
to your repository.
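This suggestion corresponds to SonarQube rule java:S1610 (an abstract class with no fields or constructor can be converted into an interface). A sketch of the refactoring with hypothetical names (the real class in exercise7 may differ):

```java
// Hypothetical illustration of SonarQube rule java:S1610 -- the class and
// method bodies here are invented, not taken from exercise7.
public class InterfaceDemo {

    // Before (as an abstract class):
    // abstract class Price {
    //     abstract double getCharge(int daysRented);
    //     int getFrequentRenterPoints(int daysRented) { return 1; }
    // }

    // After: an interface, keeping the concrete method as a default method.
    interface Price {
        double getCharge(int daysRented);

        default int getFrequentRenterPoints(int daysRented) {
            return 1;   // same behaviour, now supplied as a default method
        }
    }

    static class RegularPrice implements Price {
        public double getCharge(int daysRented) {
            // Invented pricing rule, purely for illustration.
            return 2.0 + Math.max(0, daysRented - 2) * 1.5;
        }
    }
}
```

Implementing classes inherit the default method unchanged, so the unit tests that exercised the abstract-class version should still pass.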
Use Gradle to rerun the code analysis for SonarQube.
Examine the results of the analysis. You should find that the Overview tab in SonarQube now shows 100% code coverage and only one remaining code smell. There will also be an Activity graph at the bottom of the page, showing how the number of issues has fallen over time.
Switch to the Measures tab and examine some of the visualizations seen earlier, to see how they have changed.