
Apache Hadoop: Decreasing Technical Debt through Refactoring

01.23.2013

Measuring technical debt is worth little if no pragmatic action is taken on the code to control and reduce it. To illustrate the ability to automatically correct the code defects that create this unintended debt, we performed code refactoring on two subprojects of the Hadoop project: Hadoop Common and Hadoop MapReduce. Thanks to Scertify, we were able to correct 25K defects in 2 minutes. In other words, 14% of the technical debt was written off without any human effort.

Initial analysis

According to Wikipedia, Apache Hadoop is "an open-source software framework that supports data-intensive distributed applications". The framework contains several subprojects; Common and MapReduce are two important ones, with respectively 120K and 162K lines of code (blank lines and comments excluded). We worked with the latest development version: 3.0.0-SNAPSHOT. We ran Scertify Refactoring Assessment, our open-source plugin for Sonar, on both projects to get an overview of their technical debt.

Technical debt is defined here as the amount of time needed to correct all detected defects. As the screenshots below show, Common has a technical debt of 70 days and MapReduce of 66 days. Scertify Refactoring Assessment also computes the portion of the technical debt that can be corrected automatically: the debt write-off. Both projects have good potential for automatic refactoring, respectively 38 and 36 days. So, the next step is to use Scertify to perform this automatic refactoring.
By the way, if you would like to try it on your own source code, the Scertify installation and user guide is available here.

 

[Screenshots: Hadoop Common original technical debt; Hadoop MapReduce original technical debt]

We browsed through the detected violations and chose 8 rules for this demonstration.


Refactoring rules for the demonstration

Here's a presentation of the refactoring rules used in this demonstration. As you can see, some rules need parameters to be effective; this is the case for the rules regarding logging. The logging framework used in these projects is Apache Commons Logging, so we configured the rules to use this framework.


AvoidPrintStackTrace

This rule reports a violation when it finds code that catches an exception and prints its stack trace to the standard error output. A logging framework should be used instead, to improve the application's maintainability. The refactoring replaces the call to printStackTrace() with a call to a logging framework. The rule can also declare the logger in the class and add the required imports. Here's an example of the original and refactored code in the class GenericWritable.

Original code:
catch (Exception e) {
      e.printStackTrace();
      throw new IOException("Cannot initialize the class: " + clazz);
}
Refactored code:
catch (final Exception e) {
      LOG.error(e.getMessage(), e);
      throw new IOException("Cannot initialize the class: " + clazz);
}
In this case, LOG was not declared, so it was added to the class and the necessary imports were generated:
private static final Log LOG = LogFactory.getLog(GenericWritable.class);

InefficientConstructorCall

Calling the constructor of a wrapper type, like Integer, to convert a primitive type is a bad practice. It is less efficient than calling the static method valueOf, which can return cached instances.
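As a minimal sketch (not taken from the Hadoop source), the refactoring performed by this rule looks like this:

```java
public class ValueOfDemo {
    public static void main(String[] args) {
        // Original code: the constructor always allocates a new Integer object
        Integer before = new Integer(42);

        // Refactored code: valueOf() can return a cached instance for small values
        Integer after = Integer.valueOf(42);

        System.out.println(before.equals(after)); // prints "true": both wrap 42
    }
}
```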


PositionLiteralsFirstInComparisonsRefactor

This rule checks that literals come first in comparisons. The refactoring swaps the literal and the variable. This ensures the comparison cannot crash with a NullPointerException when the variable is null.
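A hypothetical before/after (the method and variable names are ours, not Hadoop's):

```java
public class LiteralFirstDemo {
    // Original code: return name.equals("common");  -- throws NullPointerException if name is null
    // Refactored code: with the literal first, a null argument simply yields false
    static boolean isCommon(String name) {
        return "common".equals(name);
    }

    public static void main(String[] args) {
        System.out.println(isCommon("common")); // prints "true"
        System.out.println(isCommon(null));     // prints "false", no NullPointerException
    }
}
```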

AddEmptyStringToConvert

Using concatenation with an empty string to convert a primitive type to a String is a bad practice. First of all, it makes the code less readable. It is also less efficient in most cases (the only case where the string concatenation is slightly better is when the primitive is final). Here's an example taken from the class MD5MD5CRC32FileChecksum. Original code:
xml.attribute("bytesPerCRC", "" + that.bytesPerCRC);
Refactored code:
xml.attribute("bytesPerCRC", String.valueOf(that.bytesPerCRC));

GuardDebugLogging

When String concatenation is performed inside a debug log call, one should check whether debug logging is enabled before making the call; otherwise, the concatenation is always performed, even when the message is discarded. The refactoring adds a guard before the call to debug(). In this case, it is configured to use the method isDebugEnabled(), since these projects use Apache Commons Logging. Below is an example of refactored code taken from the class ActiveStandbyElector:
if (LOG.isDebugEnabled()) {
    LOG.debug("StatNode result: " + rc + " for path: " + path
        + " connectionState: " + zkConnectionState + " for " + this);
}
 

IfElseStmtsMustUseBraces

This rule finds if statements that don't use braces. The refactoring adds the required braces.
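A small illustration of the transformation (the method is ours, for demonstration only):

```java
public class BracesDemo {
    static int clampToZero(int x) {
        // Original code (flagged by the rule): if (x < 0) x = 0;
        // Refactored code: braces make the statement's scope explicit
        if (x < 0) {
            x = 0;
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(clampToZero(-5)); // prints "0"
        System.out.println(clampToZero(7));  // prints "7"
    }
}
```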

UseCollectionIsEmpty

This rule finds uses of a Collection's size() method to check whether the collection is empty. Rather than size(), it is better to use isEmpty(), which makes the code easier to read. The refactoring replaces comparisons between size() and 0 with a call to isEmpty().
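A minimal sketch of the before/after (names are hypothetical, not from the Hadoop code):

```java
import java.util.ArrayList;
import java.util.List;

public class IsEmptyDemo {
    // Original code: return pending.size() == 0;
    // Refactored code: isEmpty() states the intent directly
    static boolean nothingPending(List<String> pending) {
        return pending.isEmpty();
    }

    public static void main(String[] args) {
        List<String> pending = new ArrayList<>();
        System.out.println(nothingPending(pending)); // prints "true"
        pending.add("job-1");
        System.out.println(nothingPending(pending)); // prints "false"
    }
}
```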

LocalVariableCouldBeFinal

This rule flags local variables that could be declared final but are not. The final keyword is useful information for future readers of the code. The refactoring adds the "final" keyword. This is not a critical rule, but since it has a huge number of violations, automatic refactoring is a quick way to get rid of them.
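A trivial example of what the rule rewrites (our own snippet, not Hadoop's):

```java
public class FinalLocalDemo {
    static int nameLength() {
        // Original code: String name = "hadoop";
        // Refactored code: final documents that name is never reassigned
        final String name = "hadoop";
        return name.length();
    }

    public static void main(String[] args) {
        System.out.println(nameLength()); // prints "6"
    }
}
```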

Refactoring results

We then ran Scertify on both projects to detect and refactor these rules. On each project, the full process took around one minute. Scertify generates an HTML report with information on the errors detected and corrected. Below is a summary of all the errors corrected in the two projects. Overall, it took 2 minutes to correct 25,392 defects. Not bad, is it? These defects include both minor violations and more critical ones in terms of maintainability, performance, and robustness.

[Table: violations refactored per rule in Common and MapReduce]

As the screenshots below show, with those defects corrected, the technical debt of each project has been reduced by 10 days. Overall, that's 20 days of technical debt written off.

[Screenshots: Hadoop Common technical debt after refactoring; Hadoop MapReduce technical debt after refactoring]

Last but not least, Hadoop contains many unit tests, and of course we made sure that they still pass after the refactoring. To conclude, thanks to Scertify's refactoring features, we were able to efficiently correct 25K defects in a few minutes. We are glad to make the refactored code available to the community; you can download it below. We will continue to perform such refactoring on open-source applications, so if you have an idea for an open-source project that could benefit, just let us know!


Download the source files

Published at DZone with permission of its author, Michael Muller.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Comments

Xiang Tao replied on Wed, 2013/01/23 - 8:44pm

very good.

Michael Muller replied on Fri, 2013/01/25 - 6:13am

Thank you Xiang. If you wanna try automated code refactoring on your own source code, please check this out » http://tocea.com/resources/scertify-trial-versions/2823/how-to-install-scertify-refactoring-eclipse-plugin
