Friday, December 2, 2011

A Technical Review for Your Thoughts

One lesson learned in software engineering is that a technical review is one of the most effective methods of project inspection. Let's face it: when things are written down, they're easier to remember and easier to understand. A walkthrough - briefly going over code or a problem with a teammate - often leaves things unsaid and doesn't provide full coverage of the system at hand.

The concepts of Issue Driven Project Management (IDPM) and Continuous Integration (CI) were introduced in the "Make No Assumptions" blog post. To recap, IDPM is a practice by which multiple developers can work efficiently and effectively: tasks are associated with issues, the status of each issue is tracked, and all issues are visible to every team member, so everyone knows exactly who is working on what and whom to ask about a particular section of the project. CI uses a tool such as the Jenkins server to verify automatically that the system passes verification at all times, and it requires active, consistent participation from all team members throughout the project's lifespan.

IDPM and CI play key roles in this technical review, which covers a project built to the same requirements my team, Team Pichu, had worked under. (For details about Team Pichu's project, please refer to my prior blog post.) The project specifications are outlined on this webpage, under A24. To summarize, the task was to develop a command line interface program for understanding various aspects of energy and power consumption in the Hale Aloha Towers (freshman dorms) on the University of Hawaii at Manoa campus. The program creates a client to the WattDepot server and queries the data that the server accumulates from meters in the Hale Aloha Towers. WattDepot was discussed in this blog post. This technical review addresses the Three Prime Directives of Open Source Software Engineering, which appear in bold as section headers in the rest of this post and were discussed in an earlier blog post reviewing the Logisim software.
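To make the query flow concrete, here is a minimal sketch of what a current-power lookup might look like. The class name, the stubbed "server" map, and the readings below are all hypothetical stand-ins for illustration; the real WattDepot client API works over HTTP and may differ considerably.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a current-power query. The "server" here is a stub map
// standing in for the real WattDepot server; names and readings are
// hypothetical and for illustration only.
public class CurrentPowerSketch {

  // Stub: latest power reading (in watts) per tower, as a WattDepot-style
  // server might report it.
  static final Map<String, Double> LATEST_POWER = new HashMap<>();
  static {
    LATEST_POWER.put("Ilima", 1520.0);
    LATEST_POWER.put("Mokihana", 1340.0);
  }

  // The client side of a current-power command: look up the tower's latest
  // reading, rejecting unknown sources with an error message.
  static double currentPower(String tower) {
    Double watts = LATEST_POWER.get(tower);
    if (watts == null) {
      throw new IllegalArgumentException("Unknown tower or lounge: " + tower);
    }
    return watts;
  }

  public static void main(String[] args) {
    System.out.printf("Ilima's current power: %.1f W%n", currentPower("Ilima"));
  }
}
```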

Review Question 1: Does the system accomplish a useful task?


Yes. Tested against every tower (Ilima, Lehua, Lokelani, Mokihana) and several lounges, the system exercises every key piece of functionality listed on Team Teams's home page: the commands current-power, daily-energy, energy-since, and rank-towers. The necessary support commands, help and quit, have also been implemented. The system definitely accomplishes the useful task of reporting energy and power consumption in the Hale Aloha Towers residence halls on the University of Hawaii at Manoa campus.

Review Question 2: Can an external user successfully install and use the system?


The home page, introduced above, provides a clear description of what the system is supposed to accomplish; however, the bold-faced commands are not explicitly identified as commands, their usage is not given, and there is no sample input or output. The User Guide wiki page, on the other hand, is detailed yet concise enough that no user should have trouble downloading, installing, and executing the system. It also links to the downloadable distribution (already downloaded for Review Question 1), which does indeed provide an executable jar file named hale-aloha-cli-teams in the top-level directory, following the proper naming convention for this project. The zip file distribution also carries major and minor version numbers stamped with the date and time at which the zip file was created.

Testing the system with valid inputs had already been accomplished for Review Question 1. To reiterate, the commands used were current-power, daily-energy, energy-since, rank-towers, help, and quit. The towers tested were Ilima, Lehua, Lokelani, and Mokihana; the two lounges tested for each tower were A and E. The date used for daily-energy and energy-since was 2011-11-30, while rank-towers was given a start date of 2011-11-30 and an end date of 2011-12-01. The system responded well to valid inputs by printing the data requested, but it could have handled invalid inputs better by prompting the user with a message (the program only did this sometimes). For example, it failed to handle the empty string "": no ">" cursor would show up, yet entering valid commands was still possible. The same thing (a missing cursor) happened whenever a command was given fewer arguments than required, and nothing at all would print when a command was given more arguments than required (though at least the cursor would reappear to prompt the user). Nevertheless, overall, yes, an external user can successfully install and use the system; in fact, downloading, unzipping, installing, and executing it all takes less than a minute.
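The prompt-and-cursor bugs above come down to argument checking. Here is a minimal sketch of the validation the review found missing; the command names and argument counts are taken from the commands listed above, but the class and method names are hypothetical stand-ins rather than Team Teams's actual code.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of input validation: every command declares how many
// arguments it needs, and bad input always yields a message to print before
// re-showing the ">" prompt, instead of being silently swallowed.
public class CommandValidator {

  // Command name -> required argument count (subset shown for illustration).
  static final Map<String, Integer> ARITY = Map.of(
      "current-power", 1,
      "daily-energy", 2,
      "energy-since", 2,
      "rank-towers", 2,
      "help", 0,
      "quit", 0);

  // Returns null if the input is valid; otherwise an error message.
  static String validate(String line) {
    if (line.trim().isEmpty()) {
      return "No command entered; type help for a list of commands.";
    }
    List<String> tokens = Arrays.asList(line.trim().split("\\s+"));
    String command = tokens.get(0);
    Integer required = ARITY.get(command);
    if (required == null) {
      return "Unknown command: " + command;
    }
    int given = tokens.size() - 1;
    if (given != required) {
      return command + " expects " + required + " argument(s) but got " + given + ".";
    }
    return null;
  }

  public static void main(String[] args) {
    System.out.println(validate(""));                       // empty input
    System.out.println(validate("current-power"));          // too few arguments
    System.out.println(validate("current-power Ilima x"));  // too many arguments
    System.out.println(validate("current-power Ilima"));    // valid -> null
  }
}
```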

Review Question 3: Can an external developer successfully understand and enhance the system?


The Developer Guide wiki page does indeed provide clear and comprehensive instructions on building the system, and it even includes tips on IDPM as well as the team's experience using it. Appearance-wise, it just would have been nice if the commands for steps 2.1 and 2.2 had been placed on the traditional light-gray background. Quality assurance tools such as Checkstyle, PMD, and FindBugs were used and are mentioned in the Developer Guide, among other useful standards, such as using a specific Eclipse format (a link is provided). Since IDPM and the Development Guidelines are given in great detail, a new developer can certainly ensure that any code they write will adhere to these standards. And indeed, the guide links to the Jenkins CI server associated with this project. In addition, JavaDoc generation is explained clearly in its own subsection of the Developer Guide.

After checking out the sources from SVN using the https URL indicated on this webpage, JavaDoc documentation was successfully generated from the command line by running ant -f javadoc.build.xml. The JavaDocs are certainly well written and informative, and they provide a good understanding of Team Teams's architecture and the structure of its individual components. The component names clearly indicate their underlying purpose, and the system appears to be designed to support information hiding. The only JavaDoc comments lacking are those for the default constructors of each class and for the package-private fields of the Processor class; but those fields are explained with in-line comments in the source code, and default constructors don't necessarily appear explicitly in the source code anyway.

Next, the system was verified to build from sources without errors by running ant -f verify.build.xml. Coverage information was also successfully generated using ant -f jacoco.build.xml. The JaCoCo coverage report showed that no tests were implemented for the Main and Processor packages, but that 97% of instructions and 82% of branches were covered in the Command package. This is great, since the Command package is the meat of the project, but it would have been nice if the team had also provided JUnit test cases for the Main and Processor classes (although they do have a manual test class for Processor).

Unfortunately, whenever the class files were executed from Eclipse, a java.lang.UnsupportedClassVersionError appeared. I'm not sure whether this is due to my Eclipse installation or to the files in question. As for visual inspection of the test classes, their test cases appear to cover nearly all Processor functionality, and each JUnit test class contains multiple (five or more) assert statements. In conclusion, the current set of test cases appears sufficient to prevent a new developer from making enhancements that would break pre-existing code.
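For what it's worth, java.lang.UnsupportedClassVersionError typically means the class files were compiled by a newer javac than the JVM trying to run them, so the mismatch may well be on my end rather than in the distribution. A quick way to check what a given JVM supports:

```java
// Prints the running JVM's version and the newest class-file format version
// it accepts. An UnsupportedClassVersionError indicates that a class file's
// format version exceeds the java.class.version of the JVM running it.
public class VersionCheck {
  public static void main(String[] args) {
    System.out.println("java.version       = " + System.getProperty("java.version"));
    System.out.println("java.class.version = " + System.getProperty("java.class.version"));
  }
}
```

Running this under both the Eclipse JRE and the command-line JRE would show whether the two differ.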

As for the source code, coding standards appear to be followed, and appropriately placed in-line comments as well as detailed JavaDocs are provided. The code is fairly easy to understand, and the amount of commenting is just right.

From the Issues page associated with this project, it is clear which of the three developers worked on what, so an external developer can use this page to determine the best person to ask about a specific issue. As for how evenly the work was distributed, two developers appeared to do equal amounts of work, each roughly half again as much as the third developer.

Lastly, the Jenkins CI server associated with this project was examined. Apart from the known outages of the WattDepot server from November 21st to the 24th, there were no build failures. The project began on November 10th and ended on the 29th; overall, it was worked on consistently, with only a single day skipped a few times (that is, the project never went more than a day or two without being worked on). However, examining the Updates page shows that approximately 14 of the 49 revisions were not associated with an issue, meaning that only roughly 71% (35 of 49) were; the true percentage for commits is even lower, since quite a few of those revisions were wiki-page edits rather than commits. This is not good, since at least 90% of commits should be associated with issues.

In conclusion, a new external developer could successfully understand the current hale-aloha-cli-teams system with ease, and therefore enhance it in the future, thanks to the well-written test cases, JavaDocs, in-line comments, and User/Developer Guide wiki pages. Overall, Team Teams successfully fulfilled the Three Prime Directives of Open Source Software Engineering.
