Wednesday, December 14, 2011

The Ultimate Test: Adding New Features to an Existing System

This post concludes the trilogy on Issue Driven Project Management, or IDPM (see Part I: Make No Assumptions and Part II: A Technical Review for Your Thoughts). This time around, we were given only three commands to implement and thus less time to complete the project; nevertheless, the project was still a challenging one. Parts I and II already established that IDPM was difficult enough to put into practice as is. Here, though, we faced the added challenge of building three new features into the code base of the team whose system we, Team Pichu, had reviewed.

For the project, the three commands we had to implement were set-baseline, monitor-power, and monitor-goal (the project specifications describe them in more detail). The set-baseline command takes a given date and stores the power consumed during each of the 24 hours of that day. The monitor-power command continuously outputs the current power consumption of a given source at a specified interval in seconds. The last command, monitor-goal, outputs the current power and states whether the source is meeting its power conservation goal, as defined by the user-specified goal and a prior call to set-baseline. Fortunately, we were able to successfully implement all three commands, but at the cost of a late submission on my part.
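To make set-baseline's bookkeeping concrete, here is a minimal sketch of the data it has to keep around. The class and method names are my own inventions for illustration, not the project's actual code:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: for each source, remember the power consumed
// during each of the 24 hours of the baseline date, so that
// monitor-goal can later compare current power against it.
public class BaselineStore {
  private final Map<String, double[]> baselines =
      new HashMap<String, double[]>();

  /** Stores the 24 hourly power values for the given source. */
  public void setBaseline(String source, double[] hourlyPower) {
    if (hourlyPower.length != 24) {
      throw new IllegalArgumentException("expected 24 hourly values");
    }
    baselines.put(source, hourlyPower.clone());
  }

  /** Returns the baseline power for the given source and hour (0-23). */
  public double getBaseline(String source, int hour) {
    return baselines.get(source)[hour];
  }
}
```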

Many issues arose during development, including the use of two Java classes that were new to us: Timer and TimerTask. A Timer object can schedule multiple TimerTask objects, running each one once or repeatedly at specified intervals. Before I could work on monitor-goal, our team had to make sure that monitor-power was working first, since monitor-goal builds upon it. Oddly, at each interval, five or more copies of the same power value would show up. Ultimately, we learned to use Timer's schedule method instead of scheduleAtFixedRate, and not to call TimerTask's run method ourselves, since the Timer already invokes run whenever the task fires.
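A minimal sketch of the pattern we ended up with for monitor-power looks like the following (the WattDepot query is stubbed out with a print statement, and the class name is mine):

```java
import java.util.Timer;
import java.util.TimerTask;

public class MonitorPowerSketch {
  public static void main(String[] args) throws InterruptedException {
    Timer timer = new Timer();
    TimerTask task = new TimerTask() {
      @Override
      public void run() {
        // Stand-in for the WattDepot query for the source's current power.
        System.out.println("current power: (query WattDepot here)");
      }
    };
    // schedule(task, delay, period) uses fixed-delay execution: if one run
    // finishes late, later runs simply shift, whereas scheduleAtFixedRate
    // "catches up" by firing several times in quick succession. Note that
    // we never call task.run() ourselves; the Timer does that for us.
    timer.schedule(task, 0, 15 * 1000); // every 15 seconds
    Thread.sleep(60 * 1000);            // let it run for a minute
    timer.cancel();                     // then stop the timer thread
  }
}
```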

The monitor-goal command took me a lot longer to implement than expected. One of the challenges I faced was how to implement the check for each hour without hardcoding all 24 if-else blocks (and a single stray whitespace character had me debugging for another hour). In the end, I went with the if-else blocks only because I was pressed for time; had I had more time to think about it, I could have implemented it differently, such as with the array lookup sketched below. Because final projects for other classes took up much of my time, working on Version 2 was a lot more difficult for me than it should have been. Initially, I had thought I could improve upon the original code base and fix a few bugs in the error handling, but since that wasn't a priority and time was running out, I had to drop it as an invalid issue. It is rather surprising that we weren't required to fix the original code base created by the team whose system we reviewed; I anticipate that the opposite is true as an industry best practice.
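For the record, here is a minimal sketch of the alternative I had in mind: keep the 24 baseline values in an array indexed by hour, so a single lookup replaces the 24 if-else blocks. The names and the goal formula are my own assumptions, not our actual code:

```java
import java.util.Arrays;
import java.util.Calendar;

public class GoalCheckSketch {
  /**
   * Returns true if currentPower meets the conservation goal for the
   * current hour. baseline holds the 24 hourly values recorded by
   * set-baseline; goalPercent is the user-specified percentage.
   */
  static boolean meetsGoal(double[] baseline, double goalPercent,
                           double currentPower) {
    int hour = Calendar.getInstance().get(Calendar.HOUR_OF_DAY);
    double target = baseline[hour] * (1.0 - goalPercent / 100.0);
    return currentPower <= target;
  }

  public static void main(String[] args) {
    double[] baseline = new double[24];
    Arrays.fill(baseline, 30.0);        // pretend 30 kW for every hour
    // Goal of 5% below baseline: target is 28.5, so 28.0 meets it.
    System.out.println(meetsGoal(baseline, 5.0, 28.0)); // prints true
  }
}
```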

This time around, however, communication among all members of the team was much, much better. We all got back to each other within a day or right away, even if only to say that we couldn't work on the project at the moment. While communication was a lot better, I learned that it alone isn't enough to move a project along at an appropriate pace. Motivation, physical and emotional drain, and time management are also strong factors that can affect the overall quality of a software project. Overall, despite the costs incurred, I believe that the quality of our implementation of new functionality in an existing system was quite decent. It could have been better, but then again, software engineering is a full-time job, and as full-time students, there had to be a tradeoff. In relation to the Three Prime Directives of Software Engineering elaborated on in Part II, the system we first reviewed and then improved upon definitely accomplishes a useful task and is easy enough for the user to download, install, and use. And although the code structure could be improved to streamline the addition of new features, it was moderately simple for a developer to enhance the system.

Friday, December 2, 2011

A Technical Review for Your Thoughts

One lesson learned in software engineering is that a technical review is the most effective method of project inspection. Let's face it: when things are written down, they're easier to remember and easier to understand. A walkthrough, briefly going over code or a problem with a teammate, often leaves much unsaid and doesn't provide full coverage of the system at hand.

The concepts of Issue Driven Project Management (IDPM) and Continuous Integration (CI) were introduced in the "Make No Assumptions" blog post. To recap, IDPM is a practice by which multiple developers can work efficiently and effectively by associating tasks with issues, keeping tabs on the statuses of those issues, and letting all project team members view all issues so they know exactly who is working on what and whom to go to with questions about particular sections of the project. CI uses a tool such as the Jenkins server to automatically verify that the system builds and passes its tests at all times, and it requires active and consistent participation from all team members throughout the project's lifespan.

IDPM and CI play key roles in this technical review of a project built to the same requirements my team, Team Pichu, had worked under. (For details about Team Pichu's project, please refer to my prior blog post.) The project specifications are outlined on this webpage, under A24. To summarize, the task was to develop a command line interface program for understanding various aspects of energy and power consumption in the Hale Aloha Towers (freshman dorms) on the University of Hawaii at Manoa campus. The program creates a client to the WattDepot server and queries the data that the server accumulates from meters in the Hale Aloha Towers. WattDepot was discussed in this blog post. This technical review addresses the Three Prime Directives of Open Source Software Engineering, which appear in bold as section headers in the rest of this post and were discussed in an earlier blog post reviewing the Logisim software.

Review Question 1: Does the system accomplish a useful task?


Yes. Using every possible tower (Ilima, Lehua, Lokelani, Mokihana) and some lounges, I exercised every key piece of functionality listed on Team Teams's home page: the commands current-power, daily-energy, energy-since, and rank-towers. In addition, the necessary support commands, help and quit, have been implemented. The system definitely accomplishes the useful task of reporting energy and power consumption in the Hale Aloha Towers residence halls on the University of Hawaii at Manoa campus.

Review Question 2: Can an external user successfully install and use the system?


The home page, introduced above, provides a clear description of what the system is supposed to accomplish; however, the bold-faced commands are not explicitly identified as commands, their usage is not given, and there is no sample input or output. The User Guide wiki page, on the other hand, provides more detail and is concise enough that no user should have trouble downloading, installing, and executing the system. It also provides a link to the downloadable distribution, which was already downloaded for Review Question 1, and the distribution does indeed provide an executable jar file in the top-level directory by the name of hale-aloha-cli-teams, which follows the proper naming convention for this project. The zip file distribution also carries major and minor version numbers stamped with the date and time at which the zip file was created.

Testing the system under valid inputs had already been accomplished for Review Question 1. To reiterate, the commands used for the test were current-power, daily-energy, energy-since, rank-towers, help, and quit. The towers tested were Ilima, Lehua, Lokelani, and Mokihana; the two lounges tested for each tower were A and E. The date used for daily-energy and energy-since was 2011-11-30, while rank-towers used a start date of 2011-11-30 and an end date of 2011-12-01. The system responded well to the valid inputs by printing the data requested, but it could have handled invalid inputs better by prompting the user with a message (the program only did this sometimes). For example, it failed to handle the empty string "", after which no ">" cursor would show up, yet entering valid commands was still possible. The same thing (the missing cursor) happened whenever the number of arguments was less than required, and nothing would print at all if the number of arguments was more than required (though the cursor would at least be there to prompt the user again). Nevertheless, overall, yes, an external user can successfully install and use the system (downloading, unzipping, installing, and executing the system all take less than a minute, in fact).
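For what it's worth, a read loop along the following lines (my own sketch, not Team Teams's actual code) would always re-prompt, so blank input or a wrong argument count could never leave the user without a cursor:

```java
import java.util.Scanner;

public class PromptLoopSketch {
  public static void main(String[] args) {
    Scanner input = new Scanner(System.in);
    while (true) {
      System.out.print("> ");
      if (!input.hasNextLine()) {
        break;                    // end of the input stream
      }
      String line = input.nextLine().trim();
      if (line.isEmpty()) {
        continue;                 // blank line: just prompt again
      }
      if (line.equals("quit")) {
        break;
      }
      // Dispatch to the matching command here, printing an error message
      // for unknown commands or a wrong number of arguments.
      System.out.println("Unknown command: " + line);
    }
  }
}
```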

Review Question 3: Can an external developer successfully understand and enhance the system?


The Developer Guide wiki page does indeed provide clear and comprehensive instructions on building the system, and it even includes tips on IDPM as well as the team's experience using it. Appearance-wise, it just would have been nice if the commands in steps 2.1 and 2.2 were set on the traditional light-gray code background. Quality assurance tools such as Checkstyle, PMD, and FindBugs were used and are mentioned in the Developer Guide, among other useful standards, such as using a specific format (a link is provided) in Eclipse. Since the IDPM and Development Guidelines are given in great detail, a new developer can certainly ensure that any code they write adheres to those standards. And indeed, the guide provides a link to the Jenkins CI server associated with this project. In addition, JavaDoc generation is explained clearly in a subsection of its own on the Developer Guide.

After checking out the sources from SVN using the https URL indicated on this webpage, JavaDoc documentation was successfully generated from the command line by running the command ant -f javadoc.build.xml. The JavaDocs are certainly well written and informative, providing a good understanding of Team Teams's architecture and the structure of its individual components. The names of the components clearly indicate their underlying purpose, and the system appears to be designed to support information hiding. The only missing JavaDoc comments are those for the default constructors of each class and the package-private fields of the Processor class; then again, those fields are explained with in-line comments in the source code, and default constructors don't necessarily appear explicitly in the source code at all.

Next, the system was verified to build from sources without errors by running the command ant -f verify.build.xml. Coverage information was also successfully generated using the command ant -f jacoco.build.xml. The JaCoCo coverage report stated that no tests were implemented in the Main and Processor packages, but that 97% of the instructions and 82% of the branches were covered in the Command package. This is great, since the Command package is the meat of the project, but it would have been nice if the team had also provided JUnit test cases for the Main and Processor classes (although they do have a manual test class for the Processor class); a sketch of the kind of test I have in mind follows below.
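The FakeProcessor stand-in and its isValidCommand method in this sketch are my own inventions for illustration; Team Teams's actual Processor API may look quite different:

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class TestProcessorSketch {

  /** Stand-in for the real Processor's command validation. */
  static class FakeProcessor {
    boolean isValidCommand(String name) {
      return name.equals("current-power") || name.equals("daily-energy")
          || name.equals("energy-since") || name.equals("rank-towers")
          || name.equals("help") || name.equals("quit");
    }
  }

  @Test
  public void recognizesKnownCommands() {
    FakeProcessor processor = new FakeProcessor();
    assertTrue(processor.isValidCommand("current-power"));
    assertFalse(processor.isValidCommand("not-a-command"));
  }
}
```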

Unfortunately, whenever I executed the class files from Eclipse, the error java.lang.UnsupportedClassVersionError would show up. This error generally means the class files were compiled for a newer Java version than the one running them, so I'm not sure whether it is due to my Eclipse setup or the files in question. As for visual inspection of the test classes, their test cases appear to cover nearly all Processor functionality, and there are multiple (five or more) assert statements in each JUnit test class. In conclusion, the current set of test cases appears sufficient to prevent a new developer from making enhancements that would break pre-existing code.

As for the source code, coding standards appear to be followed, and appropriate in-line comments as well as detailed JavaDocs are provided. The code is fairly easy to understand, and the comments strike just the right balance.

From the Issues page associated with this project, it is clear which of the three developers worked on what, so an external developer can use this page to determine the best person to ask about a specific issue. As for how evenly the work was dispersed, two developers appeared to do an equal amount of work, each about half again as much as the third developer.

Lastly, the Jenkins CI server associated with this project was examined. Apart from the known outages of the WattDepot server from November 21st to the 24th, there were no build failures. The project began on November 10th and ended on the 29th; overall, it was worked on in a consistent fashion, with only a day skipped here and there (meaning the project never went more than a day or two without being worked on). Moreover, examining the Updates page shows that approximately 14 of the 49 revisions were not associated with an issue, meaning only about 71% (35 of 49) were; the true percentage is even lower, because quite a few of the revisions were not commits but edits to the wiki pages. This is not good, since at least 90% of commits should be associated with issues.

In conclusion, a new external developer could successfully understand the current hale-aloha-cli-teams system with ease, and could therefore enhance it in the future, thanks to the well-written test cases, JavaDocs, in-line comments, and User/Developer Guide wiki pages. Furthermore, overall, Team Teams successfully accomplished the Three Prime Directives of Open Source Software Engineering.