Wednesday, December 14, 2011

The Ultimate Test: Adding New Features to an Existing System

This post concludes the trilogy on Issue Driven Project Management, or IDPM (see Part I: Make No Assumptions and Part II: A Technical Review for Your Thoughts). This time around, we were given only three commands to implement and thus less time to complete the project; nevertheless, that did not stop this project from being a challenging one. Parts I and II established that IDPM was difficult enough to put into practice as it was. But here, we faced the added challenge of building three new features into the code base of the team whose system we, Team Pichu, had reviewed.

For the project, the three commands we had to implement were set-baseline, monitor-power, and monitor-goal (the project specifications describe them in more detail). The set-baseline command takes a given date and stores the power consumed for each of the 24 hours in that day. The monitor-power command continuously outputs the current power consumption for a given source at specified intervals in seconds. And the last command, monitor-goal, outputs the current power and states whether the source is meeting its power conservation goal, as determined by the user-specified goal and a prior call to set-baseline. Fortunately, we were able to successfully implement all three commands, but at the cost of a late submission on my part.

Many issues arose during development, including the use of two Java classes that were new to us: Timer and TimerTask. A Timer object can schedule multiple TimerTask objects, running each one either once or repeatedly at specified intervals. Before I could work on monitor-goal, our team had to make sure monitor-power was working, since monitor-goal builds upon it. Oddly, at each interval, five or more copies of the same power-consumption value would show up. Ultimately, we learned to use Timer's schedule method instead of scheduleAtFixedRate, and not to call TimerTask's run method ourselves, since the Timer already invokes run on our behalf.
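The scheduling pattern we ended up with looks roughly like this sketch (class and variable names are mine, not from our code base; a counter stands in for the real power query):

```java
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicInteger;

public class MonitorSketch {

    // Runs a repeating task the given number of times at the given interval
    // and returns how many times it actually ran.
    static int runDemo(long intervalMillis, final int runs) {
        final AtomicInteger count = new AtomicInteger();
        final CountDownLatch done = new CountDownLatch(runs);
        Timer timer = new Timer();
        TimerTask task = new TimerTask() {
            @Override
            public void run() {
                // In the real monitor-power command, this is where the current
                // power for the source would be fetched and printed.
                if (count.incrementAndGet() >= runs) {
                    cancel(); // stop repeating once we have enough readings
                }
                done.countDown();
            }
        };
        // schedule() times each execution from the end of the previous one, so
        // a slow query just delays the next reading. scheduleAtFixedRate()
        // instead fires rapid catch-up executions after any delay, which is how
        // bursts of duplicate readings can appear. And never call task.run()
        // by hand -- the Timer invokes it for you.
        timer.schedule(task, 0, intervalMillis);
        try {
            done.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        timer.cancel();
        return count.get();
    }

    public static void main(String[] args) {
        System.out.println("task ran " + runDemo(50, 3) + " times");
    }
}
```

Calling cancel() from inside run guarantees the task never fires again, which is why the count comes out exact.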

The monitor-goal command took me much longer to implement than expected. One challenge was implementing the check for each hour without hardcoding all 24 if-else blocks (and a single stray whitespace character cost me another hour of debugging). In the end, I went with the if-else blocks only because I was pressed for time; with more time to think it over, I could have implemented it differently. Because final projects for other classes took up much of my time, working on Version 2 became a lot more difficult than it should have been. I had initially thought I could improve upon the original code base and fix a few error-handling bugs, but since that wasn't a priority and time was running out, I had to drop it as an invalid issue. It is rather surprising that we weren't required to fix the original code base created by the team whose system we reviewed; I anticipate that in industry, best practice is just the opposite.
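In hindsight, a table-driven check would have replaced the 24 if-else blocks. Here is a rough sketch of that idea; the names and the goal formula are my guesses at what the command needed, not the actual implementation:

```java
import java.util.Arrays;

public class GoalCheck {

    // hourlyBaseline[h] is the baseline power (in watts) stored by
    // set-baseline for hour h, 0-23. A goal of 10 means "use at least
    // 10% less power than the baseline for the current hour."
    static boolean meetsGoal(double[] hourlyBaseline, int hour,
                             double currentPower, double goalPercent) {
        double target = hourlyBaseline[hour] * (1.0 - goalPercent / 100.0);
        return currentPower <= target;
    }

    public static void main(String[] args) {
        double[] baseline = new double[24];
        Arrays.fill(baseline, 1000.0);  // pretend every hour's baseline is 1000 W
        System.out.println(meetsGoal(baseline, 14, 890.0, 10.0)); // 890 <= 900 -> true
        System.out.println(meetsGoal(baseline, 14, 950.0, 10.0)); // 950 > 900 -> false
    }
}
```

Indexing an array by the current hour replaces all 24 branches with one line.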

Group communication among all members of the team was much, much better this time around. We all got back to each other within a day, or right away, even if only to say that we couldn't work on the project at the moment. But while communication improved greatly, I learned that it alone isn't enough to move a project along at an appropriate pace. Motivation, physical and emotional drain, and time management are also strong factors that affect the overall quality of a software project. Overall, despite the costs incurred, I believe the quality of our implementation of new functionality on an existing system was quite decent. It could have been better, but then again, software engineering is a full-time job, and as full-time students, there had to be a tradeoff. In relation to the Three Prime Directives of Software Engineering elaborated on in Part II, the system we first reviewed and then improved upon definitely accomplishes a useful task and is easy enough for a user to download, install, and use. And although the code structure could be improved to streamline the addition of new features, it was moderately simple for a developer to enhance the system.

Friday, December 2, 2011

A Technical Review for Your Thoughts

One lesson learned in software engineering is that a technical review is the most effective method of project inspection. Let's face it: when things are written down, they're easier to remember and easier to understand. A walkthrough - briefly going over code or a problem with a teammate - often leaves things unsaid and undone, and doesn't provide full coverage of the system at hand.

The concepts of Issue Driven Project Management (IDPM) and Continuous Integration (CI) were introduced in the "Make No Assumptions" blog post. To recap, IDPM is a practice by which multiple developers can work efficiently and effectively by associating tasks with issues, keeping tabs on the status of each issue, and letting all team members view all issues so they know exactly who is working on what and whom to ask about particular sections of the project. CI relies on a tool such as the Jenkins server to automatically verify that the system builds and passes its tests at all times, and it requires active, consistent participation from all team members throughout the project's lifespan.

IDPM and CI play key roles in this technical review of a project built to the same requirements my team, Team Pichu, had worked under. (For details about Team Pichu's project, please refer to my prior blog post.) The project specifications are outlined on this webpage, under A24. To summarize, the task was to develop a command line interface program for understanding various aspects of energy and power consumption in the Hale Aloha Towers (freshman dorms) of the University of Hawaii at Manoa campus. The program creates a client to the WattDepot server and queries data that the server accumulates from meters in the Hale Aloha Towers. WattDepot was discussed in this blog post. This technical review addresses the Three Prime Directives of Open Source Software Engineering, which appear as bolded section headers in the rest of this post and were discussed in an earlier blog post reviewing the Logisim software.

Review Question 1: Does the system accomplish a useful task?


Yes. Using every possible tower (Ilima, Lehua, Lokelani, Mokihana) and some lounges, I exercised every key piece of functionality listed on Team Teams's home page: the commands current-power, daily-energy, energy-since, and rank-towers. In addition, the necessary support commands, help and quit, have been implemented. The system definitely accomplishes the useful task of reporting energy and power consumption in the Hale Aloha Towers residence halls on the University of Hawaii at Manoa campus.

Review Question 2: Can an external user successfully install and use the system?


The home page, introduced above, provides a clear description of what the system is supposed to accomplish; however, the bold-faced commands are not explicitly stated as commands, their usage is not given, and there is no sample input and output. The User Guide wiki page, on the other hand, provides more details and is concise enough that no user should have trouble downloading, installing, and executing the system. It also provides a link to the downloadable distribution, which was previously downloaded for Review Question 1, and the distribution does indeed provide an executable jar file in the top-level directory by the name of hale-aloha-cli-teams, which follows the proper naming convention for this project. Indeed, the zip file distribution carries major and minor version numbers stamped with the date and time at which the zip file was created.

Testing the system under valid inputs had already been accomplished for Review Question 1. To reiterate, here are the commands used for the test: current-power, daily-energy, energy-since, rank-towers, help, and quit. The towers tested were Ilima, Lehua, Lokelani, and Mokihana; the two lounges tested for each tower were A and E. The date used for daily-energy and energy-since was 2011-11-30, while rank-towers used a start date of 2011-11-30 and an end date of 2011-12-01. The system responded well to valid inputs by printing the requested data, but it could have handled invalid inputs better by prompting the user with a message (the program only did this sometimes). For example, it failed to handle the empty string "": no ">" cursor would show up, yet entering valid commands was still possible. The same missing-cursor result occurred whenever the number of arguments was less than required, and nothing at all would print when the number of arguments was more than required (though at least the cursor reappeared to prompt the user). Nevertheless, overall, yes, an external user can successfully install and use the system (downloading, unzipping, installing, and executing the system all takes less than a minute, in fact).

Review Question 3: Can an external developer successfully understand and enhance the system?


The Developer Guide wiki page does indeed provide clear and comprehensive instructions on building the system and even includes tips on IDPM, as well as the team's experience using it. Appearance-wise, it just would have been nice if the commands for steps 2.1 and 2.2 were placed onto the traditional light-gray background. Quality assurance tools such as Checkstyle, PMD, and FindBugs were used and mentioned in the Developer Guide, among other useful standards, such as using a specific format (a link is provided) in Eclipse. Since the IDPM practices and Development Guidelines are given in great detail, a new developer can certainly ensure that any code they write will adhere to those standards. And indeed, the guide provides a link to the Jenkins CI server associated with this project. In addition, JavaDoc generation is explained clearly in a subsection of its own on the Developer Guide.

After checking out the sources from SVN using the https URL indicated on this webpage, JavaDoc documentation was successfully generated from the command line by running ant -f javadoc.build.xml. The JavaDocs are well-written and informative, and provide a good understanding of Team Teams's architecture and the structure of its individual components. The component names clearly indicate their underlying purposes, and the system appears designed to support information hiding. The only missing JavaDoc comments are those for each class's default constructor and for the Processor class's package-private fields; but then, those fields are explained with in-line comments in the source code, and default constructors don't necessarily appear explicitly in the source.

Next, the system was verified to build successfully from sources without errors by running the command ant -f verify.build.xml. Coverage information was also successfully generated using the command ant -f jacoco.build.xml. The JaCoCo coverage report stated that no tests were implemented for the Main and Processor classes, but that 97% of the instructions and 82% of the branches were covered in the Command package. This is great, since the Command package is the meat of the project, but it would have been nice if the team had also provided JUnit test cases for the Main and Processor classes (although they do have a manual test class for the Processor class).

Unfortunately, whenever I executed the class files from Eclipse, a java.lang.UnsupportedClassVersionError would show up; this error typically means the class files were compiled with a newer Java version than the one running them, so I'm not sure whether the problem lies with my Eclipse setup or with the files themselves. As for visual inspection of the test classes, their test cases appear to cover nearly all Processor functionality, and there are multiple (five or more) assert statements in each JUnit test class. In conclusion, the current set of test cases appears sufficient to prevent a new developer from making enhancements that would break pre-existing code.

As for the source code, coding standards appear to be followed and multiple but appropriate in-line comments as well as detailed JavaDocs are provided. The code is fairly easy to understand and the comments are just the right amount.

From the Issues page associated with this project, it is clear which of the three developers worked on what, so an external developer can use this page to determine the best person to ask about a specific issue. As for the distribution of work, two developers appeared to do equal amounts, each roughly half again as much as the third developer.

Lastly, the Jenkins CI server associated with this project was examined. Apart from the known WattDepot server outages from November 21st to the 24th, there were no build failures. The project began on November 10th and ended on the 29th; overall, it was worked on consistently, with only the occasional single day skipped (that is, the project never went more than a day or two without being worked on). However, examining the Updates page shows that approximately 14 out of 49 revisions were not associated with an issue, meaning only roughly 71% were; the true percentage for commits is even lower, since quite a few of those revisions were edits to the wiki pages rather than code commits. This is not good, since at least 90% of commits should be associated with issues.

In conclusion, a new external developer could successfully understand the current hale-aloha-cli-teams system with ease and therefore enhance it in future because of the well-written test cases, JavaDocs, in-line comments, and User/Developer Guide wiki pages. Furthermore, overall, Team Teams successfully accomplished the Three Prime Directives of Open Source Software Engineering.

Tuesday, November 29, 2011

Make No Assumptions: Software Engineering is Not Software Engineering Unless You've Been Through IDPM

The full impact of software engineering could not truly be gauged until now. This project is multiple belts up from the white-belt WattDepot katas we previously tackled. It combines our basic skills with WattDepot, our fleeting glimpse of Continuous Integration (CI), and our whirlwind introduction to Issue Driven Project Management (IDPM). And the most lethal ingredient of them all is teamwork, as implied by both CI and IDPM. Lethal, but unavoidable.

When working with others on a project, CI makes things a bit easier by assuring team members that the system they are developing is in working condition at all times. Build verification is automated by a CI server such as Jenkins, which triggers an automatic verify on the system whenever a team member commits changes to a Subversion (SVN) repository, such as one hosted by Google Project Hosting. CI also requires team members to commit regularly - at least once every two days, say. Meanwhile, IDPM is designed to keep all team members on task at all times by associating each task with a specific issue; ideally, no issue should take longer than a few days to complete. Successful IDPM requires frequent group interaction, always having another issue lined up after completing one, and general knowledge of where the project is headed at all times. In our case, Google Project Hosting (along with SVN and Jenkins) was used to drive IDPM; there, it is easy to see which tasks are Accepted, Started, or Done (among other statuses) by opening all issues under Grid view.

The project, which can be found at this website, was to create a command line interface program for understanding various aspects of energy use in the Hale Aloha Towers (freshman dorms) of the University of Hawaii at Manoa. My team consists of two other members; as in the real world, we did not get to choose whom we worked with. We had to choose a team name and, after mulling over a brainstormed list, settled on "Pichu" (not necessarily after the Pokemon of the same name). We exchanged phone numbers and Skype usernames and broke the project down into its important issues, all on the first day the project was assigned to us.

As mentioned above, the dynamic trio of Google Project Hosting, SVN, and Jenkins was used to drive IDPM. Before each commit (except for one or a few times when I was positive the program would pass verify - but alas, lesson learned: never assume), I made sure to run verify on the system. Then, after a commit, I would check the associated Jenkins job to make sure the build succeeded. To address IDPM, after the disastrous surprise in-progress technical review, I made sure to associate each and every commit (known as a revision in Google Project Hosting) with the issues it addressed. To take it a step further, when I updated the issues themselves, I added links to the revisions that handled them.

Our team originally had the issues all planned out from the beginning, but as we finished our main issues, any new ones we thought up were simply added to the existing pool. Because of IDPM and constant communication overall, we were able to gauge what everyone else was doing and hence finish the project in time. The basic program offers six commands; a description of each can be found on Team Pichu's home page. There were four main components to the program: the Main class, the Processor class, the classes that implement the Command interface, and the JUnit test classes. Main contains the driver for the program, establishes the connection to the WattDepot server, and creates an instance of the Processor class for each command or invalid input the user enters. Processor takes the input, parses it, and determines which Command-type class to call; if an error arises with the input, Processor returns the appropriate error message to Main. Each Command-type class must implement the Command interface's run, getOutput, and getHelp methods; in addition, each Command-type class should have a default constructor (alongside any constructors with parameters). If a Command-type class takes one or more parameters to its constructor, that constructor should throw an IllegalArgumentException on bad input, and a checkArgs method should also be implemented. Lastly, each Command-type class should have an associated JUnit test class (Main and Processor have test classes as well).
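The four-component layout described above can be sketched as follows; method names follow the post, but the class bodies, the dispatch map, and the canned output are illustrative only:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class CliSketch {

    // The Command interface from the design above.
    interface Command {
        void run(String[] args);
        String getOutput();
        String getHelp();
    }

    // One hypothetical Command-type class; the real one queries WattDepot.
    static class CurrentPower implements Command {
        private String output = "";
        public void run(String[] args) {
            output = "Current power for " + args[0] + ": 123.4 kW"; // canned value
        }
        public String getOutput() { return output; }
        public String getHelp() { return "current-power [tower|lounge]"; }
    }

    // Processor parses a line of input and dispatches to the matching command,
    // returning either the command's output or an error message for Main.
    static class Processor {
        private final Map<String, Command> commands = new HashMap<>();
        Processor() {
            commands.put("current-power", new CurrentPower());
        }
        String process(String line) {
            String[] tokens = line.trim().split("\\s+");
            Command cmd = commands.get(tokens[0]);
            if (cmd == null) {
                return "Invalid command: " + tokens[0];
            }
            cmd.run(Arrays.copyOfRange(tokens, 1, tokens.length));
            return cmd.getOutput();
        }
    }

    public static void main(String[] args) {
        Processor p = new Processor();
        System.out.println(p.process("current-power Ilima"));
        System.out.println(p.process("bogus-command"));
    }
}
```

In the real program, Main would loop reading user input and hand each line to Processor just as main does here.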

Unfortunately, we were not able to implement Java Reflection, so most of the code is hardcoded. For example, if a Command-type class is added to the Command package, changes to Processor and Help must be made, and a corresponding JUnit test class must be created in the Test package. Hence, a future improvement would be to reduce the number of places the code needs updating whenever a new command or feature is added. Overall, this experience has been completely meaningful and worthwhile because of the real-world experience of working with others in real time to produce a single, smoothly functioning program. Communication, which could certainly have been better, was the key element of success, along with consistently working on the project over a long period of time. I found both communication and consistency quite challenging whenever one fell out of sync with the other: without constant feedback through communication, it becomes more difficult to gauge what each team member should be working on. Another critical learning experience was deciding what to do in the unexpected event of some greater force negatively impacting the project. The WattDepot server crashed with a hard disk failure - its first in about four years - and luckily, our group was able to step up to the plate and realign our project with the new situation. In conclusion, I feel that IDPM is much more difficult than it first seemed; moreover, the overall quality of our software could have been better and less hardcoded.
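Reflection would have let Processor discover a command class from the command's name instead of hardcoding the dispatch. Here is a rough sketch of the idea we didn't get to; the nested class, its name mapping, and the help string are all hypothetical:

```java
public class ReflectiveDispatch {

    interface Command {
        String getHelp();
    }

    // A sample command class; with reflection, adding a new one would not
    // require touching the lookup code below.
    public static class CurrentPower implements Command {
        public CurrentPower() { }  // reflection needs a no-arg constructor
        public String getHelp() { return "current-power [tower|lounge]"; }
    }

    // Turns a command name like "current-power" into the class name
    // "CurrentPower" and instantiates it reflectively.
    static Command lookup(String commandName) {
        StringBuilder className = new StringBuilder();
        for (String part : commandName.split("-")) {
            className.append(Character.toUpperCase(part.charAt(0)))
                     .append(part.substring(1));
        }
        try {
            Class<?> c = Class.forName("ReflectiveDispatch$" + className);
            return (Command) c.getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalArgumentException("Unknown command: " + commandName, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(lookup("current-power").getHelp());
    }
}
```

With this scheme, dropping a new command class into the package is all it takes; Processor, Help, and the tests no longer need hand-edited lists.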

Wednesday, November 9, 2011

Energy Drained is Energy Gained

WattDepot: A one-stop shop for all your energy and power needs

Complementing the previous blog entry on energy in Hawaii, I had recently gained firsthand experience in manipulating the output of energy data, all from the comfort of my own home, using the WattDepot API. WattDepot is a web service that allows a host computer to connect as a client to an existing web server. This web server collects energy and power usage data stored in meters and saves the information in a database, which the WattDepot client can access. The meters I was concerned with are located in the Hale Aloha Towers (freshman dorms) on the University of Hawaii at Manoa campus. The sheer proximity of the meters only made this task of working with live energy and power data more relevant and meaningful.

Not to say that it wasn't stressful.

Enter the WattDepot katas:

Kata 1: SourceListing

The task was to print out the name of each data source (where the meters are installed) across from its description. This kata was deceptively easy, given the wattdepot-simpleapp to start us off. A simple copy-and-paste from the simple application, with minor tweaks to the output format, let this kata breeze by in about 5 minutes.

Kata 2: SourceLatency

This seemed fairly straightforward: print out the sources and their latency values (in seconds). Another simple copy-and-paste from wattdepot-simpleapp got me off to a quick start. Wonderfully, it printed out the latency values and the sources just fine. Not so wonderfully, it didn't print them in ascending order of latency. I ended up taking the longest time on this kata, around 4 or 5 hours (not including breaks), because of time spent searching for a suitable Java Collections class that would allow duplicate keys to map to different values. I finally figured out that Java has no built-in library supporting this simple concept. While Googling, I discovered the Guava libraries, created and maintained by Google employees, who use them as the core libraries for their Java-based projects. It took less than a minute to download the jar file and import it into my WattDepot project in Eclipse. Now I could work with the Multimap interface, which allows a key to map to more than one value.

Unfortunately, printing out the keys and values of a Multimap in the format required by this kata seemed like more trouble than it was worth, and I felt the same way about writing my own Collections class. In the end, it was the first proper Collections package I ever wrote, CompactDiscCollection, that inspired me to use a TreeSet of String objects instead. Each String value was a concatenation of the latency and the source name. Yes, I would call this hardcoding, since it relied on the fact that my latency values were 2 digits long, but it was quite the shortcut, alleviating much stress.
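The TreeSet trick looks something like this sketch (the source names and latencies are made up; the zero-padding is the hardcoded 2-digit assumption mentioned above):

```java
import java.util.TreeSet;

public class LatencySort {

    // Builds the sorted set. TreeSet keeps its String elements in
    // lexicographic order, and zero-padding the latency makes lexicographic
    // order match numeric order -- that is the whole trick.
    static TreeSet<String> build() {
        TreeSet<String> byLatency = new TreeSet<>();
        byLatency.add(String.format("%02d %s", 7, "Ilima-08-lounge"));
        byLatency.add(String.format("%02d %s", 12, "Lehua-05-lounge"));
        byLatency.add(String.format("%02d %s", 7, "Mokihana-02-lounge")); // same latency, no clash
        return byLatency;
    }

    public static void main(String[] args) {
        for (String entry : build()) {
            System.out.println(entry); // ascending latency, then source name
        }
    }
}
```

Because the source name is part of each element, two sources with the same latency never collide the way duplicate keys do in a Map.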

Kata 3: SourceHierarchy

The task was to output a text-based visual representation of the hierarchy of sources and their subsources. I probably spent the second-to-least amount of time on this kata, for about 1 to 2 hours. Browsing through the relevant topic on the class Google Group definitely helped. I, like others before me, wasn't sure whether or not to include certain sources as top-level sources, since some of them were already listed as subsources for other sources. Ultimately, I didn't list them to avoid redundancies in the output. The time-consuming (but not difficult) part of this kata was parsing all the information returned by certain WattDepot methods. But learning a little more about java.util.regex.Pattern was quite rewarding.

Kata 4: EnergyYesterday

Here, the task was to retrieve yesterday's energy consumption data for each source. I believe this kata took me longer to complete than the previous one. Since it took such a while, I left in the code I originally wrote to acquire yesterday's year, month, and date (commented out, of course). To my surprise, I hadn't read the Calendar API closely enough, because there already exist two methods that can acquire yesterday's date. Setting the start and end timestamps to 00:00:00.000 and 11:59:59.999 PM was a simple manipulation of Calendar objects, but getting to know how these objects mix with XMLGregorianCalendar objects, and how timestamps actually work, was quite a bit to tackle all at once. What frustrated me most was figuring out which of the 5 existing Tstamp.makeTimestamp() methods from the WattDepot API to use.
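The timestamp setup boiled down to something like this sketch (plain java.util.Calendar only; the conversion to XMLGregorianCalendar and the WattDepot Tstamp calls are left out, and the two-element return is my own convenience):

```java
import java.util.Calendar;

public class YesterdayBounds {

    // Returns [startOfYesterday, endOfYesterday] in milliseconds since the
    // epoch, i.e. 00:00:00.000 through 23:59:59.999 of the previous day.
    static long[] yesterdayBounds(Calendar now) {
        Calendar start = (Calendar) now.clone();
        start.add(Calendar.DAY_OF_MONTH, -1);  // back up one day
        start.set(Calendar.HOUR_OF_DAY, 0);    // 00:00:00.000
        start.set(Calendar.MINUTE, 0);
        start.set(Calendar.SECOND, 0);
        start.set(Calendar.MILLISECOND, 0);
        Calendar end = (Calendar) start.clone();
        end.set(Calendar.HOUR_OF_DAY, 23);     // 23:59:59.999
        end.set(Calendar.MINUTE, 59);
        end.set(Calendar.SECOND, 59);
        end.set(Calendar.MILLISECOND, 999);
        return new long[] { start.getTimeInMillis(), end.getTimeInMillis() };
    }

    public static void main(String[] args) {
        long[] bounds = yesterdayBounds(Calendar.getInstance());
        System.out.println("yesterday spans " + (bounds[1] - bounds[0] + 1) + " ms");
    }
}
```

Note that Calendar.add handles month and year rollovers automatically, which is exactly the bookkeeping my hand-written yesterday code was duplicating.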

Kata 5: HighestRecordedPowerYesterday

This kata was quite a change from the previous one: we were tasked to find the highest power (not energy this time) recorded the day before for each source. The class Google Group and the kata specification helped me determine that 15-minute intervals - 96 queries per source - would provide sufficiently accurate data. But there were 64 sources, and with each source querying the database 96 times, producing the output took quite a while, which was not at all fun and friendly. I had to move somewhere with faster Internet and wait a few minutes before acquiring decent output (not without red error messages from a catch clause due to miscellaneous errors). I believe I took about 2 hours on this. This kata was definitely more trouble than fun.
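The shape of the computation is roughly the following, with a stand-in array in place of the 96 WattDepot queries per source:

```java
public class HighestPower {

    // Start of the i-th quarter-hour slot, given the start of the day in
    // milliseconds since the epoch; 96 slots cover the whole day.
    static long intervalStart(long startOfDayMillis, int i) {
        return startOfDayMillis + i * 15L * 60 * 1000;
    }

    // Highest of the readings; in the kata each reading came from a separate
    // WattDepot query, so this loop is where all the waiting happened.
    static double highest(double[] readings) {
        double max = Double.NEGATIVE_INFINITY;
        for (double r : readings) {
            if (r > max) {
                max = r;
            }
        }
        return max;
    }

    public static void main(String[] args) {
        double[] readings = new double[96];    // 24 hours x 4 quarter-hours
        for (int i = 0; i < readings.length; i++) {
            readings[i] = 100 + (i % 7) * 10;  // stand-in for queried power values
        }
        System.out.println("peak power: " + highest(readings));
    }
}
```

Multiplying this loop by 64 sources gives 6,144 round trips to the server, which is why the slow Internet connection hurt so much.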

Kata 6: MondayAverageEnergy

The last task was to compute and print, for each source, the average energy consumption over the two previous Mondays. It was simple enough to copy over the contents of Kata 4 and compute the offset to one Monday ago, relative to the current date. After that minor calculation, adding another 7 days to the offset did the trick for two Mondays ago. This probably took about an hour to complete. As in Kata 5, I had more trouble acquiring the data, perhaps because I was querying data from over a week ago.
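The day-of-week offset calculation can be sketched like this (a guess at the arithmetic, not my exact code):

```java
import java.util.Calendar;

public class MondayOffsets {

    // Returns how many days to subtract from 'now' to land on the most
    // recent Monday strictly before today.
    static int daysBackToLastMonday(Calendar now) {
        int offset = (now.get(Calendar.DAY_OF_WEEK) - Calendar.MONDAY + 7) % 7;
        return offset == 0 ? 7 : offset; // if today is Monday, use a week ago
    }

    public static void main(String[] args) {
        Calendar now = Calendar.getInstance();
        Calendar oneMondayAgo = (Calendar) now.clone();
        oneMondayAgo.add(Calendar.DAY_OF_MONTH, -daysBackToLastMonday(now));
        Calendar twoMondaysAgo = (Calendar) oneMondayAgo.clone();
        twoMondaysAgo.add(Calendar.DAY_OF_MONTH, -7); // one more week back
        System.out.println("one Monday ago: " + oneMondayAgo.getTime());
        System.out.println("two Mondays ago: " + twoMondaysAgo.getTime());
    }
}
```

The modulo trick absorbs the DAY_OF_WEEK numbering (SUNDAY=1 through SATURDAY=7), and subtracting 7 more days reuses the same machinery for the second Monday.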

Conclusion

To be honest, the title of this blog post is the first thing that popped to mind. Yay, something catchy. Yay, it rhymes. But after getting this far in the post, I realized that the title is not without purpose, for indeed, energy drained (into these katas) is energy gained (I feel much, much more knowledgeable about Java APIs and programming lessons learned in general). I learned that the folks at Google are even more awesome than I knew, because they recognized the deficiencies in the Java Collections API and shared their extensions to it with the world. Despite this fact, I have also learned that no matter how many collections are created by others in the world of open source, you will usually find that a certain promising-looking collection just doesn't meet your needs exactly as you wish. I have also practiced the art of lazy programming by pressing String objects and Java's TreeSet into service instead of going through the trouble of writing a collection class to suit my needs. Hardcoding was my friend during this time of stress, as I was pressed beyond reasonable time as it was. Regardless of it all, I have never been better off. Energy data manipulation is just plain cool, and something quite useful for the future.

Tuesday, November 1, 2011

Small Steps for Clean Energy in Hawaii

Growing up in Hawaii, I was always warned by my parents to stop playing video games or to turn off electronic devices because the cost of electricity was so high. But it wasn't until today, thanks to these screencasts, that I discovered just how high that cost is: roughly three times the mainland's, at $0.30 per kilowatt-hour (kWh). Leaving these devices on for most of the day is certainly not a smart idea, but it's quite a difficult habit to break when one is a computer science major. I realized that if I want to play a part in the Hawaii Clean Energy Initiative, there's no better time to start than now.

The Hawaii Clean Energy Initiative was signed off by former Governor Linda Lingle in January of 2008 as a partnership between the state of Hawaii and the U.S. Department of Energy. It requires full participation and support from all Hawaii state residents if, together, we are to achieve 70% clean energy by the year 2030. 70% clean energy means that 40% of the projected energy in 2030 is generated from renewable resources and 30% of the projected energy is reduced by means of conservation and other efficiency measures.

Yet 40% renewable energy is just the start. What makes Hawaii an even better place to live, apart from its unique geography, climate, and mix of people, is that, unlike any other state, it has the potential to generate all of its power from renewable resources, with almost every source of renewable energy available: wind, wave, solar, geothermal, and more. Renewable resources are generally more cost-effective for Hawaii, and because of its small size compared to other states, Hawaii's energy needs are modest enough that an all-renewable energy supply is possible.

Negative consequences loom ahead if we do not clean up our act now. Hawaii still generates most of its energy from imported oil, which is extremely expensive, and the price is only likely to climb higher as the years go by. Furthermore, Hawaii currently generates its energy inefficiently: each island has its own source or sources of energy, and not one grid is connected to another. These unconnected energy grids aren't what we'd call teamwork. If we don't do something about this, not just the prices of gas and electricity but also those of water, food, clothing, and more will go up.

In conclusion, if Hawaii is to remain the dear place we call home, we must take it upon ourselves to conserve energy, as much as we can, on a daily basis. Nothing big was ever achieved in a giant leap. It's the small steps that we take each day that truly count in the end.

Tuesday, October 25, 2011

The Art of Shining within Steel Confines

Freedom in America doesn't come cheap. You still have to follow the laws within society, which are as strict as steel, in order to be considered a good and civilized citizen. The same goes in the software engineering world. Most of you will find yourselves in a business that is not run by you. There is little freedom to code in whatever way you desire or to say no if a software review is requested of you when you have so many other matters of importance to attend to. Midway through this semester, I've learned that I've missed out on so much over the past two years. I thought that if I followed the norms and did all that was required of me in my classes, I'd be on the road to success. Apparently, that thinking had always been far off-key. The following questions and my answers to them reveal that shining within steel confines is an art, not a sequential computer program as I had thought it to be all along.

1) What is the difference between standards and feedback? What are the many ways in which you receive feedback in this class?


Standards and feedback are vital to the three prime directives of open source software. Standards allow people to work together in a way that eliminates unnecessary surprises and technical issues. With standards in place, people can worry less about doing the right thing and focus more on the quality of their communication with others. Feedback, on the other hand, is the key to improvement. Success in the software engineering world is not an end goal but constant improvement, and with constant feedback, success can be achieved. While standards provide structure, feedback shapes personal and professional growth. In this class, we receive feedback from partners or group members, other classmates, the professor, the designated mailing list, automated quality assurance tools such as Checkstyle or FindBugs, and the outside online community.

2) List the four ways in which your professional persona is assessed in class and elaborate on each of them.


Your professional persona is how the world views you as a professional - a person with valued thoughts and valued skills. Here are the four ways in which professional persona is assessed in class:

1. professional portfolio - A place where you can boast about yourself (and only yourself). Do so in a professional manner, of course: no endless anecdotes about your personal life. Instead, highlight your industry-relevant skills and achievements. Detailing projects you have worked on is especially valuable, as it gives potential employers deeper insight into what you're actually capable of.

2. online engineering log - A blazing hub for your professional thoughts on industry-relevant tools and matters. This time, the focus is not on yourself but on those tools and matters. Write for the world, but keep in mind that your target audience is potential employers. They would use this log to assess your ability to communicate effectively through writing.

3. participation in professional social networks - A portal into valuable unending knowledge. This allows potential employers to witness firsthand how you contribute to industry-relevant discussions and how much you care about colleagues and others in the business. Furthermore, this will contribute to your personal and professional growth, giving you a broad range of views that you simply cannot obtain from just working with one or a few companies.

4. participation in open source projects - A concrete showcase of your expertise. Anyone can say they can write code, but the question in most employers' minds is, How well can you write code? Open source projects allow employers to examine actual code you've written and shared with the world. They also give employers an idea of how well you fit into a group mold: Do you follow your own standards or the standards of the existing code?

3) Name one way, outside of class, that you are encouraged to enhance your professional persona. How would this benefit you?


Participation in technical societies and activities further enhances your professional persona. Such technical societies include IEEE, ACM, SWE, and Honolulu Coders. Technical activities include chatting on IRC, competing in TopCoder competitions, and practicing information security skills on wargaming sites such as SmashTheStack.org. Participating in any of these is a remarkable benefit because you gain skills and knowledge you would otherwise not get from reading a textbook or taking a class. Such participation also gives you a lot more to talk about during job interviews, and if you win a technical competition, you can add that to your resume and professional portfolio.

4) What are the three questions to ask yourself when conducting a software review? Why is each of these important?


A software review is a vehicle for further growth of technical skills and knowledge. The ability to spot others' coding faults is crucial and helps you better assess your own code. In addition, the code you're reviewing may have functionality very different from your own, even if the two of you are working on the same product. In the long run, then, it's best to understand at least the key functionality of the code you're assessing, and where its vulnerabilities may lie, so that you avoid repeating the same mistakes and gain a better understanding of the product as a whole. When conducting a "decent" software review, here are the three questions to ask yourself:

1. Is my review revealing problems?
Importance: A software review takes time away from writing more code, so why conduct one that won't accomplish its purpose? That purpose is to reveal problems with someone else's code - problems they easily skip over or normally can't find on their own. On the job, time is money. In general, make sure what you do is relevant and achieves its purpose; otherwise, you're not helping the business, the other software engineer, the customers, or yourself.

2. Is my review increasing understanding?
Importance: What good is a software review when everything you've written sounds like gibberish to its target audience - the software engineer who wrote the code? All your efforts to improve the system go to waste if you don't take a little extra time and effort to enhance that engineer's understanding of their own code. In the process, you help yourself too: you feel better for having helped someone else, and you gain deeper insight into the product under development.

3. Is my review doing (1) and (2) efficiently?
Importance: Yes, take a little extra time and effort - but not a whole lot! The key to "time is money" in a business is balance; there should be a balance between every component of your role on a software development team. What good is a software review that takes two weeks to complete when a summary report of new updates to the system is required each week? Use good judgment, common sense, and past mistakes to gauge how long a review of a particular piece of code should take.

5) Give some examples of why the saying, "It's [the professor's] way or the highway," is quite prevalent in this class and relate it to the real world.


Part of the answer lies in the introduction to this post: there is little room for freedom when software engineers must conform to standards and follow norms, but without that conformity, chaos would ensue. As we all know, understanding and debugging another's code is quite different from doing so with your own. For the purposes of this class, one example of "the professor's way" is our use of the Eclipse IDE, which goes hand in hand with our use of Java. Eclipse is used mainly because it runs on multiple OS platforms, it is free and open source, and it integrates easily with other tools such as Ant, JUnit, and SVN. We code in Java, even though there are many other useful languages, because it is one of the most widely supported languages today, with a complete API and a smorgasbord of tools at its disposal. I can well imagine that if we were allowed to use other IDEs or languages in this class, we would get less work accomplished and acquire software engineering skills and knowledge at a much slower rate. In the real world, the chaos and destruction would be tenfold, and much more serious and irreversible. Nevertheless, there's an art to shining within steel confines: follow their way and never take it to scorn. If you learn what you can outside the edges, you're paving the highway and calling it your own. In time, others will recognize you for it.

Thursday, October 20, 2011

Submersion into Subversion

These days, when we hear brand names such as Apple, Microsoft, and Google, we think of one software product, one hardware product, one company. We hardly stop to think about what lies past the name, past the "one." In fact, it is a multitude of people who make a product, not just one company. And they make the product in time for its launch date, continuing to provide updates and support afterwards. Let's focus on the software for now. How is it possible not to transform into some wild, green-eyed lost soul when facing millions of lines of code, written not just by you but by others as well? The answer lies in configuration management.

Configuration management tracks the state, or configuration, of a system at any point in time, so that changes made to the system are logged and different versions of the system, old or new, can be retrieved for different purposes. It aims to solve the issues that arise when multiple programmers work on a single software product simultaneously. Even with simple compilation or verification issues, the quick fix may not become apparent until hours or days into debugging. With a configuration management tool, this can be avoided, since the previous version of the system, from before the offending changes were made, can simply be checked out again.
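To make the rollback idea concrete, here is a hedged sketch of how it might look with Subversion, the tool we use in class. The revision numbers and repository URL below are hypothetical placeholders, not a real project.

```shell
# Inspect recent history to find the last known-good revision.
svn log -l 5

# Bring the working copy back to revision 41 (hypothetical), from
# before the offending changes were committed.
svn update -r 41

# Or fetch a clean copy of that revision into a separate directory
# (the URL here is a made-up placeholder).
svn checkout -r 41 https://example.googlecode.com/svn/trunk project-r41
```

Being able to step backwards like this is exactly what spares you hours of debugging a breakage someone else introduced: you compare the broken tree against a revision you know worked.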

Google Project Hosting and SmartSVN for Mac (TortoiseSVN for Windows) are an excellent duo for amateur developers to kick off with configuration management. Add Robocode or another working project into the mix, along with classmates or friends as committers, and you can host your very own project in no time. Including easily accessible User and Developer Guides is standard practice and shows others that you intend to treat them, and your work, professionally. The repository checkout URL on Google Project Hosting appears under the Source tab, and a Subversion client (SmartSVN or TortoiseSVN) gives the host and committers the ability to add files and commit changes. The Changes link under the Source tab and the Updates link under the Project Home tab display any changes made on the site.
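The day-to-day cycle described above can also be sketched at the command line, underneath what SmartSVN or TortoiseSVN do graphically. The project URL and file name here are hypothetical placeholders:

```shell
# Check out the trunk of the project into a local working copy
# (URL is a hypothetical placeholder).
svn checkout https://my-project.googlecode.com/svn/trunk my-project
cd my-project

# Tell Subversion to start tracking a newly created file...
svn add MyRobot.java

# ...then publish the change so other committers can see it.
svn commit -m "Add initial MyRobot implementation"

# Pick up changes committed by teammates since your last sync.
svn update
```

The checkout/add/commit/update rhythm is the whole workflow in miniature; the graphical clients simply wrap these same operations.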

Overall, this was quite a refreshing learning activity and not difficult at all. The only minor hurdle was my initial attempt to upload my project files to the trunk directory of my robocode-bma-[nameofrobot] project. Otherwise, brief communication with my classmates and watching the screencasts made this an easy dive into configuration management.