Wednesday, August 20, 2008

Pick the transformer

I finally got around to making a fun little website where you are shown two classic Transformer toys from the '80s and have to pick which one you get to keep and which one ends up getting traded away. The site is called Pick the transformer. It keeps track of the votes, ranks all of the toys, and shows some basic stats on the voting. It is fun to play with, and it is easy to find yourself clicking and clicking.

Back in 2000 a site called amihotornot.com (now hotornot.com) was launched. The site showed the viewer a photo of a girl and you got to rate her 1-10. While not a perfect system, it was simple enough to get its internet moment of fame. A little while later another site called pickthehottie.com was put up that showed two images and made you select which girl you found more attractive. Girls submitted their own images because they wanted to see where they rank, and guys, well, they want to look at girls. The sites have expanded since then to have all sorts of options and extras, but at the core they are just trying to rank photos.

The algorithms behind the two sites are very different, and I believe pickthehottie's is much better than hotornot.com's. On hotornot you are forced to rate a photo from 1-10 with no guidelines as to how to make judgments. While it is simple enough to say that photo X is near the top if most votes put it at a 10, you can't know with very good accuracy where the photo ranks in relation to the other photos that are mostly 10s (or, put another way, who is the hottest?). This is further complicated by the fact that most photos will cluster around a few numbers such as 1, 5, 7, and 10. To rate anything from 1-10 you first need a mental baseline for the two extremes of 1 and 10. The first few votes are very speculative, and it isn't until you have seen a number of images that you can rate them consistently. Even once you are consistent, you must constantly compare the current image with all of the previous images you have seen, which can cause a 7 to become a 9 or a 5 over time as you get more input or simply change your opinion.

Contrast this with pickthehottie and I think many of those problems go away. Given the choice between two images, users only have to pick the one they like more. This is usually so easy that users don't even think about why one photo is hotter, they just know it. It is only when faced with two very similar photos that the user has to make a hard choice. Behind the scenes every photo keeps two lists: the photos that were voted hotter than it and the photos that were voted cooler. Just from the percentage of wins you can generate a ranking for each photo. Taking it one step further, you can fine-tune the ranking by only counting the votes that were placed when comparing two photos with similar percentages.

This was a fun little two-day project that came together very quickly and is fun to use.
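To make the pairwise idea concrete, here is a minimal sketch of how such a ranking could be computed. The names (PairwiseRanker, recordVote, and so on) are made up for illustration and are not the actual site's code; each photo is simply scored by the percentage of votes it has won.

```cpp
// Minimal sketch of pairwise win-percentage ranking (hypothetical names,
// not the actual pickthehottie implementation).
#include <QHash>
#include <QList>
#include <QString>
#include <algorithm>

struct VoteRecord {
    int wins = 0;
    int losses = 0;
    double score() const {
        int total = wins + losses;
        return total ? double(wins) / total : 0.0;
    }
};

class PairwiseRanker {
public:
    // The user picked 'winner' over 'loser'.
    void recordVote(const QString &winner, const QString &loser) {
        m_records[winner].wins++;
        m_records[loser].losses++;
    }

    // Photos sorted by win percentage, best first.  A refinement would be a
    // second pass that only counts votes cast between photos whose scores
    // are already close to each other.
    QList<QString> ranking() const {
        QList<QString> photos = m_records.keys();
        std::sort(photos.begin(), photos.end(),
                  [this](const QString &a, const QString &b) {
                      return m_records.value(a).score() > m_records.value(b).score();
                  });
        return photos;
    }

private:
    QHash<QString, VoteRecord> m_records;
};
```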

Friday, August 15, 2008

Three years in Europe with Trolltech

After three years at Trolltech, today was a sad day as it was my last day. When my wife and I originally decided to move to Norway we wanted to stay for three years, and it is satisfying that we were able to reach that goal. The time here has been wonderful, sometimes stressful, but something I will always remember. I thought it would be worth going back and putting together a journal entry of the projects I was involved in and the photos I took.

In January 2005 my wife and I visited Oslo for an interview at Trolltech. The day before my interview my wife and I walked to the address to make sure I could find the building. There we discovered that Trolltech was located in a somewhat rundown building, but the next day when I went inside I found it to be much nicer and, more importantly, filled with interesting people working on captivating projects. I took the job and six months later we moved to Oslo. It turned out we arrived in Oslo on the day of the big Qt 4.0.0 party; unfortunately I didn't know about this and we stayed at the hotel. There were around thirty developers, and Trolltech was still small enough that the computer at your desk had an actual internet IP.

Being new to Trolltech and Qt 4, I figured I would be assigned some minor task in a rarely used bit of code, but after getting set up I was asked to add a feature to QWidget, and my first commit was adding support for freedesktop.org startup notifications. After using Qt for eight years, the idea of adding code, a feature no less, to QWidget was amazing.


The first three months we stayed at the Trolltech Markvien apartment, which is just a ten minute walk from the office. When we flew to Norway we came with just four suitcases. There was no internet at the apartment, and as my blogs from back then can attest, I took advantage of the Trolltech library and read through a number of the books. Every weekend for the first two months Jen and I went to one of the tourist attractions around Oslo, because we knew that if we didn't do them then, we wouldn't at all. We went to the ski jump, the art museum, the viking museum, the fjord, the statue park, and many more places. Moving to Oslo we sold both of our cars and were now walking everywhere. Along with visiting attractions we would frequently just go off walking around Oslo to see what we could find. There is a path next to a river that flows through Oslo, with waterfalls, that we would walk down to get to the center of town. It was a very different experience than we were used to back in the states, but the biggest shocker has to be women on bikes. It is socially acceptable for everyday adults to ride bikes, even a woman in a skirt, which is something you would never see in the states.

On the first day of work I asked Marius (another developer at Trolltech) where the grocery store was, and he told me it was across the street on the way to the apartment. I was very skeptical of this as I had walked to work that morning and hadn't seen any grocery store. That evening on the way home I went into the place where he said it was, to discover a store about the size of a seven eleven. This, I was very quickly to discover, was the average size of a grocery store in Oslo. It has most everything you need, just not fifty choices of everything. Rather than one massive store per town there are many little stores all over that are within walking distance. It makes a lot of sense when you think about it, but coming from Super Stop-n-Shops it was a bit of a culture shock. Of course everything tastes different too, but no doubt after living here for so long, when I return I will fondly miss the food that I could only get here.

A month after arriving I went to my first aKademy, in Spain. We went for the full ten days and in typical American fashion Jen and I both brought a suitcase, while everyone else from Trolltech only had backpacks. A good lesson to learn for all of our future trips in Europe. As it was my first aKademy I finally got to meet everyone I had interacted with online and on the mailing lists. I had a blast going to all of the presentations, discussing endless ideas and possibilities, and even getting in a bit of hacking. And after many years I finally got a stuffed Konqi, which has sat happily in my office. We made grand plans thinking KDE4 could be out in less than a year, but little did we realize how wrong we would be. For me, post-aKademy, I spent a bunch of time cleaning up kdelibs and in particular reducing KApplication to the point where I was able to run a KDE application that used QApplication.

Just a few months later, on December 19th, Qt 4.1.0 was released. The biggest addition for me was qtestlib. The super cool little test framework that had been part of Qt-addons was now part of Qt. During aKademy that year it was decided that including it in Qt would add a lot of value and reduce the duplicated effort that had begun within KDE to make its own test framework. I am very big on testing and ended up writing a tool to generate autotest stubs for you. All you have to do is pass in your class header and out comes a file, just waiting for you to flesh out, that tests your entire class. And when you think you are done writing tests, you run them through valgrind and feed the callgrind file into the new code coverage tool I wrote to discover just how much of your class you didn't test. And for anyone making a custom QAbstractItemModel, I wrote the Modeltest to help you make your model better and more bug free. One thing I always wanted to see was releasing the Qt autotests with Qt, which might (unless it doesn't) happen later this year.
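For those who haven't used qtestlib, here is a rough idea of the kind of skeleton such a stub generator would spit out. The class under test (Counter) and the test names are made up for illustration, but the structure follows qtestlib's usual pattern of one private slot per test plus an optional _data() slot for data-driven tests.

```cpp
// A minimal qtestlib skeleton of the kind a stub generator might emit.
// 'Counter' and the test names are hypothetical.
#include <QtTest/QtTest>
#include "counter.h"

class tst_Counter : public QObject
{
    Q_OBJECT

private slots:
    void increment_data();   // table of inputs and expected results
    void increment();        // one test function per public method
};

void tst_Counter::increment_data()
{
    QTest::addColumn<int>("start");
    QTest::addColumn<int>("expected");
    QTest::newRow("from zero") << 0 << 1;
    QTest::newRow("from ten") << 10 << 11;
}

void tst_Counter::increment()
{
    QFETCH(int, start);
    QFETCH(int, expected);

    Counter counter(start);
    counter.increment();
    QCOMPARE(counter.value(), expected);
}

QTEST_MAIN(tst_Counter)
#include "tst_counter.moc"   // assumes this file is named tst_counter.cpp
```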

During this time Trolltech was growing, and in December 2005 we moved to a brand new building with plenty of empty space for us to grow into. Marius and I traded a view of a waterfall for a view of a hill, both better than any office view I have had in the states. Over the next year the development team at Trolltech doubled as we expanded. The new office was big and much more in line with Trolltech becoming a company rather than a group of hackers, but that didn't stop us from having a paper airplane contest off our 6th floor walkway.

In June of 2006 Trolltech sponsored a cabin trip for the KDE developers to hack on KDE4. Developers from all over the world came to a little town in Norway and spent a week hacking in a very big cabin. Since the last aKademy, KDE4 had deteriorated and was in need of some work. The first two days were spent fixing up KDE left and right, and it was there that I was finally able to see and run KDE4 for the first time. It was also there that QtWebKit was born. It was amazing to watch as the guys discussed the future of KHTML, grabbed the WebKit source, and hacked on it until, at 2am on Zack's screen, there was the Google homepage. One of those moments you can't plan for or buy tickets to. Meanwhile I was fixing something in the KDE about dialog; depressed at the comparison, I went to bed. While we were there the World Cup was going on, Trolltech IPO'd, and the various outdoor meetings we had were just fantastic. For me personally this cabin trip ranks at the top of my entire time in Europe.

A few short months later, Qt 4.2.0 was released on October 10th 2006. That date had been set in stone to be in time for Dev Days, and after the fact everyone agrees that it was pushed out too early, a mistake we have tried not to repeat since, pushing back releases by a few weeks when we knew that the quality would be significantly better. Unsurprisingly, Qt 4.2.1 followed very shortly after. At some point after 4.1.0 (I don't recall when exactly) Mathias gave one of his fantastic speeches, which set one of the big directions of 4.2: better integration with the desktops. For my part I wrote QDesktopServices, but there were many more additions such as QCleanlooksStyle, QNetworkInterface, QSystemTrayIcon, QKeySequence's standard shortcuts, integration with Glib, and more. Another new class I got involved with (and wrote a few early versions of) was QTimeLine, something I have had lots of fun using ever since it was created. 4.2 also saw the introduction of the eye candy changelog.

Around this time I got involved with the Netflix Prize and ended up making a framework that used Qt, including a little app that you could use to view the entire database. While making the tool I improved Qt so that QTableView could display all 101 million rows of data. I didn't win the prize, but I had a lot of fun, learned a lot, and if I ever get access to a farm of computers I know exactly what I am going to mess around with. And to top it off, every few months in the mail I get a thank you book or DVD from my Amazon wishlist, which is very cool.

For the first time since moving to Oslo, Jen and I returned to the states for Christmas. After living in Norway for a year and a half we could both speak very bad Norwegian. With no real Norwegians around we happily talked to each other using our horrible pronunciation and bad sentence structure. We could understand each other and we were happy. While in the states I got a MacBook and have had it automatically take two or three photos every day for the past two years. Looking through them there are many photos of people, books, and other images of everyday life that I would normally not have taken a photo of. It is nice to have captured memories that would otherwise have been lost.

Returning to Oslo I formed a group and wrote up a proposal to get a Wii for Trolltech. A month or so later a box arrived with a Wii and extra controllers, which was promptly set up (the Wii being somewhat easier to obtain in Europe, we only had to wait a month). We have had tennis tournaments and many after-lunch Bomberman93 (an absolutely fantastic five person, ten minute, easy to learn, multiplayer game) sessions ever since.

Looking back I was surprised to discover that I have been using Git for over a year and a half now. One of the first things I used it for was to port an annoyingly addictive Qt 2.x game called Anigma to Qt 4.x. The big thing in my life was that I cut my hair, which had been very long. The first few weeks were very freaky and I kept reaching back to move it out of the way or to play with my phantom hair.

On my creative Fridays I had been hacking on a little class called QColumnView. A nice little class that I am proud of, it got into Qt 4.3, which was released June 1st 2007. Also in 4.3 I wrote a new QFileDialog. The majority of the work was in the model, but I also tried out a new look and feel for the dialog that was a hybrid of OS X's and Vista's with animations and other 'improvements'. Everyone completely hated it and it got torn to shreds online, so I reverted the interface back to the 4.2 one. The biggest thing for me in the 4.3 release was QtScript, the kick ass, fast scripting engine that I had been messing around with for several months and used to play around with genetic algorithms. I even wrote a QtScript profiling tool. Overall Qt 4.3 was a good solid release, with more improvements than features I would say.

In the fall of 2007 I finally sat down and started to learn Lisp, something I had always wanted to do but just never made the time for. Unfortunately a much more exciting project was just around the corner that took my free time away...

Shortly before Christmas, while at lunch one day, I had a conversation with Simon and Lars about the QtWebKit module, which had just been merged into Qt. I had been asking if there were plans to make a little browser that could be used to help test the module. In WebKit there is the QtLauncher, but it was more for developers and not something that could be used for day to day browsing. They were also interested in such an application and I offered to help make one. All through Christmas I hacked, and my last blog entry of 2007 gave a hint about what I was working on and what would become the demo browser. Starting with the cookie jar I took each feature one by one, fleshing them out and only integrating them when there was a manual test and matching autotests. I spent all of my free time hacking on the browser and then part of my time at work tracking down and fixing bugs in QtWebKit and the new networking code. I was ruthless with any segfault I found, and by the time 4.4.0 was released QtWebKit was pretty stable. One of the very first QtWebKit bugs I fixed was adding support for tooltips so xkcd's tooltips would work. Unlike Firefox 2, which would cut off long tooltips, QtWebKit tooltips could word wrap. Just like my first commit to QWidget in Qt, the feeling of committing a patch to WebKit was really cool.

In February Trolltech had its annual ski trip in the mountains. Before we left I printed out all of the code for the demo browser and, with the help of other developers, did a complete code review on the bus and in the cabins. This helped improve the code for the announcement of the little project. One day I was asked if I would be interested in writing a QtWebKit article for Linux Journal. Having never been published in a magazine before I jumped at the opportunity, and a few months later I got three Linux Journal magazines in the mail; it was really neat to see an article I wrote in print. Shortly before Qt 4.4.0 was released I forked the demo browser and created Arora. Although many people had been interested in seeing it forked much sooner, I didn't want to do that until 4.4.0 was released, to get as much feedback as possible from everyone trying out the Qt 4.4.0 beta.

Around this time it was announced that Nokia was going to buy Trolltech. For me the biggest bummer had to be watching them take the Trolltech sign down off the building the night before we became Nokia. Overall Nokia has not caused any major changes of doom as some feared, and I expect Qt will continue to improve, grow, and be even more open in the future. Trolltech has a lot of smart people in it and they won't let anything happen to Qt.

On May 2nd Qt 4.4.0 was released. This was while I was in Italy, and as Qt got lots of press the demo browser also got mentioned in various places (with lots of questions ending up in my inbox), but I was unaware of all of this until I got back, as I didn't think the 4.4.0 packages were going to be released until the end of the month. I had a few stress filled days catching up before things returned to normal. Other than the demo browser I had also gotten in QSystemSemaphore, QSharedMemory, QLocalServer & QLocalSocket, and the QFileSystemModel. But my favorite addition has to be QFile::map, something I have implemented several times on personal projects. Qt 4.4.0 also had a number of brand new features and I think that this release was the best .0 that Trolltech has had to date. One feature of Qt 4.4.0 that I think is a hidden gem is QtConcurrent. I have used QtConcurrent in some personal projects since it was first announced, and the ease with which it lets me use both of my cores to double my speed still amazes me. With the possibility of having six and eight cores very soon, the speedup in performance will not be insignificant.
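As a rough sketch of what I mean, here is the classic QtConcurrent::mapped pattern: scale a batch of images using every available core. The makeThumbnail function and the image list are made up for illustration, not code from any real project.

```cpp
// Sketch of QtConcurrent::mapped spreading work across all available cores.
// The thumbnail function and image list are illustrative.
#include <QtConcurrentMap>
#include <QFuture>
#include <QList>
#include <QImage>

QImage makeThumbnail(const QImage &image)
{
    // Runs on one of the threads in the global thread pool.
    return image.scaled(128, 128, Qt::KeepAspectRatio, Qt::SmoothTransformation);
}

QList<QImage> makeThumbnails(const QList<QImage> &images)
{
    // mapped() returns immediately with a future; asking for the results
    // blocks until every image has been processed in parallel.
    QFuture<QImage> future = QtConcurrent::mapped(images, makeThumbnail);
    return future.results();
}
```

On a dual core machine the work is automatically split between two threads; on a six or eight core machine the same code will use them all without any changes.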

I am leaving before 4.5 is released, but I have been able to squeeze in some stuff, including the new QTabBar and QTabWidget improvements and a cache for the HTTP backend in QNetworkAccessManager. I am looking forward to the 4.5 release.

Around this time Jen and I knew that we would be returning to the states, so we made trips to Italy and a camping trip to Trolltunga. While living in Europe I got the opportunity to visit Spain, Germany, France, Ireland, Scotland, England, and Italy, and Jen visited even more. Because the trips were spaced out we took our time in each place rather than rushing from one to the next during a month long Europe vacation. I have tried to limit my blog to technical articles, while Jen's blog contains many more of our everyday adventures, from visiting cities to laughing about the food we would find.

While visiting Europe can be a lot of fun, actually living here has opened my eyes to all sorts of things. The idea that we could get by without a car is one we could not have imagined three years ago, and yet somehow it works out just fine. With so many people from work walking to get there, it is no surprise that on an evening stroll or walk downtown you will run into someone you know. The closeness of the community is very nice. Many of the guys at Trolltech have stories from when they were in the army; unlike in the states where you choose to sign up, here they don't really have a choice, and although I knew that some countries did that, it didn't really hit me what it means until talking to many of them.
Another difference was the banking system, which in Norway is far ahead of the states. No one has a checkbook and my bank didn't even accept checks, all of your bills are paid online, and if you want to send money to a friend it is easy to do online. Even taxes are calculated for you by the government, and if there are no errors you can send an SMS to confirm the form and you are done.
The metric system, the 24 hour clock, A4 paper, the currency, and other things you don't think about when visiting but have to deal with when moving were just a few more things we had to adjust to. At least with the 24 hour clock I plan on continuing to use it, and on getting a 24 hour alarm clock in the states so I will never set the alarm for 7pm when I meant 7am. Overall, living in Europe has been a fantastic once in a lifetime experience and I am glad that I did it.

While I was at Trolltech they managed to put out consistent releases with new features and improvements. From the time I started to my last day they strove to be open: making the Windows version of Qt GPL, launching labs.trolltech.com, creative Fridays, and involvement in many open source projects such as WebKit, Git and KDE. They have raised the bar for what a company can do. But none of this would have been possible without the people behind the scenes. Above everything else, coming to Europe and Trolltech has been about the people I have met and worked with. Trolltech is filled with some of the smartest and most interesting folks I have met, who enjoy what they do and it shows. They are friendly, welcome you into their homes, and will help you out when you are completely confused.

Even though I am leaving Trolltech and returning to the U.S., I plan on continuing to use Qt, and will probably submit a patch or two. Arora has really taken off, with an active set of users and developers, and I am really looking forward to what it will become. Hopefully I will return to Europe to attend aKademy and get to see everyone again.

And to wrap it all up I thought it would be fitting to have the Qt4 dance.

Thursday, August 07, 2008

Code Review Should Be Free

Depending on the definition, "code review" can mean a wide variety of things, such as formal code review or automated code analysis. For this article I am talking about when a developer has a patch to some code base and would like someone else to review it before it goes into the main repository.

Just the thought of having a code review will frequently cause developers to spend extra time making sure a patch is as good as it can be. Code reviews typically catch bugs, design flaws, missed edge cases, inconsistent or confusing code style, and more.

There are several core ideas behind code review; the more of them that are followed, the more successful the technique will be.
- Review of the code by one or more persons before committing it to the central repository.
- Record comments about a patch so suggestions are not forgotten as a patch changes.
- Patches should be easy to review and test.
- The developer shouldn't feel like the code review is holding them back or that they are waiting for a review before they can continue working.
- Reviews should integrate easily with the existing work flow.

With only these five requirements it shouldn't be too hard to do. So what are the most common ways that developers do code reviews?

Email pass-around

The most common form of code review I have seen is the mailing list of commits. It is so common that even when you find some other technique in use you will often still find this one running too. This is mostly because it is "the example" of how to add a hook to your revision control system, and it is usually extremely easy to set up if it isn't provided for free. It works by having a hook in the primary code repository that sends an e-mail to a mailing list with the diff of every commit. This of course fails the first and most important requirement, which is to get the review before committing. The idea is that everyone reads the commits they are interested in and replies to the mailing list when they spot something wrong, which is then fixed by another commit. In practice most commits are not reviewed and most people read only a few random commits before marking them all as read. This only gets worse as the project grows and the volume of mail on the mailing list increases, to the point where no one reviews anything. To top it off, when you see some code that works but has a few confusing variable names, you won't say anything. In fact it takes quite a bit for anyone to actually hit reply and comment on a patch: code that doesn't compile, a possible security bug, the introduction of bad public API, or some other major and obvious bug. And this works both ways: the committers know that they can get away with it, so they don't make their commits as polished as they could be.

Cons: By the time the e-mail goes out to the mailing list the patch is already committed into the repository. Once the code is in the repository it is much harder to ask someone to make smaller fixes, and doing so can even be taken as an insult. In practice an extremely low number of actual reviews get done.

Pros: Having little to do with code review, this method is often best for occasionally letting you see what is going on elsewhere in the project. In fact it has spawned a whole sub-world of digesting commits and writing up reports, such as the KDE commit digest.

Over the shoulder

Less common in the open source world (and in the corporate world) is the in-person code review. This is where you have someone physically next to you while you explain the patch. While better, it has its own drawbacks. When you go over to do a code review for someone else, they have usually been sitting around for five minutes waiting for a code review and quickly walk you through the code. Because you are face to face you can ask any question and quickly determine if the developer has explored all the options you can think of. You can ask if the unit tests have been run, if there is an example, etc. At the end of the day the results vary wildly, with some reviews being in depth while others are just a quick glance. When this is the primary code review method and the person who would normally review some code is not available (vacation, sick, out of the office), the commit might go in without any review.

Pros: Face to face reviews can be very quick and can easily turn into pair programming. The process of explaining the problem and the patch can often cause the developer to think of edge cases to test themselves. Unrelated to reviewing, this is also one way to teach someone about a piece of code.

Cons: Many times the reviewer can feel rushed as they try to digest code that the developer quickly explained. The developer tells the reviewer the patch is correct before the reviewer has even reviewed the code. The developer who is best suited to review the patch might not be around, and a developer who does not know the code, and is not as able to properly review it, gets drafted into doing it.

E-Mail diff

A bit of all of the above: a diff is passed around through e-mail, which can spawn a thread of comments. The biggest downside is that there is a higher cost of entry, and people are less likely to send smaller or simpler patches via e-mail. Often the only patches that are sent via e-mail are larger ones that go through a number of revisions, and yet the code being e-mailed around isn't in a revision control system, losing out on the ability to see the diff between versions. Because the developer resends the entire patch every time they make a change, the more revisions that are sent in a thread, the less actual reviewing gets done.

PasteBin

In the past few years I have seen the rise of paste websites such as http://paste.lisp.org/ that provide a simple way to paste some text. Usually no login is required and it takes only seconds to do. Developers will generate a diff, paste it onto one of these sites, and then ask for a review over IRC or another chat system. There are even command line tools such as 'svn diff | lpaste' to make pasting code straight onto the website even easier. Pasting, while cheap and easy, has drawbacks, such as requiring the other person to be there right then, and the paste is often missing full context, such as what was just above the diff. Because pasting is usually done in conjunction with chat, the paste sites don't provide a way to add comments to a patch.

Pros: Very easy to do. Can be done with someone who is remote. As it is just text, it is not constrained by any system or process. Also sometimes used as a quick way to send patches or other files rather than e-mail. Because you are also chatting via IRC, you know the patch is being reviewed right there and then, versus e-mail, which could take a week.

Cons: The patch is frequently only read and almost never applied to a working repository to test whether it even compiles. This is just another tool that has absolutely no integration with the other tools you use, such as the source repository. Feedback is always given through another channel such as IRC. It requires live interaction between both parties and requires manually pushing the diff up to the website in the first place to get a review. Many paste sites expire pastes after a month, and sometimes developers set the expiration to one day, causing frustration the next day when the reviewer tries to access it.

WebTool

In the past year or two several code review website tools have popped up. http://www.review-board.org/ and Rietveld are the ones that have gotten some blog press. The review tools all try to improve upon the paste idea by managing patches. You can add new patches, request reviews from developers, add comments to patches, and update the patch. When I first heard of these tools I thought they were fantastic and played around with them, but ended up not using them for very long. Fundamentally they require a lot of work on the part of the developer. For every patch there is a lot of manual work that needs to be done. Even when the patch is approved you still have to submit the patch and then mark it as committed in the tool. Not to mention you need yet another way to watch for updates to your patches and for patches you need to review.

When testing out the systems I only ever used one patch, but in production I had many patches, some of them touching the same file. This caused problems: as some patches were approved, other patches got stale and the tool would get confused. Fundamentally, a review system needs to have the foundations of a revision control system to work properly and needs to be closely tied into the revision control system that is used. Most of the review sites out there are little more than paste projects that got scaled up, and they will let you down. To further illustrate this: you can patch your patches. This causes the review system to keep what is effectively a branch on top of the current tree that you can view. This will eventually break by not behaving like your real revision control system does. The most common case I saw was where I wrote patch A, someone else submitted B, and then I updated A and uploaded it. Rietveld would freak out and break. And because the review systems are so big they eventually go down for a day here or there. All of your patches are up on the system, so when the review system goes down you can't do anything either. Just as pasting got popular because it was so very easy, a web tool probably won't get very popular because it is just too much work.

Are there any better options?

So after that whirlwind tour of what is currently in use out there, I want to present something I have been using for the past few months that takes advantage of distributed revision control and a little known feature of GitHub.

One behavior I have witnessed is that once a developer pastes a patch they won't do anything until they get a review. They won't commit until they get a review, and they don't want to work on the next task until they commit the first. A nice catch-22. Using a distributed revision control system such as Git (or any system where branches are cheap), a developer can commit to a branch, post the patch for review, and continue hacking in another branch, giving the reviewer(s) as much time as they want to review the code. Patches sitting in their own branch can be worked on and revised for as long as needed, without the pressure to merge them 'right now' that often appears when a system does not use branches.

So just switching to Git solves one annoying code review problem, but can it do more? Several months ago I started using GitHub for hosting Arora. Every developer has a fork of the main repository (including me). When a branch (often just one change) is ready it is pushed up to GitHub and a pull request is made. When I get a pull request I can view the patch which is the same as e-mail pass-around and paster, but because the patch is in the source repository I can do all of the following very cheaply:

- Review the patch in the context of the entire file. I can glance at just a patch, view the entire file, its history, other branches, and well anything I want because I have the full repository right there.
- Review the changelog entry to make sure it is following the project format, spelling, etc.
- Checkout the code and try it myself. The number of times I have actually downloaded and applied a patch from a paster is very small, maybe twice a year. But with Arora, almost every time I have fetched the repository, built the branch, played with the new feature / bug fix, run the autotests, etc. It is *very* cheap and easy. When I get a pull request from user foo, I run git fetch foo to grab their changes, and because the patch is in its own branch, compiling and running it is as easy as checking out their branch. No patching required.

Another nice aspect is that when viewing the change on GitHub you can click on any line number and add a comment right there. Add as many comments as you want to the different issues you spot. When the developer commits the next version you don't have to review the whole patch, you only have to review the patch to the patch to make sure that what you commented on was fixed. Because the branch is a real Git branch and not some files on a website review tool, it handles patches to patches, and upstream rebases, just fine and never has issues.

Unlike review-board there is no extra work involved to review code. I get code review features for free while doing my normal everyday committing and pushing. When I get a pull request that is perfectly fine I can just merge it and reply back with a "Thanks!". But if I want to go in depth with a review, leave comments, and have them upload a new version, I can. If I want multiple reviews of one of my changes I can easily get that with multiple pull requests (with a comment saying it is just for review, not merging). To get reviews of my code I already push up my branches, so it is nothing more than going to GitHub and clicking on the pull request button.

Reviewing with GitHub isn't perfect. I would like to have an RSS feed of all the comments I have left and another of all the comments others have left on my repository. Currently they are simply placed in your news feed. But what is in place today works well enough and is, in my opinion, far beyond the other code review options.

With distributed revision control systems and sites like GitHub or Gitorious code review is such a simple add-on to already proven workflows, it's essentially free.

Photo by Sebastian Bergmann under the CC license.
