Wednesday, September 5, 2012

The Review Process

This post is a continuation of my earlier blog about software review: http://soa-java.blogspot.nl/2012/04/software-review.html



Review Process checklist
• Determine when to do the review in the software process e.g.
    * code review after finishing the code implementation before delivering to QA/test team (waterfall)
    * requirement/product backlog review before sprint planning (Scrum)
    * design review before starting coding (well... even with an Agile process you need to start with some kind of design in mind that you can discuss with your peer reviewers).
• Define the process/how review will be conducted e.g.
    * code reading (most effective)
    * formal review meeting (less effective)
    * informal walk-through (less effective)
    * customer demo (mainly for functional requirements).
• Determine entry criteria (e.g. specification documents are available)
• Determine exit criteria (e.g. approval by product owner & SOA governance board)
• Determine metrics (e.g. LOC/hour, time spent, error list with severity & type)
• Are tools available to assist review process (e.g. checkstyle, PMD, spelling checker, xml/html validator, test suites)?
• Determine communication channel (e.g. Trac wiki, bugzilla)
• Determine who will play the reviewer role, e.g. architects, security specialist, external auditor, customer. You may have several reviewers assigned to specific areas (e.g. security specialist, database specialist, customer to review use cases).
• Determine the time needed for the review (based on code complexity/size/maturity, programmers' skills, risk analysis). Discuss the time/plan with the project manager / team lead to obtain management support. Schedule the meetings. Set time limits for meetings & other review work.
• Do the review, register the anomalies.
• Discuss whether or not a fix is needed.
• Discuss the fix: decide in which version the fix should be made, who will do the rework, and estimate/plan the rework.
• Determine the exit decision, e.g. re-inspection after the required rework, or minor rework with no further verification needed.
• Schedule the follow-up / re-inspection for the rework.
• Collect "lessons learned" to improve the development & review process, for example in a company wiki knowledge repository.
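The metrics step above (LOC/hour, time spent, errors by severity) can be sketched as a small helper. This is a minimal illustration in Python; the field names and severity levels are assumptions for the example, not part of any standard:

```python
from collections import Counter

def review_metrics(loc_reviewed, hours_spent, anomalies):
    """Compute simple review metrics: review rate (LOC/hour)
    and an anomaly count per severity."""
    rate = loc_reviewed / hours_spent if hours_spent else 0.0
    by_severity = Counter(a["severity"] for a in anomalies)
    return {
        "loc_per_hour": rate,
        "hours_spent": hours_spent,
        "anomalies_by_severity": dict(by_severity),
    }

anomalies = [
    {"id": 1, "severity": "major", "type": "security"},
    {"id": 2, "severity": "minor", "type": "style"},
    {"id": 3, "severity": "major", "type": "performance"},
]
metrics = review_metrics(loc_reviewed=400, hours_spent=2, anomalies=anomalies)
print(metrics["loc_per_hour"])           # 200.0
print(metrics["anomalies_by_severity"])  # {'major': 2, 'minor': 1}
```

Tracking these numbers over several reviews helps calibrate how much time to plan per review.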

Review Outputs
List of anomalies with severity/risk and types (e.g. missing functional requirement, doesn't conform to standards/guidelines, security, performance, etc.). The list is documented, for example in Trac/Bugzilla, and made available to the developer team, QA team, product owner and management.
• List of actions for each anomaly (e.g. don't fix, fix in a future release, fix immediately in the current Scrum sprint), who will implement the fix, and when the follow-up will take place.
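A review finding with its action, assignee and follow-up could be registered as a record like the following. This is only a sketch of the shape of such an entry; the field names and allowed values are illustrative assumptions, not a Trac or Bugzilla API:

```python
from dataclasses import dataclass

@dataclass
class Anomaly:
    """One review finding, as it might be registered in an issue tracker."""
    summary: str
    severity: str           # e.g. "minor", "major", "critical"
    anomaly_type: str       # e.g. "security", "performance", "standards"
    action: str = "triage"  # e.g. "don't fix", "fix next release", "fix this sprint"
    assignee: str = ""
    follow_up: str = ""     # e.g. a sprint name or a date

finding = Anomaly(
    summary="SQL built by string concatenation",
    severity="major",
    anomaly_type="security",
    action="fix this sprint",
    assignee="dev-team",
    follow_up="sprint 12 review",
)
print(finding.action)  # fix this sprint
```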

Best practices
• Use a checklist, preferably shorter than one A4 page.
• The process can be enforced by software infrastructure (e.g. a deployment script, the TracWorkflowAdmin plugin).
• The reviewer must be someone other than the author of the requirement / design / code.
• Limit the review time; make a clear agreement/plan between developers, reviewers, project manager and customer. Waiting for a review shouldn't become an excuse to block the progress of the project.
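The "enforce the process by infrastructure" idea above can be sketched as a gate that a deployment script might run before proceeding. The ticket format here is a made-up example (not a Trac/Bugzilla API); it also encodes the rule that the reviewer must not be the author:

```python
def review_gate(ticket):
    """Refuse to proceed unless the ticket carries a reviewer approval,
    and the approver is not the author of the work."""
    if not ticket.get("approved_by"):
        return (1, "blocked: no reviewer approval on ticket %s" % ticket["id"])
    if ticket["approved_by"] == ticket.get("author"):
        return (1, "blocked: author cannot approve their own work")
    return (0, "ok: approved by %s" % ticket["approved_by"])

status, msg = review_gate({"id": "PRJ-42", "author": "alice", "approved_by": "bob"})
print(status, msg)  # 0 ok: approved by bob
```

A deployment script would exit with the returned status code, so an unreviewed change simply cannot be deployed.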


Other tips
Read this article:
http://www.ibm.com/developerworks/rational/library/11-proven-practices-for-peer-review/
* don't review too much code at once
* take your time
* the quality of the code is improved if the author has done a self-review and annotated it (explaining and defending the rationale) before the review
* verify that the defects are actually fixed
* ego effect: the review process will motivate the author to be less sloppy even if you only review a small percentage of the code
* use automated review tools (e.g. Checkstyle in Eclipse for Java)
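To illustrate what such automated tools take off the reviewer's plate, here is a toy checker in the spirit of Checkstyle/PMD. The two rules are made-up examples of mechanical checks that are best left to machines so human reviewers can focus on design and correctness:

```python
def simple_checks(source, max_line_len=120):
    """Flag overly long lines and leftover TODO markers."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if len(line) > max_line_len:
            findings.append((lineno, "line longer than %d chars" % max_line_len))
        if "TODO" in line:
            findings.append((lineno, "unresolved TODO"))
    return findings

code = "x = 1\n# TODO: remove debug flag\n" + "y = " + "1 + " * 40 + "1\n"
for lineno, message in simple_checks(code):
    print("line %d: %s" % (lineno, message))
```

Real tools apply hundreds of such rules consistently on every commit, which no human reviewer can do.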
