
The Code Review process covers:

Anchor(CodeReviewPurpose) PurposeBR We use code reviews to improve the correctness and stability of our code. The main purposes of code reviews are traditionally:

Another equally important effect of code reviews is

Anchor(CodeReviewResponsible) ResponsibleBR The task holder of the [:Process/Implementation WithTitle:implementation] task, which can for example be the [:Process Role/Module Owner WithTitle:Module Owner] for small corrections, or another assigned developer.

Anchor(CodeReviewMethod) MethodBR A code review consists of two or more persons who read the code and, in a structured way, identify changes that will improve the overall quality of the code. Unit tests are usually not included. Code review is the third phase of implementation (following unit test writing and implementation). It is normally done when the relevant part of the code is fully implemented, i.e. fulfills all the unit tests, has been sanity tested and documented. Our process for code review contains planning, a review session and follow-up, as described below.

Anchor(CodeReviewPlanning) Planning

  1. Review participants are specified in the current [Iteration task list]. Usually it is the implementor and another developer.
  2. The implementor specifies the code to be reviewed in Crucible:
    1. Log on to Crucible (http://kb-prod-udv-001.kb.dk:8060). Please refer to the [:Guidelines/GettingCrucibleAccount:Crucible Guidelines for sign-up] if you do not already have an account.

    2. Go to your Crucible DashBoard (using the link "My Crucible DashBoard")

    3. Click on "Create new review" in the top right corner
    4. Select a proper title (e.g. Feature request Y or Bug Y plus a description). If it corresponds to a task in the current [Iteration task list], it should have the same title as is written there.
    5. Write relevant comments in the "Statement of Objectives", for instance general comments on the review and a list of deleted file or class names that for obvious reasons cannot be included.
    6. Add the files included in the review. This can be done by selecting files in different changesets or in the repository????. Note that we do not review code in the test branch. BR NOTE: You can only select files that you have committed yourself. BR There are the following workarounds if this is not the case:

      • If a person is on leave, you need an account with this person's Subversion username; in this case the person's display name should be "On behalf of NNN"
      • If it is only a few files, you may make dummy edits and commit them before setting up the review.
    7. Add a revision comment to each of the files, specifying the lines to be reviewed (or ALL LINES). Also note what happened, if relevant, e.g. lines that have been removed (file/class name and file revision are given automatically by Crucible).
    8. You notify the reviewer of the existence of a review by clicking "Start Review". BR Note that it is best to make the wiki review entry (as explained in the next step) before notification.

  3. Make an entry with review information for the review in the current [Review Table] in the wiki (see for example ????).
  4. The participants agree on a time to review.
  5. Before the review time, each participant reads the code thoroughly and notes problems - big or small - that should be discussed. Place the comments at the relevant places in Crucible. For instance:
    • Add a new general comment: BR For general problems, like design problems or problems concerning most of the files, such as a missing '.' at the end of JavaDoc.

    • Add a revision comment: BR For general problems in a specific file like generally missing JavaDoc in this particular file.

    • On a specific line (marking a line will make a comment field appear): BR For problems on specific lines of a file, like lack of white space around delimiters.

    • REMEMBER to post the comments by clicking "Post" for each of the comments.

Anchor(CodeReviewReviewSession) Review Session BR This part, while central to the whole process, should not be allowed to drag on forever. If the reviewers cannot agree on how to fix a problem within a few minutes, the item should be marked as "consider how to..." rather than prolonging the discussion.

A typical review session should take no more than an hour (and most take less than that). If it takes longer, the review should be stopped and a time to continue should be agreed upon. More than an hour of straight review reduces the efficiency.

  1. The participants meet on the phone (or in a physical meeting, if possible)

  2. Before starting, check that:
    1. Code has been unit tested
    2. Code has been sanity tested
    3. Functionality has been documented in manuals
    4. If any of these are missing, the review should be postponed.
  3. Use Crucible to go through the review
    1. Log on to Crucible - preferably both reviewers.
    2. Discuss each posted comment in order of appearance.
      • General comments
      • Revision comments
      • Specific line comments
    3. Those items that the participants do not agree to discard are marked by clicking on the "Defect" box, which enables selection of a rank in the "Select rank" drop-down list. When Defect and Rank are specified, the item is posted by clicking "Post".

    4. Note that only the author of a comment can post it. If only one of the reviewers has access to Crucible, the non-owned comments must be copied into new comments that can be posted with the mentioned information.

    5. Note the time used for the task in the Crucible general comments for the review, using the following wording: BR Time use (Coding, Documentation, Review) BR <InitialsOf1stReviewer>: <NoOfManDaysUsed> [[BR]] <InitialsOf2ndReviewer>: <NoOfManDaysUsed> BR This is used for the [Iteration review] made at the end of the iteration.

    6. Remember to mark the General comment as a defect and post it - otherwise this information will not be passed on to the wiki afterwards.

    7. Complete the review by clicking "Complete".
  4. Update the table in the current [Review Table]

If flaws are found during code review, a person must be chosen at the end of the review to follow up on the flaws found. Usually, this will be the implementor. The choice must be noted in the current [Review Table] under the column "Follow-up" for the line of the review.

Anchor(CodeReviewFollowUp) Follow-up

  1. The follow-up person goes through the list of items and handles each of them. Depending on how an item is handled, the item is marked under Status on the [link Code Review Class Page paragraph].

  2. The follow-up person marks the file as fully reviewed on the [link review page] once all items have been handled.
  3. If the implementor feels the changes are significant enough to require a new review, another review cycle starts. The first review is left as-is. This rarely happens, and should only happen when design issues have been identified and resolved during the review process.

Anchor(CodeReviewReviewPages)

Review Pages (technical information)

There are two kinds of review pages:

Anchor(CodeReviewPagePerPage)

Code Review Page per Class/JSP-page

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

Each class/JSP-page has its own page containing all code reviews of that specific class/JSP-page and their documentation.

Anchor(NameCodeReviewPage)

Name of Code Review Class/JSP page

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented.

Each code review page is named according to the code's position in the Java project.

For classes the name is formed from the class and package name as follows

Where each part of the name starts with upper case and continues in lower case (as WikiWords), for example
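The class-page naming rule above can be sketched as code. Since the exact template is not spelled out on this page, the sketch assumes the pattern seen in the example page CommonDistributeChannelsReview: the last part of the package name as a WikiWord, then the class name, then the suffix "Review". The package and class names below are illustrative assumptions.

```python
def review_page_name(package: str, class_name: str) -> str:
    """Build the wiki page name for a class review page.

    Assumes the convention: last package part capitalized as a
    WikiWord, followed by the class name and the suffix "Review".
    The exact rule may differ in your setup.
    """
    module = package.split(".")[-1]
    return module.capitalize() + class_name + "Review"

# Hypothetical example: class DistributeChannels in package
# dk.netarkivet.common
print(review_page_name("dk.netarkivet.common", "DistributeChannels"))
# -> CommonDistributeChannelsReview
```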

For JSP pages the name is formed from the JSP-page group and the JSP page name as follows

Where each part of the name starts with upper case and continues in lower case, and "-" is skipped, with the letter after "-" written in upper case too (as WikiWords), for example
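The JSP-page naming rule can be sketched the same way. The group name, page name and ".jsp" extension handling below are illustrative assumptions; only the WikiWord and "-" rules come from the text above.

```python
def jsp_review_page_name(group: str, page: str) -> str:
    """Build the wiki page name for a JSP-page review page.

    Assumes the convention: group and page names each turned into
    WikiWords, "-" removed with the following letter upper-cased,
    and a trailing ".jsp" extension dropped. The "Review" suffix
    mirrors the class-page convention and is an assumption here.
    """
    def wikiword(s: str) -> str:
        # Split on "-" and capitalize the first letter of each part.
        return "".join(p[:1].upper() + p[1:] for p in s.split("-"))

    if page.endswith(".jsp"):
        page = page[: -len(".jsp")]
    return wikiword(group) + wikiword(page) + "Review"

# Hypothetical example: page show-jobs.jsp in group status
print(jsp_review_page_name("status", "show-jobs.jsp"))
# -> StatusShowJobsReview
```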

Anchor(TablesFilledPageReview)

Code Review Tables filled for each review of a class/JSP-page

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

The below tables (from [:CodeReviewCodePageTemplate:template]) keep the information for each review of a class/JSP-page (or parts of one). If a class/JSP-page is reviewed more than once, new sections like this are added at the top of the same page. Storing the old reviews with task, date, SVN version and lines has proven useful for tracking down problematic changes and misunderstood designs.

Example is CommonDistributeChannelsReview

Anchor(CreationCodeReviewCodePage)

Creation of New Code Review Class/JSP Page

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

You must use the CodeReviewCodePageTemplate (Code Review Class/JSP Page Template) to create a new page.

If an old review page exists on other media, then a link to it should be referenced.

Anchor(UpdateCodeReviewPage)

Update of Existing Code Review Class/JSP Page

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

If the Code Review Class Page already exists, then the tables for a new review are inserted at the top of the page, so that the newest review text is always seen first.

The page may contain a link to old review pages which are placed on other media and therefore not readable for all.

Anchor(CodeReviewOverview)

Code Review Overview per Iteration

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

Each iteration has its own page with an overview of code reviews, author of changes and who the reviewer is.

Anchor(NameCodeReviewOverview)

Name of Iteration Code Review Overview

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

Each Iteration review overview page is named according to the Iteration name.

The name is formed from the iteration number as follows

for example
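The iteration naming rule can be sketched as a one-liner; this sketch assumes the pattern seen in the example page Iteration33ReviewsOverview (the word "Iteration", the iteration number, then "ReviewsOverview"):

```python
def iteration_overview_page_name(iteration: int) -> str:
    """Build the wiki page name for an iteration's review overview
    page, assuming the Iteration<N>ReviewsOverview pattern."""
    return f"Iteration{iteration}ReviewsOverview"

print(iteration_overview_page_name(33))
# -> Iteration33ReviewsOverview
```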

Anchor(TableFilledReviewsOverview)

Code Review Overview Table Filled for each Iteration

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

The below table (from [:CodeReviewOverviewPageTemplate:template]) keeps information of reviews made on a class/JSP-page within an Iteration.

Example is Iteration33ReviewsOverview

Anchor(CreationCodeReviewOverviewPage)

Creation of New Iteration Code Review Overview Page

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

You must use the CodeReviewOverviewPageTemplate (Code Review Iteration Page Template) to create a new page.

Anchor(UpdateCodeReviewOverviewPage)

Update of Existing Iteration Code Review Overview Page

Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. BR The contents can be seen as guidance on what information should be recorded through the review.

For each class/JSP-page to be reviewed, a table line describing it must be added.

Note that the same class/JSP-page may appear several times.

Anchor(CodeReviewTime) TimeBR The input must be reviewed as soon after actual implementation as possible. In case of changes in code, it cannot be passed to quality assurance (before release test) before it has been reviewed.

Anchor(CodeReviewInput) InputBR Usually the input is code implementing a Tracker Issue which has been

but depending on the implemented task, it can also be corrected documentation, scripts, etc.

Anchor(CodeReviewOutput) OutputBR Reviewed and followed-up input, ready for quality assurance before it can be marked as ready for release test.

Anchor(CodeReviewBackground) Background for Code Review processBR The code review process was inspired by [http://satc.gsfc.nasa.gov/fi/fipage.html:NASA's ideas for code inspection]. The process has, however, been simplified in order to ease the transition to inspection. As the project group gains experience with inspection, it is recommended that the inspection process be refined. The description focuses on code inspection.

Anchor(CodeReviewResourceUsage) Resource UsageBR Code review takes time, of course. The actual time spent discussing the code is typically roughly the same as is spent going over the code beforehand. Follow-up can take a varying amount of time, depending on the starting quality of the code and whether significant changes have been found necessary. Some kinds of code take longer to review than others; for instance, straightforward getter-and-setter style classes go very fast, while a review of a few lines of change in a complex method can take much longer. At the start of the NetarchiveSuite project, we kept track of the time spent preparing for and executing the review (but not doing the follow-up changes to the code). The ratio of preparation time to review time varied, but there was never more than a factor of 2 difference to either side; on average the two were about the same. The number of lines of code reviewed per hour (LoC/h) varied from 88 to 300, with an average of about 170 LoC/h. Later code review times were not recorded, but they are likely to be slightly faster due to a better system for taking notes.

Anchor(CodeReviewLiterature) LiteratureBR