Differences between revisions 1 and 24 (spanning 23 versions)
Revision 1 as of 2008-12-19 16:15:26
Size: 17612
Editor: EldZierau
Comment:
Revision 24 as of 2010-08-16 10:24:30
Size: 6707
Editor: localhost
Comment: converted to 1.6 markup
Deletions are marked like this. Additions are marked like this.
Line 1: Line 1:
~-[[Action(edit)]]-~ ## page was renamed from Process/Document Review
~-<<Action(edit)>>-~
Line 3: Line 4:
~-The Document Review process covers: [#DocReviewPurpose Purpose]-~, ~-[#DocReviewResponsible Responsible]-~, ~-[#DocReviewMethod Method]-~ (~-[#DocReviewPlanning Planning]-~, ~-[#DocReviewReviewSession Review Session]-~, ~-[#DocReviewFollowUp Follow-Up]-~), ~-[#DocReviewTime Time]-~, ~-[#DocReviewInput Input]-~, ~-[#DocReviewOutput Output]-~ ~-The Document Review process covers: [[#DocReviewPurpose|Purpose]]-~, ~-[[#DocReviewResponsible|Responsible]]-~, ~-[[#DocReviewMethod|Method]]-~ (~-[[#DocReviewPlanning|Planning]]-~, ~-[[#DocReviewReviewSession|Review Session]]-~, ~-[[#DocReviewFollowUp|Follow-Up]]-~), ~-[[#DocReviewTime|Time]]-~, ~-[[#DocReviewInput|Input]]-~, ~-[[#DocReviewOutput|Output]]-~
Line 5: Line 6:
[[Anchor(DocReviewPurpose)]]
'''~+Purpose+~'''[[BR]]
We use document reviews to improve correctness of documentation. 
<<Anchor(DocReviewPurpose)>>
'''~+Purpose+~'''<<BR>>
We use document reviews to improve correctness of documentation.
Line 9: Line 10:
[[Anchor(DocReviewResponsible)]]
'''~+Responsible+~'''[[BR]]
The Task Holder of the [:Process/Implementation WithTitle:implementation] task doing or correcting the document or script. The task holder is specified in the [:Development#CurrentIterationTaskOverview:current iteration task overview]
<<Anchor(DocReviewResponsible)>>
'''~+Responsible+~'''<<BR>>
This can either be
 * [[Process Role/Task Holder|Task Holder]] of the implementation task doing or correcting the document or script. Tasks are specified in the [[Development#CurrentIterationTaskOverview|current iteration task overview]].
 * [[Process Role/Module Owner|Module Owner]] for the Documentation module, for small corrections.
Line 13: Line 16:
[[Anchor(CodeReviewMethod)]] '''~+Method+~'''[[BR]] A code review consists of 2 or more persons that read the code and in a structured way identify changes that will improve the overall quality of the code. Unit tests are not usually included. Code review is the third phase of implementation (following unit test writing and implementation). It is normally done when the relevant part of the code is fully implemented, i.e. fulfills all the unit tests, has been sanity tested and documented. Our process for code review contains planning, review session and follow-up as described below. <<Anchor(DocReviewMethod)>>
'''~+Method+~'''<<BR>>
A document review consists of two or more persons who read the document/script/assignment and, in a structured way, identify changes that will improve it. Our process for document review contains planning, review session and follow-up as described below.
Line 15: Line 20:
[[Anchor(CodeReviewPlanning)]] '''''Planning''''' <<Anchor(DocReviewPlanning)>>
'''''Planning'''''
 1. Review participants are specified in the current [[Development#CurrentIterationTaskOverview|Iteration task list]]. Usually it is the implementor and another developer.
 1. The implementor specifies the document (parts) to be reviewed in a new row in the document review table on the current [[Development#CurrentReviews|Iteration review overview]] (the meanings of the different columns are also described at the end of the iteration review page):
   1. "Document": Link to issue review page for document named with identification of the document, e.g. http://netarchive.dk/suite/AssignmentDeploy1 - See example in [[Iteration36|Iteration 36]]. A new review is inserted in the top of the page in order always to see newest review text first.
   . If the document review page does not already exist, make a new Document Review page on the basis of the template ReviewDocumentPageTemplate:
     1. Create a new page named according to the document name as follows:
     . !DocumentReview/<Unique name for document> + "Review"
     . where each part of the name starts with upper case and continues in lower case (as !WikiWords), for example [[DocumentReview/NetarchiveSuiteInstallationManualReview]] for review of the [[Installation Manual 3.10|new version of the installation manual]]
     1. Copy the text from the template in edit mode
     1. Insert the template text into the new review page and adjust it
     1. If an old review page exists on another medium/wiki, then a link to it should be added.
   1. ~+`Version`+~: The SVN, CVS or date for the revision of the document/script to be reviewed.
   1. ~+`Parts/lines`+~: Specifies the parts/sections/lines of the document/script to review (if less than the whole file).
   1. ~+`Task`+~: The assignment or tracker issue that the document has been updated for, e.g. Bug 1512.
   1. ~+`Author(s)`+~: The person(s) who have made changes or additions to the document. Only initials are given, e.g. ELZI.
   1. ~+`Reviewer(s)`+~: The person(s) who have not been involved in the changes and who will participate in the review. If the task is in the [[Development#CurrentIterationTaskOverview|Iteration task list]], the reviewer is suggested with the task in this task list.
 1. The participants agree on a time to review; this date is noted under "Review date" in the document review entry on the [[Development#CurrentReviews|Iteration review overview]] page.
 1. Before the review time, each participant reads the document thoroughly and notes problems - big or small - that should be discussed. These can be written into the document review page (e.g. [[DocumentReview/NetarchiveSuiteDeveloperManualReview]]), or entered during the review as described below.
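The page-naming rule in the planning steps above can be sketched as a small helper. This is an illustrative sketch only: the function name and the word-splitting rule are assumptions, and in the actual process the pages are created by hand from the template.

```python
import re

def document_review_page_name(document_name):
    # Build a review page name per the stated convention:
    #   DocumentReview/<Unique name for document> + "Review"
    # Each word starts with upper case and continues in lower case,
    # run together as a WikiWord.
    words = re.split(r"[^A-Za-z0-9]+", document_name)
    wiki_word = "".join(w.capitalize() for w in words if w)
    return "DocumentReview/" + wiki_word + "Review"
```

For example, `document_review_page_name("Netarchive Suite Installation Manual")` yields the page name `DocumentReview/NetarchiveSuiteInstallationManualReview` used in the example above.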
Line 17: Line 40:
 1. Review participants are specified in the current [Iteration task list]. Usually it is the implementor and another developer.
 1. The implementor specifies the code to be reviewed in Crucible
  1. Log on to Crucible (http://kb-prod-udv-001.kb.dk:8060). Please refer to [:Guidelines/GettingCrucibleAccount:Crucible Guidelines for sign-up] if you have no account already.
  1. Go to your Crucible !DashBoard (using the link "My Crucible !DashBoard")
  1. Click on "Create new review" in the top right corner
  1. Select a proper title (e.g. Feature request Y, or Bug Y plus description). If it corresponds to a task in the current [Iteration task list], it should have the same title as is written there.
  1. Write relevant comments in the "Statement of Objects", for instance general comments on the review and a list of deleted file or class names that for obvious reasons cannot be included.
  1. Add the files included in the review. This can be done by selecting files in different changesets or in the repository????. Note that we do not review code in the test-branch.[[BR]] '''NOTE: You can only select files that you have committed yourself'''. [[BR]] There are the following workarounds if this is not the case:
   * If a person is on leave, you need an account with this person's subversion username; in this case the person's display name should be "On behalf of NNN"
   * If it is only a few files, you may make dummy edits and commit them before setting up the review.
  1. Add a revision comment to each of the files with the specific lines to be reviewed (or ALL LINES). Also note what happened here, if relevant, e.g. lines that have been removed (file/class name and revision of file are given automatically by Crucible)
  1. You notify the reviewer of the existence of a review by clicking "Start Review". [[BR]] '''Note''' that it is best to make the wiki review entry (as explained in the next step) before notification.
 1. Make an entry with review information for the review in the current wiki [Review Table] (see for example ????) ....
 1. The participants agree on a time to review.
 1. Before the review time, each participant reads the code thoroughly and notes problems - big or small - that should be discussed. Place the comments at the relevant places in Crucible. For instance:
  * Add a new general comment: [[BR]] For general problems like design problems or problems concerning most of the files like missing '.' in end of !JavaDoc.
  * Add a revision comment: [[BR]] For general problems in a specific file like generally missing !JavaDoc in this particular file.
  * On a specific line (marking the line will make a comment field appear): [[BR]] For problems on specific lines of a file like lack of white space around delimiters. [[BR]] '''REMEMBER to post the comments''' by clicking "Post" for each of the comments.

[[Anchor(CodeReviewReviewSession)]]
'''''Review Session '''''[[BR]]
<<Anchor(DocReviewReviewSession)>>
'''''Review Session '''''<<BR>>
Line 43: Line 47:
 1. Before starting, check that
  1. Code has been unit tested
  1. Code has been sanity tested
  1. Functionality has been documented in manuals
  1. If any of these are missing, then the review should be postponed.
 1. Use Crucible to go through the review
  1. Log on to Crucible - preferable both reviewers.
  1. Discuss each posted comment in order of appearance.
   * General comments
   * Revision comments
   * Specific line comments
  1. Those items that the participants don't agree to discard are marked by clicking on the "Defect" box, which enables selection of a rank in the "Select rank" drop-down list. When '''Defect''' and '''Rank''' are specified, the item is posted by clicking "Post".
  1. '''Note''' that only the author of a comment can post it. If only one of the reviewers has access to Crucible, the non-owned comments must be copied into new comments that can be posted with the mentioned information.
  1. Note the time used for the task in a Crucible General comment for the review using the following wording: [[BR]] `Time use (Coding,Documentation,Review)` [[BR]] `<InitialsOf1stReviewer>: <NoOfManDaysUsed>` [[BR]] `<InitialsOf2ndReviewer>: <NoOfManDaysUsed>` [[BR]] '''Remember''' to set the selection of rank in the "Select rank" drop-down list to "Time Used". This is used for the [Iteration review] made in the end of the iteration.
  1. '''Remember''' to mark the General comment as a defect and post it - otherwise this information will not be passed to the wiki afterwards.
  1. Complete the review by clicking "Complete"
 1. Agree on who is doing follow-up in case flaws are found during code review. Usually, this will be the implementor.
 1. Update the table in the current [Review Table] with
   * '''Review date'''
   * '''Issues found''' - using the [http://kb-prod-udv-001.kb.dk:8060/plugins/servlet/export Crucible export function] and inserting the result on page IssuesFromNsXX, where XX is the number in the Crucible review id.
   * '''Follow-up''' - initials of person decided to do the follow-up.
 1. Note ~+`Time use (Documentation,Review)`+~ in the header, given in number of person-days used.
 1. Discuss each comment in order of appearance in the document. Those items that the participants don't agree to discard are marked with ~+`status`+~ "rejected".
 1. Fill in ~+`severity`+~; in case this is "major" or "showstopper", a tracker issue must be reported as well.
 1. Agree on who is doing follow-up in case flaws are found during the document review. Usually, this will be the implementor.
 1. Update the table in the current [[Development#CurrentReviews|Iteration review overview]] in the entry for the actual review:
  * '''Review date'''
  * '''Follow-up''' - initials of person decided to do the follow-up.
Line 65: Line 55:
[[Anchor(CodeReviewFollowUp)]] <<Anchor(DocReviewFollowUp)>>
Line 67: Line 57:
 1. The follow-up person goes through the list of items and handles each of them. Depending on how an item is handled, the item is marked under Status on the [link Code Review Class Page paragraph].
 1. The follow-up person marks the file as fully reviewed on the [link review page] once all items have been handled.
 1. The follow-up person goes through the list of items and handles each of them. Depending on how an item is handled, the item is marked under Status on the document review page.
 1. The follow-up person marks the file as fully reviewed on the [[Development#CurrentReviews|Iteration review overview]] in the entry for the actual review, once all items have been handled.
Line 71: Line 61:
[[Anchor(CodeReviewReviewPages)]]
==== Review Pages (technical information) ====
There are two kinds of review pages:
<<Anchor(DocReviewTime)>>
'''~+Time+~'''<<BR>>
The input must be reviewed as soon as possible after the actual update or creation of the documentation.
Line 75: Line 65:
 * Code Review Page per Class
 . that contains all reviews made on the class
 * Code Review Overview per Iteration
 . that contains an overview of code reviews made within an iteration
[[Anchor(CodeReviewPagePerPage)]]
<<Anchor(DocReviewInput)>>
'''~+Input+~'''<<BR>>
Usually the input is the implementation of a Tracker Issue for documentation, but it could also be a script or an assignment.
Line 81: Line 69:
===== Code Review Page per Class/JSP-page =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.
<<Anchor(DocReviewOutput)>>
'''~+Output+~'''<<BR>>
Reviewed and followed-up input, ready for release test or release.
Line 84: Line 73:
Each class/JSP-page has its own page with all code reviews and their documentation made on the specific class/JSP-page.

[[Anchor(NameCodeReviewPage)]]

===== Name of Code Review Class/JSP page =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented.

Each code review page is named according to the code's position in the Java project.

For classes the name is formed from the class and package name as follows:

 . <Unique package name for class> + <Class name> + "Review"
where each part of the name starts with upper case and continues in lower case (as !WikiWords), for example

 . CommonDistributeChannelsReview
 . for dk.netarkivet.common.distribute.channels.java (under /trunk/src/)
For JSP pages the name is formed from the JSP-page group and the JSP page name as follows:

 . <Unique group for JSP-page> + <JSP-page name> + "JSPReview"
where each part of the name starts with upper case and continues in lower case - and "-" is skipped, with the letter after "-" also written in upper case (as !WikiWords), for example

 . HistoryHarveststatusJobdetailsJSPReview
 . for History/!HarvestStatus-jobdetails.jsp (under /trunk/webpages/)
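The two naming rules above can be sketched as small helpers. This is an illustrative sketch, not project tooling: stripping the `dk.netarkivet` prefix to obtain the "unique package name" is an assumption inferred from the example, and the function names are hypothetical.

```python
def class_review_page_name(package):
    # Review page name for a Java package,
    # e.g. dk.netarkivet.common.distribute.channels -> CommonDistributeChannelsReview
    parts = package.split(".")
    # Assumption: the common "dk.netarkivet" prefix is dropped (inferred from the example).
    if parts[:2] == ["dk", "netarkivet"]:
        parts = parts[2:]
    # Each part starts with upper case and continues in lower case (WikiWords).
    return "".join(p.capitalize() for p in parts) + "Review"


def jsp_review_page_name(path):
    # Review page name for a JSP page,
    # e.g. History/HarvestStatus-jobdetails.jsp -> HistoryHarveststatusJobdetailsJSPReview
    group, page = path.split("/")
    if page.endswith(".jsp"):
        page = page[: -len(".jsp")]
    # "-" is skipped; each resulting piece is capitalised like any other part.
    pieces = [group] + page.split("-")
    return "".join(p.capitalize() for p in pieces) + "JSPReview"
```

Both example names from the text above are reproduced by these helpers.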
[[Anchor(TablesFilledPageReview)]]

===== Code Review Tables filled for each review of a class/JSP-page =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.

The below tables (from the [:CodeReviewCodePageTemplate:template]) keep the information for each review of a class/JSP-page (or parts of one). If a class/JSP-page is reviewed more than once, new sections like this get added at the top of the same page. Storing the old reviews with task, date, SVN version and lines has proven useful for tracking down problematic changes and misunderstood designs.

 . [[Include(CodeReviewCodePageTemplate)]]
An example is CommonDistributeChannelsReview.

[[Anchor(CreationCodeReviewCodePage)]]

===== Creation of New Code Review Class/JSP Page =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.

You must use the CodeReviewCodePageTemplate (Code Review Class/JSP Page Template) to create a new page.

 * Create a new page named as described above
 * Copy the text from the template in edit mode
 * Insert the template text into the new review page and adjust it
If an old review page exists on another medium, then a link to it should be added.

[[Anchor(UpdateCodeReviewPage)]]

===== Update of Existing Code Review Class/JSP Page =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.

If the Code Review Class Page already exists, then the tables for a new review are inserted at the top of the page so that the newest review text is always seen first.

The page may contain a link to old review pages which are placed on another medium and therefore not readable for all.

[[Anchor(CodeReviewOverview)]]

===== Code Review Overview per Iteration =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.

Each iteration has its own page with an overview of code reviews, the author of changes and who the reviewer is.

[[Anchor(NameCodeReviewOverview)]]

===== Name of Iteration Code Review Overview =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented.

Each Iteration review overview page is named according to the Iteration name.

The name is formed from the iteration number as follows:

 . "Iteration" + <Iteration number> + "!ReviewsOverview"
for example

 . Iteration33ReviewsOverview
 . for the review overview in Iteration 33
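The overview-page naming rule above is simple enough to express in one line; as before, this is only an illustrative sketch and the function name is hypothetical.

```python
def iteration_overview_page_name(iteration_number):
    # "Iteration" + <Iteration number> + "ReviewsOverview"
    return f"Iteration{iteration_number}ReviewsOverview"
```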
[[Anchor(TableFilledReviewsOverview)]]

===== Code Review Overview Table Filled for each Iteration =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.

The below table (from the [:CodeReviewOverviewPageTemplate:template]) keeps information on the reviews made on a class/JSP-page within an Iteration.

 . [[Include(CodeReviewOverviewPageTemplate)]]
An example is Iteration33ReviewsOverview.

[[Anchor(CreationCodeReviewOverviewPage)]]

===== Creation of New Iteration Code Review Overview Page =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.

You must use the CodeReviewOverviewPageTemplate (Code Review Iteration Page Template) to create a new page.

 * Create a new page named as described above
 * Copy the text from the template in edit mode
 * Insert the template text into the new review page and adjust it
[[Anchor(UpdateCodeReviewOverviewPage)]]

===== Update of Existing Iteration Code Review Overview Page =====
Note this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented. [[BR]] The contents can be seen as guidance on what information should be captured in the review.

For each class/JSP-page to be reviewed, a table line describing it must be added.

Note that the same class/JSP-page may appear several times.

[[Anchor(CodeReviewTime)]] '''~+Time+~'''[[BR]] The input must be reviewed as soon as possible after the actual implementation. In case of changes in code, it cannot be passed to quality assurance (before release test) before it has been reviewed.

[[Anchor(CodeReviewInput)]] '''~+Input+~'''[[BR]] Usually the input is code implementing a Tracker Issue which has been

 * unit tested
 * sanity tested
 * documented in manuals
but depending on the implemented task, it can also be corrected documentation, scripts etc.

[[Anchor(CodeReviewOutput)]] '''~+Output+~'''[[BR]] Reviewed and followed-up input, ready for quality assurance before it can be marked as ready for release test.

[[Anchor(CodeReviewBackground)]] '''~+Background for Code Review process+~'''[[BR]] The code review process was inspired by [http://satc.gsfc.nasa.gov/fi/fipage.html NASA's ideas for code inspection]. The process has, however, been simplified in order to ease the transition to inspection. As the project group gains experience with inspection, it is recommended that the inspection process be refined. The description focuses on code inspection.

[[Anchor(CodeReviewResourceUsage)]] '''~+Resource Usage+~'''[[BR]] Code review takes time, of course. The actual time spent discussing the code is typically roughly the same as is spent going over the code beforehand. Follow-up can take a varying amount of time, depending on the starting quality of the code and whether significant changes have been found necessary. Some kinds of code take longer to review than others; for instance, straightforward getter-and-setter style classes go very fast, while a review of a few lines of change in a complex method can take much longer. In the start of the !NetarchiveSuite project, we kept track of the time spent preparing for and executing the review (but not doing the follow-up changes to the code). The ratio of preparation time to review time varied, but there was never more than a factor of 2 difference to either side; on average the two were about the same. The number of lines of code reviewed per hour (LoC/h) varied from 88 to 300, with an average value of about 170 LoC/h. Later code review times were not recorded, but are likely to be slightly faster due to a better system for taking notes.

[[Anchor(CodeReviewLiterature)]] '''~+Literature+~'''[[BR]]

 * Steve !McConnell, Rapid Development, pages 73-74
 * [http://satc.gsfc.nasa.gov/fi/fipage.html NASA's ideas for code inspection]
<<Anchor(CodeReviewBackground)>>
'''~+Background for Document Review process+~'''<<BR>>
The document review process is inspired by the [[Process/Code Review|code review]] process.


The Document Review process covers: Purpose, Responsible, Method (Planning, Review Session, Follow-Up), Time, Input, Output

Purpose
We use document reviews to improve correctness of documentation.

Responsible
This can either be

    • Task Holder of the implementation task doing or correcting the document or script. Tasks are specified in the current iteration task overview.

    • Module Owner for the Documentation module, for small corrections.

Method
A document review consists of two or more persons who read the document/script/assignment and, in a structured way, identify changes that will improve it. Our process for document review contains planning, review session and follow-up as described below.

Planning

  1. Review participants are specified in the current Iteration task list. Usually it is the implementor and another developer.

  2. The implementor specifies the document (parts) to be reviewed in a new row in the document review table on the current Iteration review overview (the meanings of the different columns are also described at the end of the iteration review page):

    1. "Document": Link to issue review page for document named with identification of the document, e.g. http://netarchive.dk/suite/AssignmentDeploy1 - See example in Iteration 36. A new review is inserted in the top of the page in order always to see newest review text first.

    2. If the document review page does not already exist, make a new Document Review page on the basis of the template ReviewDocumentPageTemplate:

      1. Create a new page named according to the document name as follows:
         DocumentReview/<Unique name for document> + "Review"
         where each part of the name starts with upper case and continues in lower case (as WikiWords), for example DocumentReview/NetarchiveSuiteInstallationManualReview for review of the new version of the installation manual

      2. Copy the text from the template in edit mode
      3. Insert the template text into the new review page and adjust it
      4. If an old review page exists on another medium/wiki, then a link to it should be added.
    3. Version: The SVN, CVS or date for the revision of the document/script to be reviewed.

    4. Parts/lines: Specifies the parts/sections/lines of the document/script to review (if less than the whole file).

    5. Task: The assignment or tracker issue that the document has been updated for, e.g. Bug 1512.

    6. Author(s): The person(s) who have made changes or additions to the document. Only initials are given, e.g. ELZI.

    7. Reviewer(s): The person(s) who have not been involved in the changes and who will participate in the review. If the task is in the Iteration task list, the reviewer is suggested with the task in this task list.

  3. The participants agree on a time to review; this date is noted under "Review date" in the document review entry on the Iteration review overview page.

  4. Before the review time, each participant reads the document thoroughly and notes problems - big or small - that should be discussed. These can be written into the document review page (e.g. DocumentReview/NetarchiveSuiteDeveloperManualReview), or entered during the review as described below.

Review Session
This part, while central to the whole process, should not be allowed to drag on forever. If the reviewers cannot agree on how to fix a problem within a few minutes, the item should be marked as "consider how to..." rather than prolonging the discussion.

A typical review session should take no more than an hour (and most take less than that). If it takes longer, the review should be stopped and a time to continue should be agreed upon. More than an hour of straight review reduces the efficiency.

  1. The participants meet on the phone (a physical meeting only if possible)
  2. Note Time use (Documentation, Review) in the header, given in number of person-days used.

  3. Discuss each comment in order of appearance in the document. Those items that the participants don't agree to discard are marked with status "rejected".

  4. Fill in severity; in case this is "major" or "showstopper", a tracker issue must be reported as well.

  5. Agree on who is doing follow-up in case flaws are found during the document review. Usually, this will be the implementor.
  6. Update the table in the current Iteration review overview in the entry for the actual review:

    • Review date

    • Follow-up - initials of person decided to do the follow-up.

Follow-up

  1. The follow-up person goes through the list of items and handles each of them. Depending on how an item is handled, the item is marked under Status on the document review page.
  2. The follow-up person marks the file as fully reviewed on the Iteration review overview in the entry for the actual review, once all items have been handled.

  3. If the implementor feels the changes are significant enough to require a new review, another review cycle starts. The first review is left as-is. This rarely happens, and should only happen when design issues have been identified and resolved during the review process.

Time
The input must be reviewed as soon as possible after the actual update or creation of the documentation.

Input
Usually the input is the implementation of a Tracker Issue for documentation, but it could also be a script or an assignment.

Output
Reviewed and followed-up input, ready for release test or release.

Background for Document Review process
The document review process is inspired by the code review process.

Process/Document Review WithoutTitle (last edited 2010-08-16 10:24:30 by localhost)