The Document Review process covers: [#DocReviewPurpose Purpose], [#DocReviewResponsible Responsible], [#DocReviewMethod Method] ([#DocReviewPlanning Planning], [#DocReviewReviewSession Review Session], [#DocReviewFollowUp Follow-Up]), [#DocReviewTime Time], [#DocReviewInput Input], [#DocReviewOutput Output]
[[Anchor(DocReviewPurpose)]] '''~+Purpose+~'''[[BR]] We use document reviews to improve the correctness of documentation.
[[Anchor(DocReviewResponsible)]] '''~+Responsible+~'''[[BR]] This can either be
 * the [:Process Role/Task_Holder:Task Holder] of the [:Process/Implementation:implementation] task creating or correcting the document or script. Tasks are specified in the [:Development#CurrentIterationTaskOverview:current iteration task overview].
 * the [:Process Role/Module Owner:Module Owner] for Documentation, for small corrections.
[[Anchor(DocReviewMethod)]] '''~+Method+~'''[[BR]] A document review involves two or more persons who read the document/script and, in a structured way, identify changes that will improve it. Our document review process consists of planning, a review session and follow-up, as described below.
[[Anchor(DocReviewPlanning)]] '''~+Planning+~'''
- Review participants are specified in the current [Iteration task list]. Usually these are the implementor and another developer.
- The implementor specifies the code to be reviewed in Crucible:
  - Log on to Crucible (http://kb-prod-udv-001.kb.dk:8060). Please refer to the [:Guidelines/GettingCrucibleAccount:Crucible Guidelines for sign-up] if you do not have an account already.
  - Go to your Crucible Dashboard (using the link "My Crucible DashBoard").
- Click on "Create new review" in the top right corner
- Select a proper title (e.g. Feature request Y or Bug Y plus a description). If the review corresponds to a task in the current [Iteration task list], it should have the same title as is written there.
- Set yourself as both 'moderator' and 'author' (unless the code is authored by somebody else).
- Write relevant comments in the "Statement of Objectives", for instance general comments on the review and a list of deleted file or class names that for obvious reasons cannot be included.
- Add the files included in the review. This can be done by selecting files in different changesets or in the repository. Note that we do not review code in the test-branch.
- Add a revision comment to each of the files stating the specific lines to be reviewed (or ALL LINES). Also note anything else of relevance, e.g. lines that have been removed (the file/class name and revision of the file are given automatically by Crucible).
- Notify the reviewer of the existence of the review by clicking "Start Review".[[BR]]Note that it is best to make the wiki review entry (as explained in the next step) before notification.
- Make an entry with the review information in the current [Review Table] in the wiki (see for example ????).
- The participants agree on a time to review.
- Before the review time, each participant reads the code thoroughly and notes problems - big or small - that should be discussed. Place the comments at the relevant places in Crucible. For instance:
  - Add a new general comment:[[BR]]For general problems, such as design problems or problems concerning most of the files, like a missing '.' at the end of JavaDoc.
  - Add a revision comment:[[BR]]For general problems in a specific file, like generally missing JavaDoc in this particular file.
  - Comment on a specific line (marking the line will make a comment field appear):[[BR]]For problems on specific lines of a file, like lack of white space around delimiters.[[BR]]REMEMBER to post the comments by clicking "Post" for each of the comments.
[[Anchor(DocReviewReviewSession)]] '''~+Review Session+~'''[[BR]] This part, while central to the whole process, should not be allowed to drag on forever. If the reviewers cannot agree on how to fix a problem within a few minutes, the item should be marked as "consider how to..." rather than prolonging the discussion.
A typical review session should take no more than an hour (and most take less than that). If it takes longer, the review should be stopped and a time to continue should be agreed upon. More than an hour of straight review reduces the efficiency.
- The participants meet on the phone (meeting in person only if possible)
- Before starting, check that:
  - the code has been unit tested
  - the code has been sanity tested
  - the functionality has been documented in manuals
- If any of these are missing, then the review should be postponed.
- Use Crucible to go through the review
- Log on to Crucible - preferably both reviewers.
- Discuss each posted comment in order of appearance:
  - General comments
  - Revision comments
  - Specific line comments
Those items that the participants do not agree to discard are marked by clicking the "Defect" box, which enables selection of a rank in the "Select rank" drop-down list. When Defect and Rank are specified, the item is posted by clicking "Post".
Note that only the author of a comment can post it. If only one of the reviewers has access to Crucible, the comments owned by the other reviewer must be copied into new comments that can be posted with the mentioned information.
Note the time used for the task in a Crucible general comment for the review, using the following wording:[[BR]]Time used (Coding,Documentation,Review)[[BR]]<InitialsOf1stReviewer>: <NoOfManDaysUsed>[[BR]]<InitialsOf2ndReviewer>: <NoOfManDaysUsed>[[BR]]Remember to set the "Select rank" drop-down list to "Time Used". This is used for the [Iteration review] made at the end of the iteration.
Remember to mark this general comment as a defect and post it - otherwise the information will not be passed to the wiki afterwards.
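Since the time-used comment follows a fixed wording, it can be generated mechanically. The sketch below is purely illustrative; the class and method names are invented and not part of the process:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TimeUsedComment {
    /**
     * Build the "Time used" general comment text for a Crucible review.
     * Keys are reviewer initials, values are man-days used.
     */
    static String format(Map<String, Double> manDaysByReviewer) {
        StringBuilder sb = new StringBuilder("Time used (Coding,Documentation,Review)\n");
        for (Map.Entry<String, Double> e : manDaysByReviewer.entrySet()) {
            sb.append(e.getKey()).append(": ").append(e.getValue()).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // LinkedHashMap keeps the reviewers in insertion order.
        Map<String, Double> time = new LinkedHashMap<>();
        time.put("ABC", 0.5);  // hypothetical initials of 1st reviewer, man-days used
        time.put("XYZ", 0.25); // hypothetical initials of 2nd reviewer
        System.out.print(format(time));
    }
}
```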
- Complete the review by clicking "Complete"
- Agree on who does the follow-up in case flaws are found during the code review. Usually, this will be the implementor.
- Update the table in the current [Review Table] with:
  - Review date
  - Issues found - using the [http://kb-prod-udv-001.kb.dk:8060/plugins/servlet/export Crucible export function]; insert the result on the page IssuesFromNsXX, where XX is the number in the Crucible review id.
  - Follow-up - the initials of the person chosen to do the follow-up.
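The IssuesFromNsXX page name can be derived from the review id. A minimal sketch, assuming the number is the only digit sequence in the id (the id format "NS-33" used in the example is an assumption, as is the class name):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class IssuesPageName {
    /**
     * Derive the wiki page name "IssuesFromNsXX" from a Crucible review id,
     * where XX is the number in the review id.
     */
    static String issuesPage(String reviewId) {
        Matcher m = Pattern.compile("(\\d+)").matcher(reviewId);
        if (!m.find()) {
            throw new IllegalArgumentException("No number in review id: " + reviewId);
        }
        return "IssuesFromNs" + m.group(1);
    }

    public static void main(String[] args) {
        System.out.println(issuesPage("NS-33")); // prints IssuesFromNs33
    }
}
```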
[[Anchor(DocReviewFollowUp)]] '''~+Follow-up+~'''
- The follow-up person goes through the list of items and handles each of them. Depending on how an item is handled, the item is marked under Status on the [link Code Review Class Page paragraph].
- The follow-up person marks the file as fully reviewed on the [link review page] once all items have been handled.
- If the implementor feels the changes are significant enough to require a new review, another review cycle starts. The first review is left as-is. This rarely happens, and should only happen when design issues have been identified and resolved during the review process.
'''~+Review Pages (technical information)+~'''[[BR]] There are two kinds of review pages:
- Code Review Page per Class - contains all reviews made on the class
- Code Review Overview per Iteration - contains an overview of the code reviews made within an iteration
'''~+Code Review Page per Class/JSP-page+~'''[[BR]] Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented.[[BR]]The contents can be seen as guidance on what information should be covered by the review.
Each class/JSP-page has its own page with all code reviews and their documentation made on the specific class/JSP-page.
'''~+Name of Code Review Class/JSP page+~'''[[BR]] Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented.
Each code review page is named according to the code's position in the Java project.
For classes the name is formed from the class and package name as follows:
<Unique package name for class> + <Class name> + "Review"
where each part of the name starts with upper case and continues in lower case (as WikiWords), for example
CommonDistributeChannelsReview[[BR]]for dk/netarkivet/common/distribute/Channels.java (under /trunk/src/)
For JSP pages the name is formed from the JSP-page group and the JSP page name as follows:
<Unique group for JSP-page> + <JSP-page name> + "JSPReview"
where each part of the name starts with upper case and continues in lower case, and "-" characters are skipped, with the letter after a "-" also written in upper case (as WikiWords), for example
HistoryHarveststatusJobdetailsJSPReview[[BR]]for History/!HarvestStatus-jobdetails.jsp (under /trunk/webpages/)
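The naming rules above can be sketched in code. This is an illustrative helper only; the class and method names are invented and not part of the process:

```java
public class ReviewPageName {
    /** One WikiWord part: first letter upper case, rest lower case. */
    static String wikiWord(String part) {
        if (part.isEmpty()) {
            return part;
        }
        return Character.toUpperCase(part.charAt(0)) + part.substring(1).toLowerCase();
    }

    /**
     * Review page name for a class:
     * <Unique package name for class> + <Class name> + "Review".
     */
    static String classReviewPage(String uniquePackage, String className) {
        StringBuilder sb = new StringBuilder();
        for (String part : uniquePackage.split("\\.")) {
            sb.append(wikiWord(part));
        }
        return sb.append(className).append("Review").toString();
    }

    /**
     * Review page name for a JSP page:
     * <Unique group for JSP-page> + <JSP-page name> + "JSPReview",
     * with "-" skipped and the following letter upper-cased.
     */
    static String jspReviewPage(String group, String jspName) {
        String base = jspName.replaceAll("\\.jsp$", "");
        StringBuilder sb = new StringBuilder(wikiWord(group));
        for (String part : base.split("-")) {
            sb.append(wikiWord(part));
        }
        return sb.append("JSPReview").toString();
    }

    public static void main(String[] args) {
        // The two examples from the text above:
        System.out.println(classReviewPage("common.distribute", "Channels"));
        System.out.println(jspReviewPage("History", "HarvestStatus-jobdetails.jsp"));
    }
}
```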
[[Anchor(TablesFilledPageReview)]]
'''~+Code Review Tables filled for each review of a class/JSP-page+~'''[[BR]] Note: this is taken from the current process; it will be changed when a new tool for registration of code reviews is chosen and implemented.[[BR]]The contents can be seen as guidance on what information should be covered by the review.
The tables below (from the [:CodeReviewCodePageTemplate:template]) keep the information for each review of a class/JSP-page (or parts of one). If a class/JSP-page is reviewed more than once, new sections like this are added at the top of the same page. Storing the old reviews with task, date, SVN version and lines has proven useful for tracking down problematic changes and misunderstood designs. An example is CommonDistributeChannelsReview.
[[Anchor(CreationCodeReviewCodePage)]] If an old review page exists on another medium, then a link to it should be referenced.
[[Anchor(TableFilledReviewsOverview)]] An example is Iteration33ReviewsOverview.
[[Anchor(CreationCodeReviewOverviewPage)]]
[[Anchor(UpdateCodeReviewOverviewPage)]]
[[Anchor(CodeReviewInput)]] '''~+Input+~'''[[BR]] Usually the input is code implementing a Tracker Issue which has been unit tested, but depending on the implemented task, it can also be corrected documentation, scripts etc.
[[Anchor(CodeReviewOutput)]] '''~+Output+~'''[[BR]] Reviewed and followed-up input, ready for quality assurance before it can be marked as ready for the release test.
[[Anchor(CodeReviewBackground)]] '''~+Background for Code Review process+~'''[[BR]] The code review process was inspired by [http://satc.gsfc.nasa.gov/fi/fipage.html NASA's ideas for code inspection]. The process has, however, been simplified in order to ease the transition to inspection. As the project group gains experience with inspection, it is recommended that the inspection process be refined. The description focuses on code inspection.
[[Anchor(CodeReviewResourceUsage)]] '''~+Resource Usage+~'''[[BR]] Code review takes time, of course. The actual time spent discussing the code is typically roughly the same as is spent going over the code beforehand. Follow-up can take a varying amount of time, depending on the starting quality of the code and whether significant changes have been found necessary.
Some kinds of code take longer to review than others; for instance, straightforward getter-and-setter style classes go very fast, while a review of a few lines of change in a complex method can take much longer. At the start of the NetarchiveSuite project, we kept track of the time spent preparing for and executing the review (but not doing the follow-up changes to the code). The ratio of preparation time to review time varied, but there was never more than a factor of 2 difference to either side; on average the two were about the same. The number of lines of code reviewed per hour (LoC/h) varied from 88 to 300, with an average of about 170 LoC/h. Later code review times were not recorded, but are likely to be slightly faster due to a better system for taking notes.
[[Anchor(CodeReviewLiterature)]] '''~+Literature+~'''[[BR]] Steve McConnell, ''Rapid Development'', pages 73-74[[BR]][http://satc.gsfc.nasa.gov/fi/fipage.html NASA's ideas for code inspection]
Create new page named as described above