
PEER REVIEW


DataDrill's best-practice, ready-to-use library of peer review indicators allows organizations to quickly start managing a peer review or inspection process and significantly raise software product quality.  Easy-to-use tools, such as management dashboards and performance dashboards, make DataDrill the top choice for all your peer reviews.

[Chart: Peer Review Effort]

 

DataDrill provides visibility into peer review execution: ensuring that software products are reviewed on time, identifying product weaknesses and removing defects at an early stage.

[Chart: Peer Review Progress Compliance]

 

Statistical analysis of peer review progress checks for unstable or erratic process implementation, offering the opportunity to redirect test activities where they will be most effective.

[Chart: Peer Review Defect Stability]

 

Managing Peer Reviews
Using DataDrill and Code Collaborator


Introduction

Peer Reviews, or inspections of software source code, can significantly raise the quality of a software product by removing defects at an early stage.  In this paper, Distributive Management's DataDrill is used to plan, monitor and control the peer review process while SmartBear's Code Collaborator is used to perform software peer reviews.  When used together, DataDrill and Code Collaborator provide an integrated solution for software peer reviews that allows an organization to quickly lower defects, raise software quality and increase software project management visibility into the software development process.

 

Business Value

A combined DataDrill and Code Collaborator solution offers a number of key features that are not present in internally developed, open source or other commercial solutions:

  • Managers can establish planned expectations and targets for peer reviews and then monitor progress toward goals
  • Up-to-date progress information is delivered to managers regarding peer review findings and issues
  • Built-in rule analysis alerts managers to common mistakes and potential process errors
  • Statistical analysis of peer review progress checks for unstable or erratic process implementation
  • Ready-to-use best practice indicators for peer reviews
  • Pre-built integration requires no technical support

By providing the features above, the DataDrill - Code Collaborator solution empowers an organization to address both the IT project management and technical aspects of peer reviews.  This focused solution helps your organization achieve its intended business goal for peer reviews: to deliver better software products.  Other direct aspects of business value for this solution include:
  • Reduce the number of software defects entering unit, systems and integration test activities
  • Identify issues with requirement and design documents as early as practical
  • Raise the level of software quality for the end user
  • Increase IT project management visibility into the software development process

 

The benefits outlined above are quickly achieved, yielding a short period to realize ROI and to begin developing or expanding a competitive advantage in the delivery of software and software-intensive products.  In addition, you can focus your business on true revenue generation, rather than spending valuable resources (and time) defining technical integrations or detailed peer review indicators.

 

Background

A peer review is a desk audit of software source code by developers, other than the author(s), who check the code to identify issues and potential issues.  Peer reviews can encompass both the software under review and the requirements and standards being satisfied.  For best results, each review should cover a small subset of code, perhaps around 400 lines, which is inspected by other members of the team.  Each participant in a peer review is given a role, with specific actions and responsibilities to be performed.

 

During the review, participants read and inspect the software selection and record any issues or potential issues, as well as the effort expended.  The author of the software resolves the issues by modifying the software, changing a document, or, when appropriate, by rejecting the issue.

 

Code Collaborator is a peer code review tool that lets developers easily review code online without extensive meetings, miles of code print-outs, or convoluted email threads.  The software automatically gathers changes from version control and packages them for easy distribution to reviewers.  Developers can then review and chat directly on the code itself, with Code Collaborator tracking changes and correlating them with the appropriate section of code, even when line numbers change.

 

Another essential ingredient to peer review success is the planning and management activities that accompany the technical activities.  DataDrill from Distributive Management is a software project management solution that provides managers with the ability to plan, monitor and control critical software engineering processes, including peer reviews.  By collecting, analyzing and centralizing all critical software project information, managers are able to use indicators and metrics to quickly spot problem areas and take action, without worrying about the mechanics of data collection and storage.

 

Managing Peer Reviews with DataDrill

 

DataDrill and Code Collaborator Operation

A complete peer review solution addresses the needs of both the software engineers who perform the reviews and the managers who plan and control them.  The simplified diagram below shows how these functions are related.

 

[Diagram: DataDrill and Code Collaborator peer review data flow]

Software managers plan the peer review process, which is then performed by the engineers.  The data maintained in Code Collaborator is collected and analyzed by DataDrill and then provided to managers in the form of indicators and graphs.

 

Tailoring for the Peer Review Process

To collect the indicators and measures described in this document, the "default" peer review is augmented with a few pieces of data that must be entered when the peer review is started and when it is completed.  This additional data is implemented using the built-in "custom fields" within Code Collaborator, and then extracted by DataDrill during the collection process.  In Code Collaborator, we recommend you create the following fields and keep them up to date as a peer review proceeds:

 

[Table: recommended Code Collaborator custom fields]

The fields in the previous table must be added to Code Collaborator to allow management of the peer review process using DataDrill EXPRESS.
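To illustrate how this custom-field data might feed a metrics pipeline, the sketch below (in Python) validates a hypothetical CSV export of reviews.  The field names used here (planned_start, loc_reviewed, and so on) are illustrative assumptions, not the actual names required by DataDrill or Code Collaborator.

    import csv

    # Hypothetical custom fields; the actual names are defined when the
    # fields are created in Code Collaborator (see the table above).
    CUSTOM_FIELDS = ["planned_start", "planned_end", "loc_reviewed", "total_loc"]

    def load_reviews(path):
        """Read a CSV export of reviews; flag records with missing custom fields."""
        reviews, incomplete = [], []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if all(row.get(field) for field in CUSTOM_FIELDS):
                    reviews.append(row)
                else:
                    incomplete.append(row.get("review_id", "unknown"))
        return reviews, incomplete

    reviews, incomplete = load_reviews("reviews_export.csv")
    print(f"{len(reviews)} complete reviews; missing fields on: {incomplete}")

Keeping the custom fields complete matters because every indicator described below is derived from them; a review with missing fields simply drops out of the analysis.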

 

Managing Peer Reviews with DataDrill


The DataDrill - Code Collaborator solution provides out-of-the-box capabilities to monitor the progress of peer reviews.  In DataDrill, the graphs and indicators are combined into an information need that addresses one area of management interest.  For controlling the peer review process, DataDrill provides a peer review information need that addresses the need for managers to:

  • Monitor peer review progress
  • Manage peer review findings
  • Maintain stability of the peer review process
  • Enforce peer review rules and best practices


The following subsections describe the DataDrill peer review information need that addresses these management needs. 

 

Monitor Peer Review Progress

These graphs are designed to help managers understand the number of reviews in process and the timeliness of completing them.  These graphs aid managers in spotting problems in completing reviews before they become critical.  Applicable graphs are shown in the following table.

 

[Table: peer review progress graphs]

The following figure is an example of the graphing provided by DataDrill.  This graph provides three sets of data of interest to managers: on the left axis, the stacked area series show the number of reviews planned as well as the number started and completed before their planned dates; the right axis shows the percent of peer reviews that were started and completed on time; and the alarm bar at the top of the graph provides color status of the number of reviews completed on time.

 

[Figure: Peer Review Progress graph]
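As a rough illustration of the right-axis series, the following sketch computes the percent of reviews started and completed on time from planned versus actual dates.  The record layout and the dates are made up for illustration.

    from datetime import date

    # Hypothetical records: (planned_start, actual_start, planned_end, actual_end)
    reviews = [
        (date(2011, 3, 1),  date(2011, 3, 1),  date(2011, 3, 4),  date(2011, 3, 3)),
        (date(2011, 3, 7),  date(2011, 3, 9),  date(2011, 3, 10), date(2011, 3, 12)),
        (date(2011, 3, 14), date(2011, 3, 14), date(2011, 3, 17), date(2011, 3, 17)),
    ]

    # A review is on time if the actual date does not exceed the planned date.
    started_on_time = sum(actual <= planned for planned, actual, _, _ in reviews)
    completed_on_time = sum(actual <= planned for _, _, planned, actual in reviews)

    print(f"Started on time:   {100 * started_on_time / len(reviews):.0f}%")
    print(f"Completed on time: {100 * completed_on_time / len(reviews):.0f}%")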

Manage Peer Review Findings

This set of graphs provides insight into the findings associated with each peer review.

 

[Table: peer review findings graphs]

The following figure, a graph of Peer Review Coverage By Review, shows managers how much code (as a percent of the total lines of code) has actually been peer reviewed.  Coverage information provides managers with an understanding of the degree of confidence to assume when assessing whether software is ready to proceed into downstream activities.  Software with a low coverage percent is likely to have a higher number of defects and require more testing effort than code with a higher coverage percent.

 

[Figure: Peer Review Coverage By Review]

Notice in the graph above that the bottom axis contains actual review names instead of dates.  This type of graph in DataDrill, called an event-based graph, is useful when examining raw data and conducting analysis for peer reviews, since reviews do not occur on a strict periodic basis.
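The coverage percent behind this graph is a simple cumulative ratio.  A minimal sketch, assuming each review record carries the lines it inspected, that reviews do not overlap, and that the total size of the code base is known:

    # Hypothetical per-review data: lines of code inspected in each review.
    lines_reviewed = {"Review-101": 380, "Review-102": 420, "Review-103": 350}
    total_loc = 12_000  # total size of the code base under review

    # Cumulative coverage after each review, assuming no overlap between reviews.
    cumulative = 0
    for name, loc in lines_reviewed.items():
        cumulative += loc
        print(f"{name}: {100 * cumulative / total_loc:.1f}% cumulative coverage")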

 

Stability of the Peer Review Process

The peer review process must be periodically monitored to ensure that the process does not go out of control.  The idea is that if the peer review process is unstable, then we do not want to use the resulting defect and effort data for decision-making.  The determination of what is considered "out of control" is performed using a statistical technique called a control chart, here based on the "U chart" calculation.

 

 

The following graph shows an analysis of the defects found per review as a function of the number of lines reviewed.  This graph displays series called "UCL" (upper control limit) and "LCL" (lower control limit), which are calculated from the average number of defects found per KSLOC across all reviews.  When the "Defects per KSLOC" value exceeds the UCL, it indicates that the peer review process may not be performed correctly and may be what is termed "unstable".  An unstable process means that when the results of all peer reviews are compared to one another, it appears that at least one of the peer reviews generated significantly more or fewer defects than the others.

[Figure: Defects per KSLOC U chart with UCL and LCL]
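The UCL and LCL series follow the standard U chart formulas: with c defects found in n KSLOC for a given review, the center line is the overall average u-bar = (total defects) / (total KSLOC), and each review's limits are u-bar +/- 3 * sqrt(u-bar / n), floored at zero.  A minimal sketch with made-up data:

    import math

    # Hypothetical data: (defects found, KSLOC reviewed) per peer review.
    samples = [(6, 0.40), (3, 0.35), (9, 0.42), (2, 0.38), (24, 0.41)]

    total_defects = sum(c for c, _ in samples)
    total_ksloc = sum(n for _, n in samples)
    u_bar = total_defects / total_ksloc  # center line: average defects per KSLOC

    for i, (defects, ksloc) in enumerate(samples, 1):
        u = defects / ksloc                                   # this review's rate
        ucl = u_bar + 3 * math.sqrt(u_bar / ksloc)            # upper control limit
        lcl = max(0.0, u_bar - 3 * math.sqrt(u_bar / ksloc))  # lower control limit
        flag = "UNSTABLE" if not (lcl <= u <= ucl) else "ok"
        print(f"Review {i}: u={u:.1f}  LCL={lcl:.1f}  UCL={ucl:.1f}  {flag}")

With this data, the last review falls above its UCL, so it would be investigated before its defect and effort data are used for decision-making.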


Peer Review Rules and Best Practices

This set of graphs provides insight into compliance with peer review rules and best practices.

 

 

An example of the Peer Review Rule Grid is shown below.

 

[Figure: Peer Review Rule Grid]

Note:  the rules and the trigger values in the grid can be tailored in the field.
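As a sketch of how such rules might be checked, using hypothetical rule names and trigger values (the actual rules and triggers are configured in DataDrill and, as noted above, can be tailored in the field):

    # Hypothetical rules with trigger values; each maps a rule name to a check.
    RULES = {
        "review_size_loc":  lambda r: r["loc"] <= 400,       # keep reviews small
        "min_reviewers":    lambda r: r["reviewers"] >= 2,   # enough participants
        "defects_resolved": lambda r: r["open_defects"] == 0,  # nothing left open
    }

    review = {"name": "Review-101", "loc": 520, "reviewers": 2, "open_defects": 1}

    for rule, check in RULES.items():
        status = "pass" if check(review) else "TRIGGERED"
        print(f"{review['name']}: {rule}: {status}")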


Summary

When used together, DataDrill and Code Collaborator provide an integrated solution for conducting software peer reviews that allows an organization to quickly lower defects, raise software quality and increase management visibility into the software development process.  By controlling defects during the coding phase of a software project, organizations can remove more defects earlier, when they are cheaper and easier to find.

The process presented here captures industry best practices and makes them easily deployable in a short amount of time.  Further, the joint DataDrill / Code Collaborator solution addresses both the technical and management aspects of peer reviews with a minimum of technical or infrastructure requirements.