CSCI 265 notes:

Formal Inspections

This section reviews the formal inspection process as presented in the NASA formal inspection guidebook.

As overall rules, we observe the following (directly quoted from the guidebook):

  1. Inspections are carried out on products that have been completed by their author but not yet tested, reviewed, or otherwise approved or baselined.
  2. The objective of the inspection process is to detect and remove defects. Typical defects are errors of documentation, logic, and function.
  3. Inspections are carried out by peers of the author. Participants in the inspection process should represent organizations that will use or will be affected by the material being inspected.
  4. Inspections should not be used as a tool to evaluate workers. Management is not to be present during inspections. When a management official has technical expertise which is not available from other sources, that individual may be brought into the third hour.
  5. A trained moderator leads inspections, and all participants should have training in the process.
  6. Inspectors are assigned to and prepare for specific roles (e.g. reader, recorder, author).
  7. Inspections are carried out in a prescribed series of steps from planning through follow-up.
  8. Inspection meetings are limited to two hours.
  9. Checklists of questions and typical defects are used to stimulate defect finding. Project-tailored entrance and exit criteria should be developed for each type of product to be inspected.
  10. The product being inspected should be of an appropriate size that it can be inspected during a two-hour meeting.
  11. Correction of defects is the responsibility of the author, and is verified by the moderator. The inspection team must refrain from suggesting methods for correction during the inspection meeting.
  12. Data and trends on the number of defects, the types of defects, and the time expended on inspections should be maintained. This information should be used to evaluate and improve the effectiveness of the inspection process.
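
Rule 12 implies that a project keeps a running defect log. As a minimal sketch only (the record layout, field names, and defect categories below are illustrative assumptions, not requirements from the guidebook), such a log could capture the type of each defect and the inspection time expended, which is enough to compute the trends the rule asks for.

  /* Minimal sketch of an inspection defect-log record (illustrative only;
     field names and categories are assumptions, not from the guidebook). */
  #include <stdio.h>

  enum defect_type { DOCUMENTATION, LOGIC, FUNCTION };

  struct defect_record {
      enum defect_type type;   /* kind of defect found          */
      int severity;            /* e.g. 1 = minor, 3 = major     */
      double hours_expended;   /* inspection time charged to it */
  };

  int main(void) {
      struct defect_record log[] = {
          { DOCUMENTATION, 1, 0.5 },
          { LOGIC,         3, 1.5 },
      };
      int n = (int)(sizeof log / sizeof log[0]);
      double total = 0.0;
      for (int i = 0; i < n; i++)
          total += log[i].hours_expended;
      printf("%d defects recorded, %.1f hours expended\n", n, total);
      return 0;
  }
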
Example Detailed Design Inspection Checklist

CLARITY
  Is the overall function of each unit and the design 
     as a whole clearly described?
  Is the intent of all units or processes documented?
  Is unit design clearly represented, including data flow, 
      control flow, and interfaces?

COMPLETENESS
  Have the specifications for all units in the program 
       been provided?
  Have all the acceptance criteria been described?
  Have the algorithms used to implement each unit been 
       clearly specified?
  Have all the calls made by each unit been listed?
  Has the history of inherited designs been documented 
       along with known risks?

COMPLIANCE
  Does the document follow all relevant project and 
       organizational standards?
  Has the design been created using the methodology 
       and tools required by the organization(s)?

CONSISTENCY
  Are data elements named and used consistently 
      throughout the design?
  Is the design of all interfaces consistent among 
     themselves and with the system interface specifications?
  Does the detailed design, together with the architectural 
     design, fully describe the "as-built" system?
  Does the design conform to the architectural design?

DATA USAGE
  Are all the variables and constants (including pointers) 
      defined and initialized?
  Are all defined data blocks actually used?
  Are literals used where a constant data name should be used?
  Are all data blocks used in accordance with their
      specifications (for structure and usage)?
  Are all routines that modify data aware of the data block's usage
      by all other routines?
  Are all logical units, events, and synchronization flags defined
      and initialized?
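
To make the DATA USAGE questions concrete, the short fragment below is a hypothetical sketch (not from the guidebook) of the kinds of defects they target: a running total that must be initialized before use, and a named constant used where a bare literal would otherwise appear.

  /* Illustrative data-usage points (hypothetical code, for discussion only). */
  #include <stdio.h>

  #define MAX_READINGS 8            /* named constant instead of a bare literal */

  int main(void) {
      double readings[MAX_READINGS];
      double sum = 0.0;             /* would be a defect if left uninitialized */

      for (int i = 0; i < MAX_READINGS; i++) {  /* using the literal 8 here
                                                   would also be flagged */
          readings[i] = (double)i;
          sum += readings[i];
      }
      printf("sum = %f\n", sum);
      return 0;
  }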

FUNCTIONALITY
  Does the design correctly implement the specified algorithms?
  Will the design fulfill the specified requirements and purpose?

INTERFACE
  Do argument lists and return types match between calls, prototypes,
      and function declarations with respect to number, type, and order?
  Are all passed values (including return values) properly
      defined and checked?
  Are all input and output values properly defined and checked?
  Is the data area of passed data mapped as the called routine
     expects it to be?
  Are messages issued for all error conditions?
  Have the parameters been specified in terms of unit of measure,
     range of values, accuracy, and precision?
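
To illustrate the INTERFACE questions, the sketch below (the function name, unit, and range are hypothetical) documents its parameter's unit of measure and range, checks the input against that range, and returns a code that the caller is expected to check.

  /* Illustrative interface checks (hypothetical function, for discussion only). */
  #include <stdio.h>

  /* set_heater_level: level is a percentage (unit: %), valid range 0..100.
     Returns 0 on success, -1 if the level is out of range. */
  int set_heater_level(int level) {
      if (level < 0 || level > 100)
          return -1;                /* input checked against its documented range */
      /* ... apply the setting ... */
      return 0;
  }

  int main(void) {
      if (set_heater_level(150) != 0)       /* return value checked by the caller */
          fprintf(stderr, "error: heater level out of range\n");
      return 0;
  }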

LEVEL OF DETAIL
  Is the ratio of code to design documentation less than 10-to-1?
  Are all required module attributes defined?
  Are all assumptions about design modules adequately documented?
  Has sufficient detail been included to develop and maintain the code?

LOGIC CORRECTNESS
  Is there logic missing?
  Are greater-than-zero, equal-to-zero, less-than-zero, and other
      conditions each handled?
  Are branches correctly stated (e.g. the logic is not reversed)?
  Are actions for each case correct?
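
As a concrete illustration of the LOGIC CORRECTNESS questions, the hypothetical function below handles the greater-than-zero, less-than-zero, and zero cases explicitly; a reversed branch or a missing zero case is exactly the kind of defect these questions are meant to surface.

  /* Illustrative condition handling (hypothetical code, for discussion only). */
  #include <stdio.h>

  const char *classify(int x) {
      if (x > 0)                    /* a reversed test (e.g. x < 0 here) would be */
          return "positive";        /* a defect flagged under logic correctness   */
      else if (x < 0)
          return "negative";
      else
          return "zero";            /* the easily forgotten zero case */
  }

  int main(void) {
      printf("%s %s %s\n", classify(5), classify(-3), classify(0));
      return 0;
  }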

MAINTAINABILITY
  Does each unit have high internal cohesion and low external coupling?
  Are any programming standards in jeopardy because of the design?
  Has the complexity of the design been minimized?
  Does the design exhibit clarity, readability, and modifiability?

PERFORMANCE
  Will all synchronization mechanisms perform as required?
  Do processes have time windows?
  Have all constraints, such as processing time and size, been specified?

RELIABILITY
  Are defaults used and are they correct?
  Are boundary checks performed on memory accesses (arrays,
      structures, pointers, etc) to ensure memory is not altered?
  Have interfaces been checked for inadvertent destruction of data
      (e.g. unwanted side effects on passed parameters)?
  Is error checking performed on inputs, outputs, interfaces, and results?
  Are the potential effects of undesired events considered and planned for?
  Do return codes for particular situations match the documented
     global definitions?
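
To ground the RELIABILITY questions, the routine below is a hypothetical sketch (the return-code names stand in for whatever the project's documented global definitions are): it bounds-checks the memory access, declares the caller's data const so it cannot be altered as a side effect, and returns one of the documented codes.

  /* Illustrative reliability checks (hypothetical code and return codes). */
  #include <stddef.h>
  #include <stdio.h>

  #define OK        0     /* assumed stand-ins for project-wide return codes */
  #define ERR_RANGE 1

  /* Reads one element without modifying the caller's array (const) and
     with an explicit bounds check before the memory access. */
  int read_element(const double *data, size_t len, size_t index, double *out) {
      if (data == NULL || out == NULL || index >= len)
          return ERR_RANGE;         /* boundary and validity checks before access */
      *out = data[index];
      return OK;
  }

  int main(void) {
      double samples[4] = { 1.0, 2.0, 3.0, 4.0 };
      double value;
      if (read_element(samples, 4, 9, &value) != OK)   /* out-of-range index caught */
          fprintf(stderr, "error: index out of range\n");
      return 0;
  }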

TESTABILITY
  Is the design described in a testable, measurable, and demonstrable form?
  Can each unit be tested, demonstrated, analyzed, or inspected
      to show that it satisfies its requirements?
  Does the design contain checkpoints to aid in testing (e.g. assertions,
       conditionally-compiled code, etc)?
  Have all test drivers, test data sets, and test results been described?
  Can all logic be tested?
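
As an example of the checkpoints mentioned under TESTABILITY, the hypothetical routine below uses an assertion plus a conditionally-compiled debug trace that is enabled only when the code is built with -DDEBUG.

  /* Illustrative testability checkpoints (hypothetical code, for discussion only). */
  #include <assert.h>
  #include <stdio.h>

  int average(const int *values, int count) {
      assert(values != NULL && count > 0);   /* assertion as a design checkpoint */
      long sum = 0;
      for (int i = 0; i < count; i++)
          sum += values[i];
  #ifdef DEBUG
      /* conditionally-compiled checkpoint, enabled with -DDEBUG at build time */
      fprintf(stderr, "average: count=%d sum=%ld\n", count, sum);
  #endif
      return (int)(sum / count);
  }

  int main(void) {
      int data[] = { 2, 4, 6 };
      printf("%d\n", average(data, 3));
      return 0;
  }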

TRACEABILITY
  Are all parts of the design traced back to the requirements?
  Can all design decisions be traced back to trade studies?
  Have the unit requirements been traced to the specification documents?