CSCI 265 notes:
- Standards
- Code development should always be based on a set of standards which are applied
within an organisation.
- Possible code style standards might include:
- Basic program layout:
- Header files (includes)
- Other pre-processor directives
- Prototypes (function and type declarations)
- Main function
- Other functions
- Size: no function, including main, should have more than
100 lines of code.
- Each executable should be accompanied (in the same directory)
by a README.TXT file and a MAKEFILE, which should outline all actions
necessary to create the desired executable and possible variations.
- Unless otherwise specified, development is to be carried out
using one of the following packages/compilers:
- Borland 4.02 or higher, with the Power Pack
- Microsoft Visual C++
- Unix compilers: cc, CC, gcc, or g++
- General code styles:
- Hungarian notation for variable names: variable and function names should
not be cryptic - they should convey the role of the function or variable
to anyone reading your code.
- Example: the variable name x means nothing to those
unfamiliar with your program, but the name lNumberOfSteps
clearly indicates that the variable is being used to record some
number of steps.
- Variables should be prefixed with one or more characters indicating
the variable type:
- u : unsigned integer
- l : long integer
- i : integer
- s : short integer
- d : double precision float
- f : floating point
- c : character
- sz : character string
- p : pointer
- a : array
- g : global
- Capitalize the first letter of each word in a variable name, e.g.
dToolCount.
The use of underscores (except as otherwise noted) is discouraged.
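Combining the prefixes with the capitalization rule, declarations and a small
function might look like this sketch (the names and logic are hypothetical):

```c
#include <stddef.h>

int giErrorCount = 0;      /* g + i : global integer         */
long lNumberOfSteps = 0;   /* l     : long integer           */
double dToolCount = 0.0;   /* d     : double precision float */
int aiScores[10];          /* a + i : array of integers      */

/* u prefix: returns an unsigned value; sz prefix: string parameter */
unsigned uCountChars(const char *szText)
{
    unsigned uLength = 0;          /* u : unsigned integer */
    const char *pCurrent = szText; /* p : pointer          */

    while (*pCurrent != '\0')
    {
        uLength++;
        pCurrent++;
    }
    return uLength;
}
```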
- Function names and modules: when a program is divided into modules,
there should be a three-character abbreviation used to uniquely
identify each module.
The name of every function and globally-accessible value declared within the
module should begin with that three-character abbreviation followed by an
underscore.
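For instance, if an inventory module were given the abbreviation inv (a
hypothetical choice), its externally visible names might look like:

```c
/* inventory module - abbreviation: inv */

int inv_giItemTotal = 0;   /* globally-accessible value, prefixed inv_ */

/* every function in the module begins with inv_ */
int inv_AddItems(int iQuantity)
{
    inv_giItemTotal += iQuantity;
    return inv_giItemTotal;
}

int inv_RemoveItems(int iQuantity)
{
    inv_giItemTotal -= iQuantity;
    return inv_giItemTotal;
}
```

The prefix makes it immediately clear which module owns a name, even when the
call appears far from the module's own source file.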
- Document standards might specify the sections which
must be included in the document, content items that must be addressed
in each section, the order of sections, forms which must be
included, the document layout (fonts, page
formats, footnote and bibliography styles), and the support tools to be used
(e.g. specific word processing packages, spell checkers, etc.)
Formal Inspections
This section reviews formal inspection processes as presented
in the NASA formal inspection guidebook.
As overall rules we observe the following (directly quoted from the
guidebook):
- Inspections are carried out on products that have been
completed by their author but not yet tested, reviewed, or otherwise
approved or baselined.
- The objective of the inspection process is to detect and remove
defects. Typical defects are errors of documentation, logic, and function.
- Inspections are carried out by peers of the author.
Participants in the inspection process should represent organizations
that will use or will be affected by the material being inspected.
- Inspections should not be used as a tool to evaluate workers.
Management is not to be present during inspections.
When a management official has technical expertise which is not available
from other sources, that individual may be brought into the third hour.
- A trained moderator leads inspections, and all participants
should have training in the process.
- Inspectors are assigned to and prepare for specific roles
(e.g. reader, recorder, author).
- Inspections are carried out in a prescribed series of steps
from planning through follow-up.
- Inspection meetings are limited to two hours.
- Checklists of questions and typical defects are used to
stimulate defect finding. Project-tailored entrance and exit criteria
should be developed for each type of product to be inspected.
- The product being inspected should be small enough
that it can be inspected during a two-hour meeting.
- Correction of defects is the responsibility of the author,
and is verified by the moderator. The inspection team must refrain
from suggesting methods for correction during the inspection meeting.
- Data and trends on the number of defects, the types of defects,
and the time expended on inspections should be maintained. This information
should be used to evaluate and improve the effectiveness of the inspection
process.
- The focus of inspections is to detect and eliminate defects,
where a defect is any error, nonconformance, or failure to satisfy a
requirement in the product
- The goal of the inspections is to ensure that defects are fixed
early in the life cycle
- A recommended composition for inspection teams is a moderator,
a reader, a recorder, and the author.
The entire team is engaged in inspection, so the
moderator/reader/author/recorder
should not refrain from pointing out defects simply because "it's not their
role"
- Entrance criteria (pre-requisites) may be applied to judge if
a product is ready for the inspection stage. For instance, source code
should be debugged to the point of compiling successfully before an
inspection takes place, documents should be run through spell checkers
or similar tools, etc.
- Before a product is judged to have passed inspection,
exit criteria may be applied such as ensuring that all major defects found
during inspection have been corrected.
- Information relevant to the inspection should be recorded
on appropriate forms, such as
- An inspection announcement - issued by the moderator and detailing
the inspection date, time, location, and team composition
- Logs - prepared by each inspector, detailing defects found
and time spent in preparation
- A defect list, prepared by the recorder and providing information
about each defect
- An inspection report - prepared by the moderator and summarizing
the inspection
- Defect checklists - summarizing some of the common defect types
for guiding the inspectors in searching for defects
- The inspection process is divided into seven stages:
- Planning - the moderator ensures that the product is
ready for inspection, partitions it into sizes appropriate for two-hour
inspection meetings, selects the inspection team and assigns roles,
issues the inspection announcement, and distributes appropriate
inspection materials (copies of the product to be inspected,
preparation logs, background information, and checklists)
- Overview - this stage is optional, and is held if the moderator
feels the team members need greater familiarity with the product to be
inspected. The author of the product to be inspected presents an overview
of the product, the rationale for it, its function, intended use,
relationship to other products, and the development approach used for it.
- Preparation - each individual prepares for the inspection,
by reviewing the product line by line and looking both for general problems
and for problems related to their own specific area(s) of expertise.
The product should simultaneously be checked against other relevant
documents/products for compliance and correctness.
Each individual should record the time spent on this inspection process,
and record each perceived defect in the product.
Copies of the recorded information
are submitted to the moderator prior to the meeting, and the moderator
reviews the logs to ensure the team is adequately prepared for the
meeting.
- Meeting - the moderator calls the meeting to order,
summarizes team roles, and restates the purpose of the inspection
and the product.
The reader then begins a logical and orderly interpretation of the
product, noting the function of items (e.g. might be paragraphs or sections
in documents, functions or control blocks in source code, etc)
and their relationship to the product and higher level documents.
Any inspector may interrupt the reader at any time when an item with
a possible defect is read. A short discussion may be held if necessary
(including questions directed at the author) and the defect is noted
and categorized by the recorder. To keep the meeting on time, time limits
may be placed on the discussion of any given suspected defect -- if the limit
is reached then discussion is terminated and an "unresolved" tag is appended
to the defect categorization.
Time permitting, the unresolved defects may be returned to at the end
of the meeting. Ideally consensus should be reached on each potential defect.
Defects should be prioritized (major or minor), and at the end of the
meeting the
moderator and/or team determines whether a reinspection is needed. If so,
all major defects should be corrected before the reinspection.
The author and moderator should meet briefly to estimate rework time
and schedule a followup meeting.
- Third hour - this is additional time (not necessarily immediately
following the original meeting) scheduled for discussion of unresolved
issues or
discussion of defect corrections. Not all attendees of the original
inspection
need be present at the "third hour", only those with a direct interest in
the issues under discussion.
- Rework - this is the stage in which the author corrects defects
found during the inspection meeting. Major defects must be corrected,
minor defects are to be corrected if time and cost permit.
Any unresolved issues must also be addressed by the author during this phase.
- (Re-inspection) - if deemed necessary during the inspection
meeting
- Followup - this is a short meeting between the author and
the moderator to ensure that all major defects have been corrected and
no secondary defects have been introduced.
- Typical products targeted for formal inspection include:
- The requirements/specifications document
- The architectural design document
- The detailed design document
- The software test plan
- The source code (after compilation but before complete unit testing)
Example Detailed Design Inspection Checklist
CLARITY
Is the overall function of each unit and the design
as a whole clearly described?
Is the intent of all units or processes documented?
Is unit design clearly represented, including data flow,
control flow, and interfaces?
COMPLETENESS
Have the specifications for all units in the program
been provided?
Have all the acceptance criteria been described?
Have the algorithms used to implement each unit been
clearly specified?
Have all the calls made by each unit been listed?
Has the history of inherited designs been documented
along with known risks?
COMPLIANCE
Does the document follow all relevant project and
organizational standards?
Has the design been created using the methodology
and tools required by the organization(s)?
CONSISTENCY
Are data elements named and used consistently
throughout the design?
Is the design of all interfaces consistent among
themselves and with the system interface specifications?
Does the detailed design, together with the architectural
design, fully describe the "as-built" system?
Does the design conform to the architectural design?
DATA USAGE
Are all the variables and constants (including pointers)
defined and initialized?
Are all defined data blocks actually used?
Are literals used where a constant data name should be used?
Are all data blocks specified as used (for structure and usage)?
Are all routines that modify data aware of the data block's usage
by all other routines?
Are all logical units, events, and synchronization flags defined
and initialized?
FUNCTIONALITY
Does the design correctly implement the specified algorithms?
Will the design fulfill the specified requirements and purpose?
INTERFACE
Do argument lists and return types match between calls, prototypes,
and function declaration with respect to number, type, and order?
Are all passed values (including return values) properly
defined and checked?
Are all input and output values properly defined and checked?
Is the data area of passed data mapped as the called routine
expects it to be?
Are messages issued for all error conditions?
Have the parameters been specified in terms of unit of measure,
range of values, accuracy, and precision?
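As a hypothetical illustration of the defects these interface checks are meant
to catch, consider a function whose prototype puts the divisor first - an
argument order a caller can easily get wrong:

```c
#include <stdio.h>

/* Prototype: divisor first, dividend second */
double dSafeDivide(double dDivisor, double dDividend);

double dSafeDivide(double dDivisor, double dDividend)
{
    if (dDivisor == 0.0)
    {
        /* a message is issued for the error condition */
        fprintf(stderr, "error: division by zero\n");
        return 0.0;
    }
    return dDividend / dDivisor;
}

/* Defect an inspector should flag at a call site:
       dSafeDivide(10.0, 2.0)
   The caller intends 10.0 / 2.0 = 5.0, but with the prototype's
   argument order this actually computes 2.0 / 10.0 = 0.2. */
```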
LEVEL OF DETAIL
Is the ratio of code to design documentation less than 10-to-1?
Are all required module attributes defined?
Are all assumptions about design modules adequately documented?
Has sufficient detail been included to develop and maintain the code?
LOGIC CORRECTNESS
Is there logic missing?
Are greater-than, equal to, less-than-zero, or other
conditions each handled?
Are branches correctly stated (e.g. the logic is not reversed)?
Are actions for each case correct?
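A hypothetical instance of the "reversed branch" defect these questions
target, shown next to the corrected logic:

```c
/* Intended behaviour: report whether an account is overdrawn */

/* Defective version an inspection should catch */
int iIsOverdrawnBad(int iBalance)
{
    if (iBalance > 0)   /* defect: the branch is reversed */
        return 1;
    return 0;
}

/* Corrected version: overdrawn means a negative balance */
int iIsOverdrawn(int iBalance)
{
    if (iBalance < 0)
        return 1;
    return 0;
}
```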
MAINTAINABILITY
Does each unit have high internal cohesion and low external coupling?
Are any programming standards in jeopardy because of the design?
Has the complexity of the design been minimized?
Does the design exhibit clarity, readability, and modifiability?
PERFORMANCE
Will all synchronization mechanisms perform as required?
Do processes have time windows?
Have all constraints, such as processing time and size, been specified?
RELIABILITY
Are defaults used and are they correct?
Are boundary checks performed on memory accesses (arrays,
structures, pointers, etc) to ensure memory is not altered?
Have interfaces been checked for inadvertent destruction of data
(e.g. unwanted side effects on passed parameters)?
Is error checking performed on inputs, outputs, interfaces, and results?
Are the potential effects of undesired events considered and planned for?
Do return codes for particular situations match the documented
global definitions?
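One sketch of the boundary checking asked about above, assuming a fixed-size
buffer (the names, size, and return codes are illustrative):

```c
#include <stddef.h>

#define BUFFER_SIZE 8

static int aiBuffer[BUFFER_SIZE];

/* Returns 0 on success, -1 if the index falls outside the array -
   in which case memory is not altered */
int iStoreValue(size_t uIndex, int iValue)
{
    if (uIndex >= BUFFER_SIZE)
        return -1;           /* boundary check before the write */
    aiBuffer[uIndex] = iValue;
    return 0;
}
```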
TESTABILITY
Is the design described in a testable, measurable, and demonstrable form?
Can each unit be tested, demonstrated, analyzed, or inspected
to show that it satisfies requirements?
Does the design contain checkpoints to aid in testing (e.g. assertions,
conditionally-compiled code, etc)?
Have all test drivers, test data sets, and test results been described?
Can all logic be tested?
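The checkpoints mentioned above (assertions and conditionally-compiled code)
might appear in C as follows; the function and its debug output are
illustrative:

```c
#include <assert.h>
#include <stdio.h>

/* Computes the average of iCount integers; iCount must be positive */
double dAverage(const int aiValues[], int iCount)
{
    long lSum = 0;
    int i;

    assert(iCount > 0);   /* checkpoint: precondition checked in test builds */

    for (i = 0; i < iCount; i++)
        lSum += aiValues[i];

#ifdef DEBUG
    /* conditionally-compiled checkpoint, stripped from release builds */
    fprintf(stderr, "dAverage: sum=%ld count=%d\n", lSum, iCount);
#endif

    return (double)lSum / iCount;
}
```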
TRACEABILITY
Are all parts of the design traced back to the requirements?
Can all design decisions be traced back to trade studies?
Have the unit requirements been traced to the specification documents?