
The Dogma

Inspections and Technical Reviews have proven time and again to be a cost-effective method for improving the quality of software. While the overall concept has proven sound, there is no universal consensus on the process by which it is best achieved.

Historically, much of the discussion regarding the procedural issues of Formal Technical Reviews (FTR) (see terminology footnote) has been replete with commandments of near-biblical proportions. Many have faithfully followed the lead of Fagan in implementing Software Inspections [Fagan'76], while others have designed equally valuable variations on the theme. In the best case, we work out a good process, in line with our own biases and experiences, which helps us toward that goal of finding problems early. Since a good process is often hard to develop, we may grab one that reputedly works well, and thus we would likely acquire a ``process document'' with a full set of behavioral rules, all billed as critical to the success of the FTR program. Here are some excerpts from the literature, just to highlight a few:

The duties of the moderator also include scheduling suitable meeting places, reporting inspection results within one day, and following up on rework... [Fagan'76]
During the meeting ... no discussion as to whether a purported defect is `real' is allowed... [Gilb'88]
The review schedule must include time for each reviewer to complete the questionnaires, and must also allow a chance for discussion... [Parnas & Weiss'87]
Of course, the methods aren't without justification. Ten years after his initial publication, [Fagan'86] writes:
Evaluation of hundreds of inspections involving thousands of programmers in which alternatives to the [inspection] steps have been tried has shown that all these operations are really necessary. Omitting or combining operations has led to degraded inspection efficiency...
And Gilb echoes: ``If you think an inspection rule is unnecessary, you have probably misunderstood the method'' [Gilb'88]. My experience has been quite different, however. Over the last fifteen years I have worked with many development teams, both large and small, across several corporations to implement FTRs as an integral part of improved software development methodologies. I have rarely seen the ``Fagan Process'' embraced with all its details intact.

It is relatively easy to write down a list of Thou shalt... and Thou shalt not... guidelines, but much harder to justify them empirically or in principle. Usually everything involves a tradeoff, and what really matters is where one's own organization stands amidst the pull of competing issues. My goal in this paper is to grab some of the commonly accepted FTR commandments by the horns, wrestle them down to first principles, and draw out lessons for when they make sense and when they don't. For further discussion along these lines, Freedman & Weinberg present other, more specific, points to consider in designing an FTR process that fits [Freedman & Weinberg'83]. They have recommendations of their own, though they usually provide an underlying rationale from which pros and cons can be inferred. The bottom line is to place each discussion in the context of a specific organization when designing a process for a custom fit.


Terminology Footnote: In this paper, I will use the term Formal Technical Review and the acronym FTR to refer to team efforts to eliminate problems in software through group review; it may refer variously to Walkthroughs, Inspections, or Technical Reviews, as appropriate to the context. These are distinct from the U.S. Defense Department's standard set of reviews (such as PDR, CDR, etc.), often called Formal Reviews, which are more specifically organized as contract-monitoring events.

