Reviews - Workshop: Saud Bahwan Group CSD
Why Reviews?
• Reviews reduce rework.
– Rework accounts for 44% of dev. cost!
– By phase: Requirements (1%), Design (12%), Coding (12%), Testing (19%)
• Reviews are pro-active tests.
– They find errors that cannot be found through testing.
• Reviews are training.
– Domain, corporate standards, group.
Industry Experiences
• Aetna Insurance Company:
– FTR found 82% of errors, 25% cost reduction.
• Bell-Northern Research:
– Inspection cost: 1 hour per defect.
– Testing cost: 2-4 hours per defect.
– Post-release cost: 33 hours per defect.
• Hewlett-Packard:
– Est. inspection savings/year: $21,454,000
• IBM (using Cleanroom):
– C system software
– No errors from time of first compile.
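The Bell-Northern numbers invite a quick back-of-the-envelope comparison. A minimal sketch, not from the slides: testing is taken at 3 hrs/defect (midpoint of the 2-4 range), and the defect count and hourly rate are hypothetical illustration values.

```python
# Back-of-the-envelope comparison built on the Bell-Northern per-defect
# costs above. Testing is taken at 3 hrs/defect (midpoint of 2-4); the
# defect count and hourly rate are hypothetical illustration values.
COST_PER_DEFECT_HRS = {"inspection": 1, "testing": 3, "post-release": 33}

defects = 200    # hypothetical defects found in one release
rate = 100       # hypothetical loaded cost, US$/hr

for stage, hrs in COST_PER_DEFECT_HRS.items():
    print(f"{stage:>12}: {defects * hrs:5d} hrs -> ${defects * hrs * rate:,}")
```

On these assumptions, catching the same 200 defects costs $20,000 at inspection time versus $660,000 after release.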
Defect amplification and removal (1)
[Figure: defect amplification without reviews. Unit testing receives 85 defects and, at 50% detection efficiency, passes 42 on; integration testing, also at 50% efficiency, passes 21 latent defects downstream.]
Defect amplification and removal - revisited
[Figure: the same model with 50%-effective reviews after every phase, assuming 1:0.5 amplification. Defects passed downstream: preliminary design 5 (10 without reviews), detailed design 16 (40), coding 25 (85), unit testing 12 (42), integration testing 6 (21).]
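The figures follow a simple recurrence. A minimal sketch of the model (my own formulation, not code from the slides): each phase's output is its input, plus amplified and newly generated defects, minus whatever detection removes.

```python
# A sketch (my formulation) of the amplification model in the figures:
# each phase receives defects, amplifies them (1:0.5 per the slide's
# assumption), adds newly generated ones, and a detection step removes
# some percentage before hand-off to the next phase.
def phase(errors_in, new, amplification=0.5, detection=0.5):
    total = errors_in + errors_in * amplification + new
    return total * (1 - detection)   # defects passed downstream

# Without reviews, nothing is detected before testing:
e = phase(0, 10, detection=0.0)        # preliminary design -> 10
e = phase(e, 25, detection=0.0)        # detailed design    -> 40
e = phase(e, 25, detection=0.0)        # coding             -> 85
e = phase(e, 0, amplification=0.0)     # unit testing       -> 42.5 (fig: 42)
e = phase(e, 0, amplification=0.0)     # integration test   -> 21.25 (fig: 21)
print("latent defects without reviews:", e)

# With 50%-effective reviews after every phase, the same chain yields
# 5 -> 16.25 -> 24.5 -> 12.25 -> 6.125 (the figure rounds to 5, 16, 25, 12, 6).
```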
What can we do to remove errors in early development phases?
Non-execution-based testing!
– Reviews: inspections & walkthroughs
– Presentations
The later a change is addressed, the greater the cost, risk and duration
[Figure: cost, risk, and duration of a change (y-axis) rise steeply across the development phases (x-axis), from requirements definition through preliminary and detailed design, coding, testing, and deployment.]
Reviews: an effectiveness scale
(decreasing effectiveness and formality, top to bottom)
• inspection (FTR)
• walkthrough (FTR)
• formal presentation
• informal presentation
• peer group review
• casual conversation
Effectiveness of inspections
• [Fagan 1976] inspections of design & code
– 67%-82% of all faults were found by inspections
– 25% of programmer resources saved, even counting inspection effort
• [Fagan 1986]
– 93% of all faults were found by inspections
• Cost reduction for fault detection (compared with testing)
– [Ackerman, Buchwald, Lewski 1989]: 85%
– [Fowler 1986]: 90%
– [Bush 1990]: US$25,000 saved per inspection
Why review? Who benefits?
• Formal technical review provides:
– Defect information to the author.
– Information on work product and development to
peers.
– Fault likelihood data to testers.
– Product status to management.
– Process status to the SPI (software process improvement) group.
Objectives of reviews
• Uncover errors in any representation of
software
• Verify that
– software meets its requirements
– software follows predefined standards
– software is developed in a uniform manner
• Make projects more manageable
• Educate new team members
Basic Review Principles
What is Formal Technical Review?
• A method involving a structured encounter in which a group of technical personnel analyzes or improves the quality of the original work product as well as the quality of the method.
True FTR is well-defined
• Well-defined process
– Phases (Kick-Off, etc.)
– Procedures (checklists, etc.)
• Well-defined roles
– Coordinator, Reviewer, Recorder, Author, etc.
• Well-defined objectives
– Defect removal, requirements elicitation, etc.
• Well-defined measurements
– Forms, consistent data collection, etc.
FTR is effective quality improvement
• Reviews can find 60-100% of all defects.
• Reviews are technical, not management.
• Review data can assess/improve quality of:
– work product
– software development process
– review process
• Reviews reduce total project cost, but have non-trivial
cost (~15%)
• Upstream defect removal is 10-100 times cheaper.
• Reviews disseminate domain knowledge, development
skills, and corporate culture.
Who, What, and When
• Who decides what should be reviewed?
– Senior technical personnel, project leader
• What should be reviewed?
– Work products with high impact upon project risks.
– Work products directly related to quality objectives.
– “Upstream” work products have higher impact.
• When should review be planned?
– Specify review method and target work products in
project plan/software development plan/quality plan.
The range of review practices
[Figure: a taxonomy of review practices, spanning non-FTR to FTR and manual to tool-based, for Cleanroom and non-Cleanroom development:
• Non-FTR: Walkthrough (Yourdon89), Code Reading (McConnell93)
• Manual FTR: Active Design Reviews (Parnas85), Code Inspection (Fagan76), Software Review (Humphrey90), Inspection (Gilb93), 2-Person Inspection (Bisant89), N-Fold Inspection (Martin90), Phased Inspection (Knight93)
• Tool-based FTR: FTArm (Johnson94), Scrutiny (Gintell93), TekInspect, CAIS (Mashayekhi94), ICICLE (Brothers90)
• Cleanroom: Verification-based Inspection (Dyer92)]
How to carry out reviews: the review team
• producer
• review leader
• reviewer(s)
• recorder
Basic guidelines (1)
• 3-6 people (typical)
– experienced senior technical staff
– representatives of:
• the team that created the document
• the client
• the team for the next development phase
• the software quality assurance group
• IEEE Standard for Software Reviews and Audits
[IEEE 1028, 1988]
Basic guidelines (2)
• Review leader should be the SQA representative
– SQA has the most to lose
– creator: eager to get approval (to start the next job)
– client: can wait for acceptance testing
• Review leader distributes material
• Advance preparation of max. 2 hours before the
meeting
• Duration: less than 2 hours
Result of a review
• Decision about the product
– accept without further modification
– reject the work due to severe errors (review must be
repeated)
– accept with minor modifications (that can be
incorporated into the document by the producer)
• All participants have to sign off
– shows participation / responsibility
– shows their concurrence with the findings
Reviewer’s preparation
• be sure that you understand the context
• first, skim all the product material to understand
location and format of the information
• next, read product material and annotate hardcopy
• pose your written comments as questions
• avoid issues of style
• inform the review leader if you can’t prepare
Families of Review Methods
Method Family | Typical Goals                                          | Typical Attributes
Walkthroughs  | Minimal overhead                                       | Little/no preparation
Inspections   | Detect and remove all defects efficiently, effectively | Formal process; checklists; measurements; Verify phase
Reviews: Walkthrough
• producer guides the review
• many variations
• presentation reviews:
– many details get overlooked
– presentation overshadows the review
– ego is a key problem
Reviews: Inspection
• formal process
• requires intensive advance preparation
• checklists utilized
• many variations
• product reviews
Reviews: Audits
• external review
• audit the product
• audit the process

Reviews: The Process
• Roles, Rules, Reports
• Presenter (producer)
– presents material objectively
– has team for support
Number of Reviewers                  5              3
Review Preparation (relative)        10 pages/hr    5 pages/hr
Effort, Review Preparation           25 hrs         12 hrs
Effort, Review Session (absolute)    14 hrs         10 hrs
Total Review Effort                  5 person-days  3 person-days
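As a sanity check on the table, total effort is just preparation plus session hours. A minimal sketch; the 8-hour person-day is my assumption, not the slide's.

```python
# Sanity check on the table: total effort is preparation plus session
# hours; the 8-hour person-day is my assumption, not the slide's.
for reviewers, prep_hrs, session_hrs in [(5, 25, 14), (3, 12, 10)]:
    total = prep_hrs + session_hrs
    print(f"{reviewers} reviewers: {total} hrs ~ {round(total / 8)} person-days")
```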
[Figure: review process overview: Planning → Kick-Off → Preparation → Review Meeting (consolidate issues) → Rework (correct defects) → Verify.]
Planning
Planning → Kick-Off → Preparation → Review Meeting → Rework → Verify
• Objectives
– Gather review package: work product, checklists, references, and
data sheets. (Project Leader & Review Coordinator)
– Form inspection team (Review members).
– Determine dates for meetings.
• Procedure
– Review Coordinator assembles team and review package.
– Review Coordinator enhances checklist if needed.
– Review Coordinator plans dates for meetings.
– Review Coordinator checks work product for readiness.
– Review Coordinator helps Author prepare overview.
Kick-Off
Planning → Kick-Off → Preparation → Review Meeting → Rework → Verify
• Objectives
– Author provides overview.
– Reviewers obtain review package.
– Preparation goals established.
– Reviewers commit to participate.
• Procedure
– Moderator distributes review package.
– Author presents overview, if necessary.
– Scribe duty for Review Meeting assigned.
– Moderator reviews preparation procedure.
Preparation
Planning → Kick-Off → Preparation → Review Meeting → Rework → Verify
• Objectives
– Find maximum number of non-minor issues.
• Procedure for reviewers:
– Allocate recommended time to preparation.
– Perform individual review of work product.
– Use checklists and references to focus attention.
– Note critical, severe, and moderate issues on Reviewer
Data Form.
– Note minor issues and author questions on work product.
Example Issue Classification
• Critical
– Defects that may cause the system to hang, crash, produce
incorrect results or behavior, or corrupt user data. No known
work-arounds.
• Severe
– Defects that cause incorrect results or behavior with known
work-arounds. Large and/or important areas of the system are
affected.
• Moderate
– Defects that affect limited areas of functionality that can either
be worked around or ignored.
• Minor
– Defects that can be overlooked with no loss of functionality.
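This classification maps naturally onto a small data structure. A hypothetical encoding; the class and field names below are my own, not part of any inspection standard.

```python
# A hypothetical encoding of this classification; the class and field
# names are my own and not part of any inspection standard.
from dataclasses import dataclass
from enum import Enum, auto

class Severity(Enum):
    CRITICAL = auto()   # hang/crash/corruption, no known work-around
    SEVERE = auto()     # incorrect results, work-around known, large area
    MODERATE = auto()   # limited area; can be worked around or ignored
    MINOR = auto()      # can be overlooked with no loss of functionality

@dataclass
class Issue:
    description: str
    severity: Severity
    location: str = ""  # e.g. page/section of the work product

# Per the Preparation procedure, only non-minor issues go on the
# Reviewer Data Form; minor ones are noted on the work product itself.
def goes_on_data_form(issue: Issue) -> bool:
    return issue.severity is not Severity.MINOR
```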
Example references
• Corporate standards:
– Procedure for Software Quality Plans
• Exemplary documents:
– Foo System Software Quality Plan
• High quality reference texts:
– Software Quality: Concepts And Plans, Ch. 13 (Plan
following an industrial model), Robert Dunn.
• On-line resources:
– http://flute.lanl.gov/SWQA/SMP.html
Why not write on the work product?
• Non-minor issues belong on the Reviewer Data Form so they can be consolidated and measured.

Review Meeting
Planning → Kick-Off → Preparation → Review Meeting → Rework → Verify
• Objectives
– Create consolidated, comprehensive listing of non-minor issues.
– Provide opportunity for group synergy.
– Improve reviewing skill by observing others.
– Create shared knowledge of work product.
• Procedure
– Moderator requests issues sequentially.
– Reviewers raise issues.
– Recorder notes issues on the Scribe Data Sheet.
– The Scribe Data Sheet is visible to everyone.
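The consolidation step can be sketched mechanically, as below. This reuses the hypothetical Issue class from the classification example; the dedup key (location plus normalized description) is my simplification.

```python
# Mechanical sketch of consolidation, reusing the hypothetical Issue
# class from the classification example; the dedup key (location plus
# normalized description) is my simplification.
def consolidate(reviewer_lists: list[list[Issue]]) -> list[Issue]:
    seen, sheet = set(), []
    for issues in reviewer_lists:          # moderator polls each reviewer
        for issue in issues:
            key = (issue.location, issue.description.lower())
            if key not in seen:            # raise each issue only once
                seen.add(key)
                sheet.append(issue)        # recorder adds it to the sheet
    return sheet
```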
Rework
Planning → Kick-Off → Preparation → Review Meeting → Rework → Verify
• Objectives
– Assess each issue, determine if it is a defect, and
remove it if necessary.
– Produce a written disposition for each non-minor issue.
– Resolve minor issues as necessary.
Rework (cont.)
• Procedure
– Author obtains Scribe Data Sheet containing
consolidated issues list as well as copies of work
products.
– Author assesses each issue and notes action taken using
Author Data Sheet.
– Author determines the ‘type’ of each defect
(reqs/spec/design/imp, etc.)
– When finished, the Author provides the Author Data Sheet and
reworked product to the Moderator for the Verify phase.
Example Rework Data
1. Inspection ID _________
2. Document ____________
3. Author ____________
4. Issue Disposition
Num | Fixed | Type | Explanation
5. Effort ____ min

Rework Objectives
– The outcome of every Review Meeting Data Sheet issue is noted on this form.
– All minor issues have been addressed.
– No known defects remain in the work product.
Verify
Planning → Kick-Off → Preparation → Review Meeting → Rework → Verify
• Objectives
– Assess the (reworked) work product quality.
– Assess the inspection process.
– Pass or fail the work product.
• Procedure for moderator:
– Obtain reworked product and Author Data Sheet.
– Review work product/data sheet for problems.
– Provide recommendation for work product.
– Perform sign-off with reviewers.
– Compute summary statistics for inspection.
– Generate any process improvement proposals.
– Enter review data into quality database.
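The "compute summary statistics" step admits many metric choices. One plausible sketch; the metrics below are illustrative, not mandated by the slides.

```python
# One plausible set of summary statistics a moderator might compute;
# the metric choices are illustrative, not mandated by the slides.
def summary_stats(defects_found: int, pages: int,
                  prep_hrs: float, meeting_hrs: float) -> dict:
    total_hrs = prep_hrs + meeting_hrs
    return {
        "defect density (defects/page)": defects_found / pages,
        "detection cost (hrs/defect)": total_hrs / defects_found,
        "review rate (pages/hr)": pages / total_hrs,
    }

print(summary_stats(defects_found=18, pages=40, prep_hrs=12, meeting_hrs=10))
```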