Software Quality Management
BITS Pilani
Pilani Campus

Lecture 1-2

Agenda
• Definitions
– Quality and software Quality

• Views about quality
• Total quality management

Quality
• Hard to define, impossible to measure, easy to recognise
• Example
– Cars in India: Mercedes, Chevy, Indica

• Quality is
– not absolute
– multidimensional
– about acceptable compromises
– criteria of quality interact with each other

Defining quality
• Vague definitions
– The degree of excellence
– Zero defects
– Quality is when the customer comes back, not the product

• Trying to be precise
– ISO
• The totality of features and characteristics of a product or service that bear on its ability to satisfy stated and implied needs

Combining all the definitions
• It is fitness for needs
– Needs => specification conformance
– Fitness => intended purpose

• Example
– Waterfall model of development
• Requirements and design form the need
• Verification and validation form the fitness

Issues in software quality
• Software has no physical existence
• Lack of knowledge of client needs at the start of the project
• Rate of change of hardware / software
• High expectations from the customer / adaptability

Manufacturing vs software
• Manufacturing
– No releases
– No bug fixes
– Purchase a new model for new specifications

• Software
– Patch releases
– Minor and major releases
– New features on top of the old model

Views on quality
• Polyhedron metaphor
– Timeliness, cost, functionality, correctness, reliability, maintainability

• Project manager’s view
– Reliability, Timeliness, maintainability, correctness

• Business analyst
– Functionality, correctness, timeliness, cost

• Programmer
– Functionality, correctness, maintainability


Views on quality
• Quality auditor
–?

• End user
–?

• Line Manager
–?

• Project Sponsor
–?

Garvin’s views of quality
• Transcendent view
– Elegance

• Product based view
– Higher the quality higher the cost

• User based view
– Fitness of purpose

• Manufacturing view
– Conformance to requirements

• Value based view
– Give what the customer requires at the price they want


Total Quality Management
• Coined in 1985 by the Naval Air Systems Command
• Links long-term success with quality and customer satisfaction
• Successful TQM examples
– HP – Total Quality Control
– Motorola – Six Sigma (6σ)
– IBM – Market-driven quality

Software TQM (contd.)
• HP
– Focus on management commitment, leadership, customer focus, total participation and systematic analysis, leading to customer satisfaction

• Motorola
– Stringent quality levels to obtain customer satisfaction through cycle time reduction and participative management

• IBM
– Defect elimination, cycle time reduction, customer and business partner satisfaction, and adherence to the Baldrige assessment discipline

TQM system
• Customer focus
– Total customer satisfaction

• Process
– Continuous process improvement

• Human side of quality
– Company wide quality culture

• Measurement and Analysis
– Continuous improvement with goal oriented measurement

End of Session 1


Hierarchical models for quality – Software Development process
Session 2


Agenda
• Definition of hierarchical models
• Examples of hierarchical models
• Software development process models
• Process maturity framework and quality standards

Hierarchical Model
• Need for a model
– To evaluate quality under different situations

• Model of BITS for assessment
– Subject – 4 tests: EC1, EC2, EC3, and EC4
– EC2, EC4 → open book
– Marks for each test
– Grading
– CGPA

Structure of the Hierarchical Model

Quality Factor
→ Quality Criteria (e.g. Reliability, Maintainability, Usability)
→ Quality Metrics (e.g., for reliability: Accuracy, Consistency, Error Tolerance, Simplicity)

GE Model or McCall’s Model
• Product Revision: Maintainability, Flexibility, Testability
• Product Transition: Portability, Reusability, Interoperability
• Product Operation: Correctness, Reliability, Efficiency, Integrity, Usability

Boehm Model
• General Utility
– As-is utility
• Efficiency
• Human Engineering

– Maintainability
• Testability
• Understandability
• Modifiability

– Portability

Base Factors of Boehm’s model
• Device independence
• Self-containedness
• Accuracy
• Completeness
• Robustness / Integrity
• Consistency
• Accountability
• Device efficiency
• Accessibility
• Communicativeness
• Self-descriptiveness
• Structuredness
• Conciseness
• Legibility
• Augmentability

Common Factors of the model
• Quality criteria are based on the user view
• Models focus on parts that designers can analyse
• These models cannot be tested or validated
– Then what's the point in having them?

• Measurement of quality is a weighted summation of the characteristics – the result is of limited value but useful

Interrelation of quality criteria
• Integrity vs efficiency (I)
• Maintainability vs flexibility (D)
• Usability vs efficiency ?
• Portability vs efficiency ?
• Portability vs reusability ?
• Correctness vs efficiency ?
(I = inverse relationship, D = direct relationship)

• These relationships may not be commutative / some may defy classification!

Gillies model
• Quality = correctness
– Technical factors (conformance to spec)
• Reliability
• Maintainability
• Integrity
• Efficiency
• Usability
• Adaptability
• Interoperability
• Portability

Gillies model
– Business Factors (fitness of purpose)
• Added value
• Cost
• Timeliness of delivery
• User satisfaction
• Ease of transition

Software Development Process Models


Agenda
• Waterfall
• Prototyping
• Spiral
• Iterative development
• OOD
• Cleanroom methodology
• Defect prevention process

Water Fall Development Model
Requirements Gathering and Analysis → Architectural Design → HLD/Inspection → LLD/Inspection → Coding/Inspection → Unit Testing → Integration, Component and System Test → Alpha and Beta Testing → Release

Waterfall
• Document-driven approach
• Risk takes a back seat

Prototyping
• Useful when system requirements change with time
• All external interfaces made available
• Two variations
– Throw-away prototyping
– Evolutionary prototyping

Cycle: Requirements gathering and analysis → Quick design → Build prototype → Customer evaluation → Refine design and prototype

Spiral Model
(Diagram: spiral with cumulative cost increasing outward; each cycle includes a review and planning of the next phases.)

Spiral Model
• Advantages
– Risk analysis and risk-driven approach
– Prototyping is important
– Uses simulations, models and benchmarks
– Reviews are done
– Prepares for growth, evolution and changes

• Disadvantages
– Matching to contract software
– Relying on risk management expertise
– Need for further elaboration of spiral steps


Iterative Development
• a.k.a. the iterative enhancement approach
• Prototyping + Waterfall + Spiral = IDP
• IBM OS/2 used this approach

OO Development
• Model the essential system
– Use case

• Derive candidate essential classes
– Found in external entities, data stores, input flows and process specs.

• Constrain the essential model
– Bring the target implementation environment into picture

• Derive additional classes
– Add classes specific to implementation

OO Development
• Synthesize classes
– Refinement of classes using OOP

• Define interfaces
• Complete the design
– Interaction, states etc.

• Implement the solution
– Classes coded and unit tested

Cleanroom Methodology


Defect Prevention Process


Process Maturity Framework and Quality Standards


Agenda
• SEI assessment
• SPR assessment
• Malcolm Baldrige assessment
• ISO

SEI-CMM
• Initial
– Chaotic, unpredictable cost, schedule and quality performance

• Repeatable
– Requirements management
– Software project planning and management
– Software subcontract management
– SQA
– SCM

SEI-CMM
• Defined
– Organisational process improvement
– Organisation process definition
– Training program
– Integrated software management
– Software product engineering
– Intergroup coordination
– Peer reviews

SEI-CMM
• Managed
– Process measurement and analysis
– Quality management

• Optimizing
– Defect prevention
– Technology innovation
– Process change management

CMMI
• Initial
– Processes are ad hoc and chaotic

• Managed
– Requirements management
– Project planning
– Project monitoring and control
– Supplier agreement management
– Measurement and analysis
– Process and product QA
– SCM

CMMI
• Defined
– Requirements development
– Technical solution
– Product integration
– Verification
– Validation
– Organizational process focus
– Organizational process definition
– Integrated product management
– Risk management
– Decision analysis and resolution
– Organizational environment for integration
– Integrated teaming

CMMI
• Quantitatively managed
– Organizational process performance
– Quantitative project management

• Optimizing
– Organizational innovation and deployment
– Causal analysis and resolution

• Capability levels of individual process areas
– L0 – Incomplete
– L1 – Performed
– L2 – Managed
– L3 – Defined
– L4 – Quantitatively managed
– L5 – Optimizing

Software Productivity Research Assessment
• 5-point scale
– Excellent, Good, Average, Below Average, Poor

• Questions on
– Quality and productivity measurements
– Pretest defect removal experience among programmers
– Test defect removal experience among programmers
– Project quality and reliability targets
– Pretest defect removal at project level
– Project testing defect removal
– Post-test defect removal
– Projects and software products being assessed
– Software technologies used
– Software processes used
– Ergonomics and work environment for staff
– Personnel training for staff and management

• Findings are grouped into five categories

Malcolm Baldrige Assessment
• Most prestigious quality award in the US
• 7 categories and 28 examination items
• Categories
– Leadership
– Information and analysis
– Strategic quality planning
– HR utilization
– Quality assurance of products and services
– Quality results
– Customer satisfaction

• Scoring is based on
– Approach, deployment and results

Malcolm Baldrige Assessment
• Purpose of assessment
– Elevate quality standards
– Facilitate sharing and communication within the organization
– Serve as a working tool for planning, training, assessment and other uses
– Provide a basis for making the award
– Provide feedback to applicants

• 1000 points in the award criteria
• Each examination item has a % score; scores above 70% will be considered for the award
• The feedback is very important, not the award

ISO
• Evaluated on 20 elements
• Abroad, about 60–70% of organizations are rejected
• In India it is available at a cost!
• Documentation, documentation, documentation – all well tracked
• Looks at both
– Product metrics
– Process metrics

Notes
• ISO and MB models are complementary
• SEI and SPR look at the maturity of the organization
• ISO and MB assess the organization irrespective of the industry
• From the next class we will focus on measurement and metrics, which are used in all of the above!

Measurement and Metrics
BITS Pilani
Pilani Campus

Lecture 3-4


Agenda
• Metrics and measures
• Pitfalls
• Some metrics for product, project and process

Measurement and Metrics
• Why measure?
• Metric
– An indicator of one or more quality criteria that we seek to measure

• Metric characteristics
– Must be clearly linked to the quality criteria
– Be sensitive to the different degrees of the criteria
– Provide an objective determination of the criteria that can be mapped to a suitable scale

Example
• What are the ways to measure strength?
• Metrics used can be
– Kg you can pull?
– Time you can run?
– Temperature you can bear?

• How do we measure temperature?
– Linear expansion – so?

Metrics
• Type
– Predictive
• Used for prediction

– Descriptive
• Talks about the system at that instant

• Good metrics are
– Objective
– Reliable
– Valid
– Standard
– Comparable
– Economical
– Useful

Common Metrics and Quality
• Readability, complexity, modularity and testability of a document give a measure of usability and maintainability
• Error prediction and detection give a measure of correctness
• MTTF and complexity give a measure of reliability

Limitation of Common Metrics
• Relationships cannot be validated
• Generally not objective
• Quality is a relative quantity – not absolute
• Depend on a small set of measurable parameters
• The complete set of quality criteria is not measured
• A metric may measure more than one criterion

Simple Methods for calculating metrics
• Simple scoring
– Score each criterion and take the average

• Weighted scoring
– As above, but weight each score and divide by the sum of the weights

• Phased weighted score
– Take the different phases into account

• Polarity profiling
– Desirable vs achieved
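
A minimal sketch of the simple and weighted scoring methods above; the criteria, scores and weights are illustrative assumptions.

scores = {"reliability": 4, "usability": 3, "efficiency": 5}   # 1-5 scale
weights = {"reliability": 3, "usability": 2, "efficiency": 1}  # importance

simple = sum(scores.values()) / len(scores)                  # plain average
weighted = (sum(scores[c] * weights[c] for c in scores)
            / sum(weights.values()))                         # weighted average

print(f"simple score:   {simple:.2f}")    # 4.00
print(f"weighted score: {weighted:.2f}")  # (4*3 + 3*2 + 5*1) / 6 = 3.83

A phased weighted score would repeat the weighted calculation per phase and combine the phase results using phase weights.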

Complex Metrics
• Metrics are derived from measures
• Example
– Prove: the more rigorously the front end of the development process is executed, the better the quality at the back end
– Steps
• What is the front end of the development process?
• What is the back end of the development process?

Complex Metrics
• Front end (rigour)
– Inspection
• Number of lines inspected
• Final scoring of the inspection on a Likert scale (very effective to poor)
– Testing
• Coverage
• Defect rate in terms of KLOC

• Form hypotheses
– The higher the metrics during inspection, the lower they will be during testing
– Good final scoring leads to a lower defect rate
– Good inspection and more coverage vs fewer defects

• Get test data
• Correlate with the hypotheses or do statistical analysis
• If the data confirm the hypotheses, we are OK
• If they refute them, search for missing links

Two ways
• Abstract: Theory → Proposition → Hypothesis → Data analysis
• Empirical: Concept → Definition → Operational definition (e.g., software defect rate) → Measurements in the real world

Levels of measurement
• Nominal scale
– Jointly exhaustive and mutually exclusive categories
• Example
– Types of software process

• Ordinal scale
– a > b > c

• Interval or ratio scales
– A/B or A:B

Basic Measures
• Ratio
– a/b; what if a > b?

• Proportion
– a + b + c = N => a/N + b/N + c/N = 1

• Percentage
– Ratio / proportion expressed in terms of 100
– Often problematic, as a % depends on its base

• Rate
– Dynamic measure
– Associated with a changing quantity

Six Sigma
• Stringent quality measure
• 3.4 defective parts per million
• Area under the curve of a normal distribution

Confidence Interval
1σ  68.26894921371%
2σ  95.44997361036%
3σ  99.73002039367%
4σ  99.99366575163%
5σ  99.99994266969%
6σ  99.99999980268%
7σ  99.99999999974%
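
A minimal sketch showing where these percentages come from: the probability that a normal variable falls within k sigma of the mean is erf(k / sqrt(2)).

import math

for k in range(1, 8):
    p = math.erf(k / math.sqrt(2))        # P(|X - mu| <= k * sigma)
    print(f"{k} sigma: {100 * p:.11f}%")

Note that the well-known "3.4 defective parts per million" figure assumes the conventional 1.5-sigma long-term shift of the process mean; the strict two-sided area outside 6 sigma is only about 0.002 ppm.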

Reliable measurements
• Depends on the situation
• Usually
– Index of variation = standard deviation / mean

• Reliability vs validity
– Wrong result 100% of the time (reliable but not valid)
– 10% right but the others < 50%
– 100% right all the time (reliable and valid)
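
A minimal sketch of the index of variation; the repeated measurements are illustrative assumptions.

import statistics

measurements = [12.1, 11.8, 12.4, 12.0, 11.9]  # repeated measurements of one quantity
iv = statistics.stdev(measurements) / statistics.mean(measurements)
print(f"index of variation: {iv:.3f}")  # smaller => more reliable measurement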

Measurement Errors
• Systematic measurement error
– Related to validity
– Can be overcome with averages

• Random measurement error
– Related to reliability
– Methods
• Test/retest
• Alternative form
• Split-half method

• Finally, use correlation, but be careful while using it

Correct Measure

A → B: for measure A to be a correct indicator of B, the relationship should be (1) causal, (2) correlated, and (3) with no spurious relationship

Metrics
• Product metrics
• Process metrics
• Project metrics

Product Metrics
• Mean time to failure
– Error = human mistake resulting in incorrect software
– Fault = due to an error; the unit fails
– Failure = cannot function as per requirements

• Defect density
– Defect = anomaly in the product
– Calculated in terms of shipped source instructions and changed source instructions

• Customer problems
• Customer satisfaction
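
A minimal sketch of defect density over shipped source instructions; the counts are illustrative assumptions.

defects_found = 120        # valid unique defects reported against the release
shipped_kssi = 85.0        # thousand shipped source instructions (KSSI)

defect_density = defects_found / shipped_kssi
print(f"defect density: {defect_density:.2f} defects per KSSI")  # 1.41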

Function Points
• External inputs (weight 3–6), external outputs (4–7)
• Internal files (7–15), external files (5–10)
• Queries (3–6)
• Calculate the weighted sum = function count (FC)
• Rate the 14 factors on a 5-point scale
• Do the value adjustment
– VAF = 0.65 + 0.01 * sum(ci)

• FP = FC * VAF
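
A minimal sketch of the function-point calculation above; the component counts and the chosen weights are illustrative assumptions.

components = {         # (count, weight chosen within the allowed range)
    "external_inputs":  (10, 4),   # weight 3-6
    "external_outputs": (7, 5),    # weight 4-7
    "internal_files":   (4, 10),   # weight 7-15
    "external_files":   (2, 7),    # weight 5-10
    "queries":          (6, 4),    # weight 3-6
}

fc = sum(count * weight for count, weight in components.values())

gsc = [3, 2, 4, 3, 1, 0, 2, 3, 4, 2, 1, 3, 2, 2]  # 14 factors, each rated 0-5
vaf = 0.65 + 0.01 * sum(gsc)

fp = fc * vaf
print(f"FC = {fc}, VAF = {vaf:.2f}, FP = {fp:.1f}")  # FC = 153, VAF = 0.97, FP = 148.4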

SEI - CMM
• Defect rate per function point
– L1 = 0.75
– L2 = 0.44
– L3 = 0.27
– L4 = 0.14
– L5 = 0.05

Process Quality metrics
• Defect density during machine testing
• Defect arrival pattern
• Phase-based defect removal pattern
• Defect removal effectiveness
– DRE = (defects removed during a development phase / defects latent in the product) expressed as a percentage

Software Maintenance
• Fix backlog and backlog management index
– BMI = (problems closed / problems arrived) per month, in %

• Fix response time and responsiveness
– Mean time for all problems closed

• Percent delinquent fixes
– (Number of fixes that exceeded the response time criteria by severity level / number of fixes delivered in that time) as a percentage

• Fix quality
– (Number of defective fixes / number of fixes) as a percentage
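
A minimal sketch of the maintenance metrics above; the monthly counts are illustrative assumptions.

problems_arrived = 180
problems_closed = 162
delinquent_fixes = 9        # fixes that missed the response-time criteria
defective_fixes = 4
fixes_delivered = 150

bmi = 100 * problems_closed / problems_arrived
percent_delinquent = 100 * delinquent_fixes / fixes_delivered
fix_quality = 100 * defective_fixes / fixes_delivered   # lower is better

print(f"BMI: {bmi:.1f}%")                              # 90.0% (backlog grows if < 100%)
print(f"delinquent fixes: {percent_delinquent:.1f}%")  # 6.0%
print(f"defective fixes:  {fix_quality:.1f}%")         # 2.7%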

Collecting data
• Establish the goal for data collection
• Develop a list of questions
• Establish data categories
• Design and test data collection forms
• Collect and validate data
• Analyse data

Case Study
• Read the metrics programs of Motorola, HP and IBM in your book

Summary
Thanks


Quality Management Systems
BITS Pilani
Pilani Campus

Lecture 5-6

History
• 3 gurus
– Deming
• Conformability and dependability
• Background in statistics
• No posters

– Juran
• His idea of quality was fitness for purpose
• No posters

– Crosby
• Zero defects – book "Quality is Free"
• Ran poster campaigns

• W. Edwards Deming
– Worked with Juran by accident

Deming’s 14 point for management
• Constancy of purpose
• A new philosophy
– A single source of supply rather than tenders

• Cease dependence on inspection
• End lowest-tender contracts
• Improve every process
• Institute training on the job
• Institute leadership

Deming’s 14 point for management
• Drive out fear
• Break down barriers
• Eliminate exhortations
• Eliminate targets
• Permit pride of workmanship
• Encourage education
• Create top management structures

Juran’s 10 steps for quality
• Build awareness of the need and opportunity for improvement
• Set goals for improvement
• Organize to reach the goals
• Provide training
• Carry out projects to solve problems

Juran’s 10 steps for quality
• Report progress
• Give recognition
• Communicate results
• Keep score
• Maintain momentum by making annual improvement part of the regular process of the company

Crosby
• Aims at defect prevention
• Suggests a quality vaccine
– Determination, education, implementation

• 4 absolutes of quality
– Definition: conformance to requirements
– System: prevention
– Performance standard: zero defects
– Measurement: the price of non-conformance

14 steps of Crosby
• Make it clear that management is committed to quality
• Form a QMT with each department represented
• Determine where the current and potential problems lie

14 steps of Crosby
• Evaluate the cost of quality and explain its use as a tool
• Raise the quality awareness and concern of all employees
• Take actions to correct problems identified
• Establish a committee for the zero defects programme

14 steps of Crosby
• Train supervisors to actively carry out their role in quality improvement
• Hold a "zero defects day" for all employees to highlight the changes
• Encourage individuals to establish improvement goals

14 steps of Crosby
• Encourage communication with management about obstacles to improvement
• Recognize and appreciate participants
• Establish quality councils to aid communication
• Do it all over again, to show it never ends

QMS
• The organizational structure, responsibilities, procedures, processes and resources for implementing quality – ISO definition
• At best
– Gives a framework

• At worst
– A bureaucratic framework

• Must include
– QA and quality improvement

TQM
• A method for ridding people's lives of wasted effort by involving everybody in the process of improving the effectiveness of work, so that results are achieved in less time
• Kanji's definition
– Quality is to satisfy customers' requirements continually
– Total quality is to achieve quality at low cost
– Total quality management is to obtain total quality by involving everyone's daily commitment

QIP
• Used to refine the QMS
• Elements of a QMS
– Organizational structure
– Responsibility
– Procedures
– Processes
– Resources

• Establishing a quality culture is part of QIP

Monitoring quality
• Use the 7 tools described earlier


Human factors in QMS
• Staff acceptance of procedures, tools and techniques
• Needs
– Management commitment
– A team approach
– An organizational quality climate

• Quality circles or quality improvement teams

External standards
• Most people are cynical
• Quality needs to be enforced
• Quality is a personal factor, hence conflicts arise
• Groupisms
– More revenue vs. quality

Parts of QMS
• Development procedures
– Methodology and use of tools, distribution of expertise to maintain quality

• Quality control
– Meetings, planning, user sign off, change control etc.

• Quality improvement
– QIT, Quality circles

• Quality Assurance
– Monitoring , assessment


Benefits
• Cost
• Timeliness
• Reliability
• Functionality
• Maintainability

Kaizen
• Emphasis is on improvement and inspection
• Quality must fully meet consumer needs
• Customer-centred view
• Small cooperative groups and emphasis on workers' suggestions

Questions?


ISO 9000 Series of Quality Management Standards


Agenda
• Purpose of the standard
• Details of the standard
• Compliance
• Audit
• TickIT

Purpose
• A model of best practice to be followed / compared against
• Not a better way of designing things
• Conformance to a standard way of working
• The implementation meets the standard over a continuous period

Types of accreditation
• First party
• Second party
– Military suppliers

• Third party

Advantages of accreditation
• Self-confidence
• External credibility
• Part of tenders
• Inclusion in buyers' guides compiled by accreditation bodies and circulated to potential customers

Standards
• ISO 9000
– A guide to select appropriate standard

• 9001
– Design, Development, Production, Installation and Service

• 9002
– Production and Installation

• 9003
– Final inspection and testing

• 9004
– Guidance for QMS to set up 1/2/3 standards


Fundamental principles of ISO 9001 Model
• Do it right the first time
• It should fit the purpose
• 20 sub-clauses under clause 4 are used

Management Responsibility
• Provision of a management representative
• Responsible for quality and accountable to senior management
• Basic principles for establishing a quality system are explained

Quality System
• Organizational quality system
• Documented
• A quality plan and manual must be prepared

Contract Review
• Each customer order is a contract
• Order entry procedures
• Goals
– Customer requirements are given in writing
– Differences are highlighted so that they can be agreed
– Ensure that the requirements can be met

Design Control
• Control and verify design activities
– Planning for R&D
– Assignment to qualified staff
– Identify interfaces between relevant groups
– Preparation of a design brief
– Production of technical data
– Verification that the output of the design phase meets the input requirements
– Identification of all changes and modifications, and documenting the same

Document Control
• Level 1 documents
– Planning and policy documents

• Level 2 documents
– Procedures

• Level 3 documents
– Detailed instructions


Purchasing
• All purchased products conform to the standards and requirements of the organization
• If the supplying organization conforms to ISO, the procedures are simplified

Purchaser supplied product
• To trace all purchaser supplied product at every stage of the process as well as storage


Product Identification and traceability
• Procedures to trace products between inputs and outputs
• Enables quality problems to be traced to their root causes

Process Control
• Documented processes
• Procedures for calibration
• Documented instructions for staff

Inspection and Testing
• Done at three stages
– Incoming
– In process
– Outgoing

• Should not reveal many issues / bugs

Inspection, measuring and testing equipment
• Calibration of equipment
• Maintenance of equipment at regular intervals
• Documentation of the above

Inspection and Testing Status
• Awaiting inspection or test
• Finished inspection
– Passed
– Failed

Control of nonconforming product
• Nonconformance
– All products or services falling outside tolerance limits agreed in advance with customer

• Identified, documented, physically separated
• Procedures needed for handling them
• Selling of nonconforming products is allowed!

Corrective Action
• Records are kept so that future audits can investigate its effectiveness
• Should have systematic procedures and define the duties of all parties

Handling, storage, packaging and delivery
• All activities that are a contractual obligation to the customer
• Subcontractors employed for transportation are subject to this

Quality records
• Records must be fit for the intended purpose
• No specific format / structure

Internal Quality Audits
• Documented
• Internal auditors
• Records for external auditors if needed
• Corrective actions documented

Training
• Implemented and documented
• Written procedures to
– Establish training needs
– Carry out training activities
– Record training requirements
– Record completion of requirements

• Also informal knowledge sharing

Service
• Servicing procedures are documented
• Sufficient resources are available
• Set up good interfaces with the customer for service

Statistical Techniques
• Use where appropriate
• Used in process control

Seeking accreditation
• Implement a quality system with outside help
• Gain internal acceptance
• Pre-inspection quality audit by a third party
• Accreditation body audit
• Surprise inspections twice a year

ISO 9000-3: Notes for guidance on applying ISO 9001 to software development
• Section 4 – Quality system framework
– Management responsibility, quality system, internal audits, and corrective action
• Section 5 – Quality system – lifecycle activities
• Section 6 – Quality system – supporting activities
– Configuration management, measurement systems, practices and conventions, rules, tools and techniques

TickIT
• Boosts awareness of certification and quality
• Survey results – read the book
• Third-party accreditation

Questions


Quality Tools
BITS Pilani
Pilani Campus

Lecture 5


S. Raman

Agenda
• Seven tools and their usage


Statistical Tools
• Proposed by Ishikawa
• Seven basic tools
• Useful for project managers and leaders
• Do not show developers how to improve quality

Seven basic tools

Some books list the check sheet instead of control charts.

(The tool slides that follow are figures; each tool is shown with its construction, its usage and an example.)

Histogram

Creating a histogram

Use of a histogram

Pareto Charts

Pareto Charts – Construction
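
A minimal sketch of Pareto-chart construction, assuming matplotlib is available; the defect categories and counts are illustrative assumptions.

import matplotlib.pyplot as plt

counts = {"interface": 42, "logic": 27, "data handling": 13,
          "documentation": 8, "standards": 5}

# Sort descending - the defining step of a Pareto chart
items = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
labels = [k for k, _ in items]
values = [v for _, v in items]
total = sum(values)
cumulative = [sum(values[:i + 1]) * 100 / total for i in range(len(values))]

fig, ax = plt.subplots()
ax.bar(labels, values)                    # frequency bars
ax2 = ax.twinx()
ax2.plot(labels, cumulative, marker="o")  # cumulative-percentage line
ax.set_ylabel("defect count")
ax2.set_ylabel("cumulative %")
plt.show()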

Usage

Example

Cause–Effect Diagram

Constructing a Cause–Effect Diagram

Usage – Cause–Effect

Example

Scatter Diagrams

Construction

Usage

Example

Flow Charts

Creation

Use

Run Charts

Creating Run Charts

Example

Control Charts

Development of Control Charts
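
A minimal sketch of one way control-chart limits can be developed, using a c-chart for defect counts; the weekly counts are illustrative assumptions.

import math

weekly_defects = [7, 9, 6, 11, 8, 5, 10, 9, 7, 8]

c_bar = sum(weekly_defects) / len(weekly_defects)   # centre line
ucl = c_bar + 3 * math.sqrt(c_bar)                  # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))        # lower control limit

print(f"CL = {c_bar:.1f}, UCL = {ucl:.1f}, LCL = {lcl:.1f}")
for week, c in enumerate(weekly_defects, 1):
    if not lcl <= c <= ucl:
        print(f"week {week}: {c} defects - out of control")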

Usage

Summary

Models and Standards for Process Improvement
BITS Pilani
Pilani Campus

Why not ISO?
• More documentation
• Patchwork for software
• Mostly for blue-collar jobs
• More focus on processes
• Less focus on results or analysis

CMM
• Initial
– Undefined controls and processes

• Repeatable
– Standardized methods for repeatable processes

• Defined
– Monitors and improves processes

• Managed
– Advanced controls, metrics and feedback

• Optimizing
– Metrics for optimization purposes


Level 1 - Initial
• Chaotic processes
– Unpredictable cost, schedule and quality performance

• Reasons
– Unplanned commitments
– Gurus
– Magic
– Problems of scale

Repeatable
• Management commitment
– Project management systems
– Quarterly reviews
• Project reviews
– Milestones, financial status, issues, staff comments
• Computing support review
– Performance measures, issues
• Process status
– Assessment updates, technology plan status, AI status
• Organizational performance
– Productivity, quality and AI summary

Repeatable
• Project plan
– Planning
• Work breakdown structure
– Size and estimation
• LoC, function points
– Productivity and scheduling
• User productivity
• Project tracking and checkpoints

Repeatable
• Software configuration management
– Baselines
• Configuration control
• Change management
• Revisions, versions, deltas, conditional code
– Tools
• System libraries
– Releases
• Authorized releases
• Change control boards

Repeatable
• Software quality assurance
– SQA plan
• To ensure an appropriate development methodology
• Standardize procedures and processes
• Make projects auditable by external agencies
• Change control measures
– SQA people
• Do not staff SQA with fresh people
– Independent verification and validation

Defined Process
• Software standards
• Software inspection
– Inspection training
– Reports and tracking

• Software testing
– Different kinds of testing

Defined Process
• Advanced configuration management
– Design and control of the configuration during different phases
– Software configuration audit

• Defining the software processes
– Process models

Defined Process
• Software engineering process groups (SEPG)
– Role
• Identify key problems
• Establish priorities
• Define action plans
• Get professional and management agreement
• Assign people
• Provide guidance
• Launch
• Track progress
• Fix problems
– Standards establishment
– Process databases
– Education and training processes
– Consultation and assessments

Managed Processes
• Data gathering and analysis
– Software measures and metrics
– Analysis

• Managing software quality
– Measurement criteria
– Estimating software quality
– Quality goals
– Quality plans
– Tracking and controlling

Optimizing Processes
• Defect prevention
– Processes
• Cause–effect categories

• Automating the software process
– Organizational plans
– Technology transitions

• Software contracting
– Negotiation
– Process management
– Certification

Assessment Overview
• Not an audit
• Reviews the organization's software capabilities
• Advises the management and professionals on how they can improve operations
• Identifies high-priority areas of improvement
• Only guidance on the improvement of processes

Assessment Phases
• Preparation
– Senior management approval of the commitment to quality
– Training programme for the assessment team

• Assessment
– Questions, with matching answers and proofs

• Recommendations
– A local team for putting action items in place
Assessment Principles
• The process model is the basis of assessment
• Confidentiality
• Senior management involvement
• An attitude of respect for the views of people in the organization being assessed
• Action orientation

Assessment Process - I
• Form an assessment team
• Train the assessment team
• Keep the assessment process confidential
• The site manager or head participates in the opening and closing of the assessment
• A person responsible for implementing action items is assigned

Assessment Process - II
• Day 1
– Assessment overview (head, staff and participants)
– Briefing (participants)
– Questionnaire (project representatives)
• Completed for each project / project type
– Project discussions (project representatives)
• To clarify any questions and any evidence that is needed

Assessment Process - III
• Day 2
– Functional area interviews (functional representatives)
– Preliminary findings (assessment team)

• Day 3
– Project discussions (project representatives)
– Findings formulation (assessment team)

• Day 4
– Findings dry run (assessment team)
– Findings review (project representatives)
– Findings presentation (head, staff, participants)
– Senior management meeting (head and assessment team leader)

• Assessment postmortem (assessment team)

Conducting Project Assessments

Scope
• Assess end-to-end methodologies for the development and management of the project
• Overall development effectiveness and efficiency, rather than in-process quality status and improvement opportunities
• Conducted by external people rather than project members

Audit and Assessment
• Audit
– IEEE
• An independent examination of a work product or a set of work products to assess compliance with specifications, standards, contractual agreements and other criteria

– ISO
• Certification, third party assessment is carried out by an independent organization against a particular standard

• Outcome
– In compliance or not in compliance
– Pass or fail

Assessment
• A review of a software organization to advise its management and professionals on how they can improve their operation
• Assess maturity level vs. process roadmap
• Identify current practices
• Identify strengths / weaknesses
• Find out causes of poor quality

DoD’s mechanisms
• Evaluates a third-party vendor
• Audits them rather than assessing them
• Done by a second party
• Assessments are of 3 types
– First party
– Second party
– Third party

Assessment issues
• Organizational assessments
– The sample of projects might lead to different assessments
– The organization of the company could be a factor

Assessment issues
• Project level assessments do not suffer from the above issues
– Meaningful factors must be included
– Degree of implementation and effectiveness must be measured
– Exploratory and in-depth probing is a characteristic of this assessment
– Independent assessments must be made to be objective

• Both these assessments should be complementary

Software Process Assessment Cycle
• Select a team
• Appraise the team
• Expand on the assessment areas
• Do a site visit
• Find strengths and weaknesses
• Create a KPA profile and present it to the appropriate audience

Rules
• The team should be led by a Lead Assessor
• The team should have 4–10 members
• At least one team member from the organization
• Must complete the CMM-Based Appraisal for Internal Process Improvement training
• Must use
– Standard questionnaire
– Individual and group interviews
– Document reviews
– Feedback from the review of draft findings

Standard CMMI Assessment Method for Process Improvement
• Plan and preparation phase
– Identify the assessment scope
– Develop the assessment plan
– Prepare and train the assessment team
– Brief the assessment participants
– Administer the CMMI appraisal questionnaire
– Examine the questionnaire responses
– Conduct an initial document review

SCAMPI
• Online assessment phase
– Conduct an opening meeting
– Conduct interviews
– Consolidate information
– Prepare the presentation of draft findings
– Present draft findings
– Consolidate, rate and prepare final findings

• Reporting
– Present final findings
– Conduct an executive session
– Wrap up the assessment

Zahran’s Generic Process

Proposed SPA Method
• It is project based
• Fact gathering
– Project methodology review
– State-of-practice review
– Questionnaire completion and response validation

• Questionnaire customization
– No standardized questionnaire

• Observation analysis and recommendations go hand in hand with development
• Direct input is given to the project teams

Sample questionnaire
• What is the most common form of design reviews for this project?
– On maturity

• To what extent were design reviews of the project conducted?
– On project process

Successful Projects
• Effective
– planning, cost estimating, measurements, milestones tracking, quality control, change management, development processes, communication, project managers, technical personnel, specialists, substantial amount of reusable materials

• Failing Projects
– None of the above

Example finding and recommendation

Assessment report
• Project information and basic project data
• Assessment approach
• Brief description and observation of project team practices
• Strengths and weaknesses, gap analysis
• Critical success factors and major project pitfalls
• What would they do differently?
• Recommendations

Question
• How is it different from an audit report?

Sample project taken for assessment

Improvement plans for the project based on assessment

Project Y

Project Z

Summary

Dos and Don’ts of Software Process improvement

SCAMPI Model Advantages

Measuring Process capability
• Levels 0–5 are used
• Is this OK?

Note
• Measuring levels is not enough
– The SEPG implements practices in the project, but the organization does not reach level x
– The organization has reached level x, but most of the projects are not following the practices

Note
• Alignment principle
– Faster, Cheaper or better – pick one from SEPG

• Take time getting faster
– Be reliable rather than deliver on time

Notes
• Keep it simple
• Measure the value of process improvement
• Measure process adoption

Notes
• Measuring process compliance

• Celebrate the Journey Not the Destination

Summary

Thank you
Questions?

Defect Removal Effectiveness

Importance of this topic
Defect prevention in standards
Statistical methods
Quality improvement
Affects schedules
Reduction in development times
A measure for the effectiveness of the defect removal process

Growth
Earlier days
The only defect removal method was testing

Later
Formal reviews
Inspections

Initial formula for error detection efficiency
Error detection efficiency = (errors found by an inspection / total errors in the product) * 100%

Analysis
If it is > 100 % then the inspection has found too many errors or the product was not tested properly before inspection.

What should be done to improve DRE?
Code / design inspection
Requirements analysis and inspection

The IBM Houston center (NASA work) has demonstrated the relationship between DRE and product quality

Other Definition of DRE
DRE = (number of defects found in a phase / (number of defects found in that phase + number of defects found in subsequent phases)) * 100%

Total defect containment effectiveness (TDCE) = number of pre-release defects / (number of pre-release defects + number of post-release defects)

Phase containment effectiveness (PCE) for phase i = number of phase i errors (found in the development phase) / (number of phase i errors + number of phase i defects found later)
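
A minimal sketch of the DRE and TDCE formulas above; the defect counts per phase are illustrative assumptions.

phases = ["design", "coding", "testing"]
found = {"design": 40, "coding": 70, "testing": 35}   # defects found per phase
post_release = 5

total = sum(found.values()) + post_release

# Phase DRE: defects found in the phase over those found there plus later
remaining = total
for phase in phases:
    dre = 100 * found[phase] / remaining
    print(f"{phase}: DRE = {dre:.0f}%")
    remaining -= found[phase]

tdce = 100 * (total - post_release) / total
print(f"TDCE = {tdce:.1f}%")   # 96.7% pre-release containment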

DRE formal development
Phase             | Defects injected by            | Defects removed by
------------------+--------------------------------+----------------------------------
Requirements      | Requirements / functional spec | Requirements analysis and review
High-level design | Design work                    | High-level design inspection
Low-level design  | Design work                    | LLD inspection
Coding            | Coding                         | Code inspection
Integration       | Integration                    | Build verification testing
Unit testing      | Bad fixes                      | Testing
Component testing | Bad fixes                      | Testing
System testing    | Bad fixes                      | Testing

Process
Defects existing at step entry
+ Defects injected during the step
– Defects removed (defects detected, of which some are fixed and some are incorrectly fixed; defects not detected remain)
= Defects remaining at step exit

Defect Origin vs. place found

[Origin-vs-found matrix not recoverable from this copy: rows listed the phase where each defect originated (Requirements, HLD, LLD, Code, I0, I1, I2, ...), columns the phase where it was found (UT, CT, ST, Customer/Field), with totals per row and column; only fragments such as 49 and the overall total 681 survive.]

Use of the matrix
Phase-wise DRE (phase defect removal effectiveness) can be computed from the matrix, as can:
Overall inspection effectiveness
Test effectiveness
Requirements effectiveness
DRE for development, etc.

DRE and Quality Planning
Use control charts for quality planning; this can be done for each phase (see the + and – in the figure).
Use the phase-based defect removal model to address questions like: if we increase the effectiveness of unit testing by 5%, what is the gain in the final quality of the product? (A small computational sketch follows.)
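The "what if" question above can be explored with a simple defect-flow sketch. The phase injection counts and effectiveness values below are invented for the example; the structure, not the numbers, is the point.

# Phase-based defect removal model (illustrative numbers).
# Each phase removes a fraction of the defects present when it runs;
# the remainder escape to the next phase.
phases = [
    # (name, defects injected during the phase, removal effectiveness)
    ("Requirements", 100, 0.60),
    ("Design",       250, 0.65),
    ("Coding",       450, 0.70),
    ("Unit test",      0, 0.50),
    ("System test",    0, 0.60),
]

def field_defects(phases):
    remaining = 0.0
    for _name, injected, effectiveness in phases:
        remaining = (remaining + injected) * (1.0 - effectiveness)
    return remaining

baseline = field_defects(phases)
improved = list(phases)
improved[3] = ("Unit test", 0, 0.55)  # unit testing 5 points more effective
print(round(baseline, 1), round(field_defects(improved), 1))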

Cost Effectiveness
Defect removed at initial phases are less expensive than the ones removed at the end So inverse of the absolute values in the matrix can be taken as weightage!!! Inspections can be modified to improve effectiveness
Having more than one inspection

DRE and Process Maturity
Level 1 = 85%
Level 2 = 89%
Level 3 = 91%
Level 4 = 93%
Level 5 = 95%

L4 cumulative percentages of Defects removed by phase
Requirements = 94%
HLD = 95%
LLD = 96%
Coding = 94%
IT = 75%
ST = 70%
AT = 70%

Questions?

In Process Software Testing Metrics

Agenda
• To discuss metrics that have proven to be effective in the industry (IBM)
• For each metric: purpose, data, interpretation, use and a real-life example

Test Progress S Curve

• Tests planned for the week
• Tests attempted for the week
• Actual tests run for the week
(All three are tracked cumulatively week by week; see the sketch below.)
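A minimal sketch of how the cumulative S-curve data can be produced from weekly counts; the counts below are invented for illustration.

# Weekly test-progress counts (illustrative), accumulated into the
# percentages that are plotted as the S curve.
planned   = [20, 40, 80, 120, 90, 50, 20]
attempted = [15, 35, 70, 110, 95, 55, 25]
actual    = [12, 30, 65, 100, 90, 50, 22]

def cumulative_pct(weekly_counts, total_planned):
    out, running = [], 0
    for count in weekly_counts:
        running += count
        out.append(round(100.0 * running / total_planned, 1))
    return out

total = sum(planned)
print("planned  ", cumulative_pct(planned, total))
print("attempted", cumulative_pct(attempted, total))
print("actual   ", cumulative_pct(actual, total))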

Scores to test cases
• Test cases are assigned scores depending on their criticality
• Every week you need to run x test cases and score over y points for tracking
• The total score is calculated as SUM(yi), where the summation is over all the test cases run (see the sketch below)
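A minimal sketch of score-based tracking; the criticality-to-points mapping and the test-case list are hypothetical, invented for the example.

# Hypothetical criticality-to-points mapping.
POINTS = {"critical": 5, "high": 3, "medium": 2, "low": 1}

# Test cases executed this week: (id, criticality).
executed = [("TC-101", "critical"), ("TC-102", "high"),
            ("TC-205", "medium"), ("TC-310", "low")]

weekly_score = sum(POINTS[criticality] for _tc, criticality in executed)
print(len(executed), "test cases run,", weekly_score, "points scored")
# Compare against the weekly targets: run >= x test cases, score > y points.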

Weekly activities
• Test coverage weighing
• Test score
• Graph tracking
• Find the disparity between planned and actual testing and look at schedule disparity (e.g. > 15% schedule slip by a week)

Problems with S-Curve approach
• Module-wise details are missing
• A steep S curve means more test cases in a shorter period of time
• All the questions regarding the test cases must be reviewed
• The curve might shift to the right, indicating project delay
• What are the solutions for the above?

Other advantages
• Overlaying the S curves of two consecutive releases gives more information on front-end / back-end testing

Testing Defect Arrivals over Time
• Discuss the following test arrival rate

Testing Defect Arrivals
• Discuss the following

TDAR
• Discuss the following

Extrapolating Defect Arrival Pattern
• How to predict the defect volume
• Lots of reliability models are available

Defect Backlog Metric
• Discuss

Product Size over time

• Discuss the various boxes
• Discuss the trend

CPU Utilization
• Discuss this metric

System Crashes and Hangs
• What does this tell?

In process metric and quality management
• Use calendar time
• The ship date should be the reference for the X axis, and the metrics need to be tracked every week
• Metrics should indicate good or bad status
• Bad status for a metric should force management action
• Metrics should drive process improvements

The Effort Outcome model

• Cell 2 – best-case scenario
– Good quality design / code
– Low error injection
– Effective testing

• Cell 1 – good / not-bad scenario
– Latent defects found via effective testing

• Cell 3 – worst-case scenario
– Buggy code and probably problematic designs
– High error injection during the development process

• Cell 4 – unsure scenario
– Cannot ascertain whether the lower defect rate is a result of good code quality or ineffective testing

Merging S Curve and E/O Model
• Cell 2
– Defect arrival lower, S curve ahead or same as baseline

• Cell 1
– Defect arrival higher, S curve ahead or same as baseline

• Cell 3
– Late defect arrivals, S curve behind

• Cell 4
– S curve behind, defect arrival lower in early part of the curve

Metrics for Vendor developed software
• % of test cases attempted
• % of defects per executed test case
• Number of failing test cases without defect records
• Success rate, persistent failure rate
• Defect injection rate, code completeness

When is your product good to ship?

Thanks / Questions?

Complexity Metrics and Models

Need
• White box testing
• More granular measurement
• Done at the program-module level
• Helps software engineers improve the quality of their work

Models and usage
• Reliability models
– Researchers and software reliability practitioners – Need mathematics and statistics

• Quality Management models
– Software quality professionals and project managers

• Software Complexity Research
– Computer scientists and experimental software engineers

Most familiar LoC
• Originated from assembly language
• An inverse relationship between defect rate and module size (LoC) holds up to about 500 lines of code
– The reason is interface errors

• For larger modules, concave models seem to fit better and the tail rises
– The difficulty of comprehending a module grows with its size

• Optimum program size for each language / project can be found

Halstead’s software science
• Given
– n1 = number of distinct operators in a program
– n2 = number of distinct operands in a program
– N1 = total number of operator occurrences
– N2 = total number of operand occurrences
– V* = minimum volume, represented by a built-in function performing the task of the entire program
– S* = mean number of mental discriminations (decisions) between errors

• Vocabulary: n = n1 + n2
• Length: N = N1 + N2; estimated length N^ = n1 log2 n1 + n2 log2 n2
• Volume: V = N log2 n = N log2 (n1 + n2)
• Level: L = V* / V; estimated level L^ = (2 / n1) × (n2 / N2)
• Difficulty: D = V / V* = 1 / L = (n1 / 2) × (N2 / n2)
• Effort: E = V / L
• Faults: B = V / S*
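A minimal sketch of the formulas above, using base-2 logarithms as Halstead defined them. The operator/operand counts in the example are invented, and the S* value of 3000 is the commonly cited constant, used here as an assumption.

import math

def halstead(n1, n2, N1, N2, S_star=3000.0):
    n = n1 + n2                                       # vocabulary
    N = N1 + N2                                       # observed length
    N_hat = n1 * math.log2(n1) + n2 * math.log2(n2)   # estimated length
    V = N * math.log2(n)                              # volume
    L_hat = (2.0 / n1) * (n2 / N2)                    # estimated level
    D = 1.0 / L_hat                                   # difficulty
    E = V / L_hat                                     # effort
    B = V / S_star                                    # estimated faults
    return n, N, N_hat, V, L_hat, D, E, B

# Invented counts for a small module: 10 distinct operators, 15 distinct
# operands, 60 operator occurrences, 55 operand occurrences.
for label, value in zip("n N N^ V L^ D E B".split(), halstead(10, 15, 60, 55)):
    print(label, round(value, 2))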

Cyclomatic complexity
• Indicates a program's testability and understandability
• Equals the number of regions in the program's flow graph
• Gives the number of linearly independent paths through the program
• Indicates the effort required to test the program
• M = V(G) = e – n + 2p, where
– V(G) = cyclomatic number of G
– e = number of edges
– n = number of nodes
– p = number of unconnected parts of the graph
• Also equal to the number of binary decisions (if) + 1 (see the sketch below)
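A minimal sketch computing M = e – n + 2p from an edge list; the tiny flow graph is invented for illustration.

def cyclomatic(edges):
    # M = e - n + 2p for an undirected view of the control-flow graph
    nodes = {v for edge in edges for v in edge}
    adjacency = {v: set() for v in nodes}
    for a, b in edges:
        adjacency[a].add(b)
        adjacency[b].add(a)
    seen, p = set(), 0          # p = number of unconnected parts
    for start in nodes:
        if start not in seen:
            p += 1
            stack = [start]
            while stack:
                node = stack.pop()
                if node not in seen:
                    seen.add(node)
                    stack.extend(adjacency[node])
    return len(edges) - len(nodes) + 2 * p

# A function with a single if/else: one binary decision, so M should be 2.
edges = [("entry", "if"), ("if", "then"), ("if", "else"),
         ("then", "exit"), ("else", "exit")]
print(cyclomatic(edges))  # 5 - 5 + 2*1 = 2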

Recommendation
• Cyclomatic complexity should be less than 10
• It has a moderate to strong correlation with defect rates

Uses
• To help identify overly complex parts needing detailed inspections
• To help identify non-complex parts likely to have a low defect rate, and therefore candidates for development without detailed inspection
• To estimate programming and service effort, identify troublesome code and estimate testing effort

Syntactic Constructs
• Cyclomatic complexity takes into account only if-then-else decisions
• Studies have also examined other syntactic constructs
• Programmers found the “do-while” construct difficult to understand

Structure metrics
• Takes into account the interaction between the modules of a product • Most important
– Fan-in = count of the modules that call a given module
– Fan-out = count of the modules that are called by a given module

• Fan-in shows an insignificant correlation with defect levels
• Fan-out shows a positive correlation with defects
(A small sketch of computing both from a call graph follows.)
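A minimal sketch, assuming the static call graph is available as a mapping from each module to the modules it calls; the graph below is hypothetical.

from collections import Counter

# Hypothetical call graph: module -> modules it calls.
calls = {
    "ui":      ["logic", "logging"],
    "logic":   ["db", "logging"],
    "db":      ["logging"],
    "logging": [],
}

fan_out = {module: len(set(callees)) for module, callees in calls.items()}
fan_in = Counter(callee for callees in calls.values() for callee in set(callees))

for module in calls:
    print(module, "fan-in:", fan_in.get(module, 0), "fan-out:", fan_out[module])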

Summary
• Two types of metrics
– Syntactic constructs – Structure

• Measure both
• Aimed at software engineers
• Other types of metrics
– Reliability – Project related

Measuring and Analyzing Customer Satisfaction

Need
• TQM point of view
– Integrates product quality with customer satisfaction

• Enhancing CS is the bottom line for any business
• Customer focus is the only way to retain the business
• News of a dissatisfied customer travels twice as fast as news of a satisfied one

Customer Satisfaction Surveys
• Source
– Telephone follow-up
– Customer complaint data
– Direct customer visits
– Customer advisory councils
– User conferences

• Must cover the entire customer base

Customer Satisfaction Surveys
• Methods
– Face-to-face interviews
• Prestructured questionnaire
• The main limitation is the cost factor concerning the interviewer
• Biases can be introduced into the data by a wrong interviewer; the interviewer should be unbiased

– Telephone interviews
• Should be short and impersonal
• Can be monitored for improvements
• Lack of direct interaction; exhibits cannot be used
• Limited group of respondents

Customer Satisfaction Surveys
– mailed questionnaires
• Less expensive
• Lower response rates
• May introduce bias
• More analysis needed
• Questions must be pre-tested

• Summarizing order
– In person – phone – mail

Sampling Methods
• Use scientific probability sampling methods
– simple random sampling
• Every individual can be selected with equal probability

– systematic sampling
• Take every kth individual

– stratified sampling
• Group into non-overlapping “strata” and sample within each stratum

– cluster sampling
• Group into clusters such as geographical unit

Sample Size
• Sample size depends on
– confidence level – margin of error

• Statistical formulas are available for determining sample size (a minimal sketch follows)
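One such formula, for estimating a proportion on a survey question, is n = z² · p(1−p) / e². A minimal sketch, using the conservative worst case p = 0.5 and standard normal quantiles:

import math

Z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}  # standard normal quantiles

def sample_size(confidence, margin_of_error, p=0.5):
    z = Z[confidence]
    return math.ceil(z * z * p * (1.0 - p) / margin_of_error ** 2)

print(sample_size(0.95, 0.05))  # about 385 respondents
print(sample_size(0.99, 0.03))  # about 1844 respondents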

Analyzing Satisfaction Data
• 5-point scale
• Confidence intervals can also be part of the bar or run charts that show the scale
• Some choose customer dissatisfaction as a metric

Analyzing Satisfaction Data
• Find specific attribute satisfaction and Overall satisfaction
– The area of weakness need not be the priority for improvement
– E.g. documentation vs. reliability

• Correlate between attributes and overall satisfaction
– For example, a 93.8% reliability satisfaction figure with a high correlation to overall CS (see the sketch below)
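A minimal sketch of correlating attribute scores with overall satisfaction, using invented 5-point responses (statistics.correlation needs Python 3.10+):

import statistics

# Invented 5-point-scale responses from eight customers.
reliability   = [4, 5, 3, 4, 5, 2, 4, 5]
documentation = [3, 2, 4, 3, 2, 3, 3, 2]
overall       = [4, 5, 3, 4, 5, 2, 4, 4]

# A high correlation marks the attribute worth prioritizing, even if its
# raw satisfaction score is not the lowest.
print(statistics.correlation(reliability, overall))
print(statistics.correlation(documentation, overall))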

Satisfaction with company
• Overall satisfaction and Loyalty of a customer has been attributed to
– set of common attributes of the company – satisfaction levels of dimensions of the entire company

• Company and Product = Customer satisfaction • Examples

Example
• Technical solutions
– R, A, Ease of use, pricing, installation

• Support & Service
– Flexible, accessible, product knowledge

• Marketing
– Solution, central point of contact, information

• Administration
– purchasing, billing, warranty expiration, notification

• Delivery
– on time, accurate, post delivery process

• Company image
– technology leader, financial stability, executives image

How much CS is good?
• Long-term goal = 100%
• There is a tradeoff between customer satisfaction and market share
• Babich formula

So what do you do?
• Also monitor your competitors' CS
• Perform analyses on specific satisfaction dimensions, quality attributes of products, strengths, weaknesses, prioritization
• Do RCA to identify inhibitors in each dimension
• Set satisfaction targets
• Formulate and implement action plans

Other than this do
• Post-purchase call-back
• Complaint management process

Summary
• Various methods for measuring CS
• Survey methods
• Sampling
• Analyses
• Beyond customer satisfaction

In process quality assessment

Need
• Verify that your quality program is on track to satisfy the objectives
• Find out the current and future risks on the project
• Meet the quality expectations of customers
• Method
– Preparation, evaluation, summarization and recommendations

Assessment and Audit
• Audit
– Compares actual process with the defined process

• Assessment
– What is occurring?
– What is the difference between the right results and the results obtained?
– What is the likely result?

In process assessment
• Should be integral to project management • Example

Preparation
• Depends on the development process
– Expect fewer metrics to be available during the initial phases
– Concentrate on the process steps being followed during the initial phases

Example

Caution
• Also look at qualitative data
• Find out who you want to talk to
• Find out what you want to talk about
• Get a variety of people, including developers, testers etc.
• Frame questions
– Where are we? – What’s the outlook etc?

Evaluation
• Quantitative data
– 7 tools – Get useful information

• Pay attention to things that are abnormal • Example

Evaluation
• Qualitative data
– Use metrics

• Use expert opinion • Criteria
– Use expert judgment and cross-validate your judgment
– Use dashboards (Green, Yellow, Red)

Summarization
• Summarize using
– Impact ranking
– Comparison with previously known issues
– Identify what is done right

• Example

Overall Assessment Summarization
• Overall assessment is made the bottom line
– Red – high probability of not meeting the quality goals or customer quality expectations
– Yellow – moderate risk of not meeting the quality goals or customer quality expectations
– Green – likely to meet the product quality goals and satisfy customer quality expectations

Desirable vs. undesirable scenario
• With every audit you move from Red to Green and not vice versa
[Chart: status (R / Y / G) plotted across successive assessments QA1–QA8, trending from Red toward Green.]

Recommendation
• Recommendations are part of the assessment summary
• There is no magic formula
• Various questions can be answered based on the investigation findings

Risk Mitigation

Summary
• What is an in-process quality assessment?
• What makes a good assessment?
• Store assessment records
• Record lessons learned
• The assessment depends on the assessor
• Manage risks based on assessments

Statistical Quality Control

Quality and Variability
• Variability is the enemy of quality
• Variability can be controlled using
– Inspection
– Prevention
• Use statistical process control

SPC

• Two types of variability
– Random variation
– Systematic variation

Mean Control Charts
• The center line is the control line
• Lower Control Limit (LCL)
• Upper Control Limit (UCL)
• Process statistics must fall between the LCL and UCL
• The process is in control if the values fall between the LCL and UCL and vary randomly
• Any pattern implies the process may be out of control
Using LCL and UCL

• Once the statistics show that the process is out of control, use the fishbone diagram or other means to find the root cause and bring the process back into control

Using Mean Control Charts

• Calculate the grand mean from the data
– Grand mean = (sum of all observations) / (number of samples × number of observations in each sample)

• LCL = grand mean − (3 × mean range) / (d2 × sqrt(n))
• UCL = grand mean + (3 × mean range) / (d2 × sqrt(n))

• d2 is given in a table for each sample size n

Using Mean Control Charts
• Calculate the grand mean
• Calculate the mean range
• Look up the value of d2
• Compute the LCL and UCL
• Study the pattern of the control chart
• Draw conclusions
(A minimal computational sketch of these steps follows.)
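A minimal sketch of these steps, assuming samples of size n = 4; the measurements are invented, and d2 = 2.059 is the standard table value for that sample size.

samples = [
    [10.1, 10.3,  9.8, 10.0],
    [10.2,  9.9, 10.1, 10.4],
    [ 9.7, 10.0, 10.2,  9.9],
    [10.3, 10.1,  9.8, 10.2],
]
n = len(samples[0])
d2 = 2.059  # table value for sample size 4

grand_mean = sum(sum(s) for s in samples) / (len(samples) * n)
mean_range = sum(max(s) - min(s) for s in samples) / len(samples)

half_width = 3.0 * mean_range / (d2 * n ** 0.5)
lcl, ucl = grand_mean - half_width, grand_mean + half_width
print(round(lcl, 3), round(grand_mean, 3), round(ucl, 3))

for i, s in enumerate(samples, start=1):   # study the pattern
    x_bar = sum(s) / n
    if not lcl <= x_bar <= ucl:
        print("sample", i, "is out of control:", x_bar)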

Decision Tree Analysis
• Define the problem in structured terms
• Model the decision process as a decision tree
• Apply the appropriate probability values and financial data
• Solve the decision tree
• Perform sensitivity analysis
• List the underlying assumptions
(A minimal sketch of the tree-solving step follows.)
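A minimal sketch of the "solve the decision tree" step: chance nodes roll up to an expected monetary value and decision nodes pick the best branch. The probabilities and payoffs are invented.

# A node is ("decision", [(label, child), ...]),
#           ("chance", [(probability, child), ...]), or a numeric payoff.
def solve(node):
    if isinstance(node, (int, float)):
        return node
    kind, branches = node
    if kind == "chance":
        return sum(p * solve(child) for p, child in branches)
    return max(solve(child) for _label, child in branches)  # decision node

tree = ("decision", [
    ("build in-house", ("chance", [(0.6, 120_000), (0.4, -30_000)])),
    ("buy a package",  ("chance", [(0.9,  50_000), (0.1, -10_000)])),
])
print(solve(tree))  # expected monetary value of the best choice: 60000.0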

Thank you

Software Quality Model Requirements for Software Quality Engineering
Marc-Alexis Côté M. Ing ([email protected]) Witold Suryn PhD ([email protected]) Elli Georgiadou ([email protected])

Keywords

Software Quality Engineering, Software Quality Models, ISO/IEC 9126.
Abstract

Software Quality Engineering is an emerging discipline that is concerned with improving the approach to software quality. It is important that this discipline be firmly rooted in a quality model satisfying its needs. In order to define the needs of this discipline, the meaning of quality is broadly defined by reviewing the literature on the subject. Software Quality Engineering needs a quality model that is usable throughout the software lifecycle and that embraces all the perspectives of quality. The goal of this paper is to propose a quality model suitable for such a purpose, through the comparative evaluation of existing quality models and their respective support for Software Quality Engineering.
Introduction

Over the last decade, the general focus of the software industry has shifted from providing ever more functionality to improving what has been coined as the user experience. The user experience refers to characteristics such as ease-of-use, security, stability and reliability. Improvements in such areas lead to an improved quality as perceived by the end users. Some software products, most notably Microsoft's next iteration of their Windows operating system, have been delayed by as much as two years in order to improve their quality. There is no doubt that software quality is becoming an increasingly important subject in software engineering. Traditionally, software requirements have been classified either as functional or nonfunctional with eventual notions of quality hidden in the latter. As the industry focus is shifting from functionality to improving quality, a new category of requirements focused on quality is emerging. In order to define these new quality requirements, quality itself must be defined. A quality model provides the framework towards a definition of quality. Engineers have long recognised that in order for something to find its way in a product, it should be properly defined and specified. Unfortunately, the push towards software quality that can be observed in the industry today is lacking a solid foundation in the form of an agreed upon quality model that can be used not only to evaluate software quality, but also to specify it. Bourque (2000) suggests that the implementation of quality in a software product is an effort that should be formally managed throughout the Software Engineering lifecycle. The implementation of quality should therefore begin with the specification of user quality requirements. Such an approach to the implementation of quality leads to Software Quality Engineering. Suryn (2003) has suggested that this discipline be defined as the application of a
continuous, systematic, disciplined, quantifiable approach to the development and maintenance of quality of software products and systems; that is, the application of quality engineering to software. The objective of this paper is to identify the requirements for a software quality model to be used as a foundation to Software Quality Engineering.
Definition of Software Quality

What exactly constitutes the quality of a product is often the subject of a hot debate. The reason the concept of quality is so controversial is that people fail to agree on what it means. For some it is “[the] degree to which a set of inherent characteristics fulfills requirements” (ISO/IEC 1999b), while for others it can be synonymous with “customer value” (Highsmith, 2002), or even “defect levels” (Highsmith, 2002). A possible explanation as to why any of these definitions fail to garner a consensus is that they generally fail to recognize the different perspectives of quality. Kitchenham and Pfleeger (1996), reporting the teachings of David Garvin, describe the 5 different perspectives of quality:

• The transcendental perspective deals with the metaphysical aspect of quality. In this view of quality, it is “something toward which we strive as an ideal, but may never implement completely” (Kitchenham & Pfleeger, 1996);
• The user perspective is concerned with the appropriateness of the product for a given context of use. Kitchenham and Pfleeger further note that “whereas the transcendental view is ethereal, the user view is more concrete, grounded in the product characteristics that meet user's needs”;
• The manufacturing perspective represents quality as conformance to requirements. This aspect of quality is stressed by standards such as ISO 9001, which defines quality as “[the] degree to which a set of inherent characteristics fulfills requirements” (ISO/IEC 1999b). Other models, like the Capability Maturity Model (CMM), state that the quality of a product is directly related to the quality of the engineering process, thus emphasising the need for a manufacturing-like process;
• The product perspective implies that quality can be appreciated by measuring the inherent characteristics of the product. Such an approach often leads to a bottom-up approach to software quality: by measuring some attributes of the different components composing a software product, a conclusion can be drawn as to the quality of the end product;
• The final perspective of quality is value-based. This perspective recognises that the different perspectives of quality may have a different importance, or value, to various stakeholders.
One could argue that in a world where conformance to ISO and IEEE standards is increasingly present in contractual agreements and used as a marketing tool (Adey & Hill, 2000), all the perspectives of quality are subordinate to the manufacturing view. This importance of the manufacturing perspective has increased throughout the years through works like Quality is Free (Crosby, 1979) and the popularity of movements like Six-Sigma (Biehl, 2001). The predominance of the manufacturing view in Software Engineering can be traced back to the 1960s, when the US Department of Defense and IBM gave birth to Software Quality Assurance (Voas, 2003). This has led to the belief that adherence to a development process, as in manufacturing, will lead to a quality product. The corollary to this belief is that process improvement will lead to improved product quality. According to many renowned researchers, this belief is false, or at least flawed. Geoff Dromey states:

“The flaw in this approach [that you need a quality process to produce a quality product] is that the emphasis on process usually comes at the expense of constructing, refining, and using adequate product quality models.” (Dromey, 1996)

Kitchenham and Pfleeger reinforce this opinion by stating: “There is little evidence that conformance to process standards guarantees good products. In fact, the critics of this view suggest that process standards guarantee only uniformity of output [...]” (Kitchenham & Pfleeger, 1996)

Furthermore, data available from Agile (Highsmith, 2002) projects show that high quality is attainable without following a manufacturing-like approach. However, recent studies conducted at Motorola (Eickelman, 2003; Diaz & Sligo, 1997) and Raytheon (Haley, 1996) show that there is indeed a correlation between the maturity level of an organization as measured by the Capability Maturity Model and the quality of the resulting product. These studies provide data on how a higher maturity level (as measured by the CMM) can lead to:

• Improved error/defect density (i.e. the error/defect density lowers as maturity improves)
• Lower error rate
• Lower cycle time (time to complete parts of the lifecycle)
• Better estimation capability

From these results, one could conclude that quality can be improved by following a mature process. Georgiadou (2003a) studied the development of lifecycle models, and established that the maturity of the development process is reflected by the emphasis and location of testing and other quality assurance activities. Her study demonstrated that the more mature the process and its underlying lifecycle model, the earlier the identification of errors in the deliverables. However, these measured improvements are directly related to the manufacturing perspective of quality. Therefore, such quality improvement efforts fail to address the other perspectives of quality. This might be one of the reasons that some observers of the software development scene perceive the “quality problem” as one of the main failings of the software engineering industry. Furthermore, studies show that improvement efforts grounded in the manufacturing perspective of quality are difficult to scale down to smaller projects and/or smaller teams (Laitinen, 2000; Boddie, 2000). Indeed, rather than being scaled down in smaller projects, these practices are simply not performed. Over recent years, researchers have proposed new models that try to encompass more perspectives of quality than just the manufacturing view. Geoff Dromey (1995; 1996) proposed such a model, one in which the quality of the end product is directly related to the quality of the artifacts that are a by-product of the process being followed. Therefore, he developed different models that can be used to evaluate the quality of the requirements model, the design model and the resulting software. The reasoning is that if quality artifacts are conceived and produced throughout the lifecycle, then the end product will manifest attributes of good quality. This approach can clearly be linked to the product perspective of quality with elements from the manufacturing view. This is certainly a step forward from the manufacturing-only approach described above, but it fails to view the engineering of quality as a process that covers all the perspectives of quality. Pfleeger (2001) warns against approaches that focus only on the product perspective of quality:

“This view [the product view] is the one often advocated by software metrics experts; they assume that good internal quality indicators will lead to good external ones, such as reliability and maintainability. However, more research is needed to verify these assumptions and to determine which aspects of quality affect the actual product's use.”

Georgiadou (2003b) developed a generic, customisable quality model (GEQUAMO) which enables any stakeholder to construct their own model depending on their requirements. In a further attempt to differentiate between stakeholders, Siaka et al. (1997) studied the viewpoints of users, sponsors and developers as three important constituencies/stakeholders, and suggested attributes of interest to each constituency as well as the level of interest. More recently, Siaka and Georgiadou (2005) reported the results of a survey amongst practitioners (from the UK, Greece, Egypt and Cyprus) on the importance placed on product quality characteristics. Using their empirical results they extended ISO 9126 by adding two new characteristics, namely Extensibility and Security, which have gained in importance in today’s global and inter-connected environment. The above observations illustrate the main disagreements that exist in both the research community and the industry on the subject of software quality. The goal of a quality model is in essence to provide an operational definition of quality. While specific definitions have been established for given contexts, there is no consensus as to what constitutes quality in the general sense in software engineering. A first requirement for a software quality model to be useful as a foundation for Software Quality Engineering is thus to encompass all the perspectives of quality mentioned at the beginning of this section.
Specification and evaluation of quality

Software Quality Engineering calls for a formal management of quality throughout the lifecycle. In order to support this requirement, a quality model should have the ability to support both the definition of quality requirements and their subsequent evaluation. This can be explained by referring to the manufacturing perspective of quality, which states that quality is conformance to requirements. A quality model that is to be used as the foundation for the definition of quality requirements should help in both the specification of quality requirements and the evaluation of software quality. IEEE Std 1061-1998 (IEEE, 1998) defines this as a top to bottom and bottom to top approach to quality.

From a top down perspective the [quality] framework facilitates:

• Establishment of quality requirements factors, by customers and managers early in a system's life cycle;
• Communication of the established quality factors, in terms of quality sub-factors, to the technical personnel;
• Identification of measures that are related to the established quality factors and quality sub-factors. (In 2002, the ISO/IEC JTC1 sub-committee SC7 – Systems and Software Engineering – replaced the term “metric” by “measure” to align its vocabulary with the one used in metrology. This paper uses the term measure whenever possible.)

From a bottom up perspective the [quality] framework enables the managerial and technical personnel to obtain feedback by:

• Evaluating the software products and processes at the metrics level;
• Analysing the metric values to estimate and assess the quality factors.

In other words, a quality model should be usable from the top of the development process to the bottom, and from the bottom to the top.
Evaluation of quality models

Three requirements that a quality model should possess to be a foundation for Software Quality Engineering have been identified:

• A quality model should support the 5 different perspectives of quality as defined by Kitchenham and Pfleeger (1996);
• A quality model should be usable from the top to the bottom of the lifecycle as defined by IEEE Std 1061-1998 (IEEE, 1998), i.e. it should allow for defining quality requirements and their further decomposition into appropriate quality characteristics, subcharacteristics and measures;
• A quality model should be usable from the bottom to the top of the lifecycle as defined by IEEE Std 1061-1998 (IEEE, 1998), i.e. it should allow for the required measurements and the subsequent aggregation and evaluation of obtained results.
Four quality models will be evaluated with respect to these requirements.
McCall

McCall (McCall, Richards & Walters, 1977) introduced his quality model in 1977. According to Pfleeger (2001), it was one of the first published quality models. Figure 1 presents this quality model. Each quality factor on the left hand side of the figure represents an aspect of quality that is not directly measurable. On the right hand side are the measurable properties that can be evaluated in order to quantify the quality in terms of the factors. McCall proposes a subjective grading scheme ranging from 0 (low) to 10 (high). Regarding this model, Pressman notes that “unfortunately, many of the metrics defined by McCall et al. can be measured only subjectively” (Pressman, 2001). It is therefore difficult to use this framework to set precise and specific quality requirements. Furthermore, some of the factors and measurable properties, like traceability and self-documentation among others, are not really definable or even meaningful at an early stage for non-technical stakeholders. This model is not applicable with respect to the criteria outlined in the IEEE Standard for a Software Quality Metrics Methodology for a top to bottom approach to quality engineering. Furthermore, it emphasises the product perspective of quality to the detriment of the other perspectives. It is therefore not suited as a foundation for Software Quality Engineering according to the stated premises.

Figure 1: McCall’s Quality Model Adapted from Pfleeger (2003) and McCall et al. (1977)
Boehm

Boehm's quality model improves upon the work of McCall and his colleagues (Boehm, Brown, Kaspar, Lipow & MacCleod, 1978). As Figure 2 shows, this quality model loosely retains the factor-measurable property arrangement. However, for Boehm and his colleagues, the prime characteristic of quality is what they define as “general utility”. According to Pfleeger (2001), this is an assertion that first and foremost, a software system must be useful to be considered a quality system. For Boehm, general utility is composed of as-is utility, maintainability and portability (Boehm et al., 1976):

• How well (easily, reliably, efficiently) can I use it [software system] as-is?
• How easy is it to maintain (understand, modify, and retest)?
• Can I still use it if I change my environment?

If the semantics of McCall's model are used as a reference, the quality factors could be defined as: Portability, Reliability, Efficiency, Human Engineering, Testability, Understandability and Modifiability. These factors can be decomposed into measurable properties such as Device Independence, Accuracy, Completeness, etc. Portability is somewhat incoherent in this classification as it acts both as a top level component of general utility, and as a factor that possesses measurable attributes.

Figure 2: Boehm’s quality model. Adapted from Pfleeger (2003), Boehm et al. (1976; 1978)

It is interesting to note that in opposition to McCall's model, Boehm's model is decomposed in a hierarchy that at the top addresses the concerns of end-users while the bottom is of interest to technically inclined personnel. It is in effect the emergence of the user perspective of quality. However, this interest wanes when one reads Boehm's definition of the characteristics of software quality. Except for General Utility and As-is Utility, all definitions begin with “Code possesses the characteristic [...]”. The measurable properties and characteristics therefore concentrate on highly technical details of quality that are difficult to grasp for non-technical stakeholders that are typically involved early in the software lifecycle. The characteristics General Utility and As-is Utility are too generic and imprecise to be useful for defining verifiable requirements. Like the McCall model, this model is mostly useful for a bottom to top approach to software quality (i.e. it can effectively be used to define measures of software quality, but is more difficult to use to specify quality requirements). While this model is a step forward in the sense that it provides basic support for a top to bottom approach to software quality, this support is too ephemeral to be considered as a solid foundation for quality engineering.
Dromey

Dromey's (1995) model takes a different approach to software quality then the two previously presented models. For Dromey, a quality model should clearly be based upon the product perspective of quality: “What must be recognized in any attempt to build a quality model is that software does not directly manifest quality attributes. Instead it exhibits product characteristic that imply or contribute to quality attributes and other characteristics (product defects) that detract from the quality attributes of a product. Most models of software quality fail to deal with the product characteristics side of the problem adequately and they also fail to make the
direct links between quality attributes and corresponding product characteristics.” (Dromey, 1995) (Emphasis added to support the argument)

Dromey has built a quality evaluation framework that analyzes the quality of software components through the measurement of tangible quality properties (Figure 3). Each artifact produced in the software lifecycle can be associated with a quality evaluation model. Dromey gives the following examples of what he means by software components for each of the different models:

• Variables, functions, statements, etc. can be considered components of the implementation model;
• A requirement can be considered a component of the requirements model;
• A module can be considered a component of the design model;
• Etc.

According to Dromey (1995), these components all possess intrinsic properties that can be classified into four categories:

• Correctness: evaluates whether some basic principles are violated.
• Internal: measures how well a component has been deployed according to its intended use.
• Contextual: deals with the external influences by and on the use of a component.
• Descriptive: measures the descriptiveness of a component (for example, does it have a meaningful name?).

These properties are used to evaluate the quality of the components. This is illustrated in Figure 4 for a variable component present in the implementation model.

Figure 3 : Dromey’s Quality Model

Figure 4: Quality evaluation of a variable component

It seems obvious from the inspection of the previous figures that Dromey's model is focused on the minute details of quality. This is stated explicitly: “What we can do is identify and build in a consistent, harmonious, and complete set of product properties (such as modules without side effects) that result in manifestations of reliability and maintainability.” (Dromey, 1996) For Dromey, the high level characteristics of quality will manifest themselves if the components of the software product, from the individual requirements to the programming language variables, exhibit quality-carrying properties. Dromey's hypothesis should be questioned. If all the components of all the artifacts produced during the software lifecycle exhibit quality-carrying properties, will the resulting product manifest characteristics such as maintainability, functionality, and others? The following analogy will be useful in answering this question: If you buy the highest quality flour, along with the highest quality apples and the highest quality cinnamon, will you automatically produce an apple pie that is of the highest quality? The answer is obviously negative. In addition to quality ingredients, at least three more things are needed in order to produce an apple pie of the highest quality:

• A recipe (i.e. an overall architecture and an execution process). Dromey acknowledges this by identifying process maturity as a desirable high level characteristic. However, it is only briefly mentioned in both his publications on the subject (Dromey, 1995; Dromey, 1996).
• The consumer's tastes must be taken into account. In order for the result to be considered of the highest quality by the consumer, it needs to be tuned to his tastes. This is akin to what is commonly called user needs in software engineering. User needs are completely ignored by Dromey. However, as it was demonstrated in the introduction, they are an integral and non-negligible part of software quality.
• Someone with the qualifications and the tools to properly execute the recipe.

While Dromey's work is interesting from a technically inclined stakeholder's perspective, it is difficult to see how it could be used at the beginning of the lifecycle to determine user quality needs. Dromey (1995) states that software quality “must be considered in a systematic and structured way, from the tangible to the intangible”. By focusing too much on the tangible, Dromey fails to build a model that is meaningful for stakeholders typically involved at the beginning of the lifecycle. Do end users care about the variable naming convention or module coupling? In most cases, it is doubtful that this question can be answered affirmatively. Therefore, this model is rather unwieldy to specify user quality needs. This does not mean that it cannot be useful later on as a checklist for ensuring that product quality is up to standards. It can definitely be classified as a bottom to top approach to software quality. Furthermore, as was illustrated at the beginning of this section, this quality model has its roots in the product perspective of quality, to the detriment of other perspectives. Therefore, it fails to qualify as a foundation for Software Quality Engineering according to the established requirements.
ISO/IEC 9126

In 1991, the International Organization for Standardization introduced a standard named ISO/IEC 9126 (1991): Software product evaluation – Quality characteristics and guidelines for their use. This standard aimed to define a quality model for software and a set of guidelines for measuring the characteristics associated with it. ISO/IEC 9126 quickly gained notoriety with IT specialists in Europe as the best way to interpret and measure quality (Bazzana, Anderson & Jokela, 1993). However, Pfleeger (2001) reports some important problems associated with the first release of ISO/IEC 9126:

• There are no guidelines on how to provide an overall assessment of quality.
• There are no indications on how to perform the measurements of the quality characteristics.
• Rather than focusing on the user view of software, the model's characteristics reflect a developer’s view of software.

According to Pfleeger, this first incarnation of ISO/IEC 9126 is not usable as a bottom up approach to quality engineering, and even less usable as a top down approach. In order to address these concerns, an ISO committee began working on a revision of the standard. The results of this effort are the introduction of a revised version of ISO/IEC 9126 focusing on the quality model, and a new standard, ISO/IEC 14598 (ISO/IEC, 1999a), focusing on software product evaluation. ISO/IEC 14598 addresses Pfleeger's first concern while the revision to ISO/IEC 9126 aims to resolve the second and third issues. ISO/IEC 9126 is now a four part standard:

• ISO/IEC 9126-1 (ISO/IEC, 2001a) defines an updated quality model.
• ISO/IEC 9126-2 (ISO/IEC, 2003a) defines a set of external measures.
• ISO/IEC 9126-3 (ISO/IEC, 2003b) defines a set of internal measures.
• ISO/IEC 9126-4 (ISO/IEC, 2001b) defines a set of quality in use measures.

The new quality model defined in ISO/IEC 9126-1 recognises three aspects of software quality and defines them as follows (the full definition is given as it is pertinent to the discussion that ensues):

• Quality in Use: Quality in use is the user's view of the quality of the software product when it is used in a specific environment and a specific context of use. It measures the extent to which users can achieve their goals in a particular environment, rather than measuring the properties of the software itself. (ISO/IEC, 2001a)
• External quality: External quality is the totality of characteristics of the software product from an external view. It is the quality when the software is executed, which is typically measured and evaluated while testing in a simulated environment with simulated data using external metrics. During testing, most faults should be discovered and eliminated. However, some faults may still remain after testing. As it is difficult to correct the software architecture or other fundamental design aspects of the software, the fundamental design remains unchanged throughout the testing. (ISO/IEC, 2001a)
• Internal Quality: Internal quality is the totality of characteristics of the software product from an internal view. Internal quality is measured and evaluated against the Internal Quality requirements. Details of software product quality can be improved during code implementation, reviewing and testing, but the fundamental nature of the software product quality represented by the Internal Quality remains unchanged unless redesigned. (ISO/IEC, 2001a)

The Internal and External Quality model is inspired from McCall and Boehm's work. It is a three-layer model composed of quality characteristics, quality subcharacteristics and quality measures. Figure 5 illustrates this model. More than 100 measures of Internal and External Quality are proposed as part of the standard. It is important to note that the measures do not make an exhaustive set, which means that other measures can also be used. Finally, Quality in Use is modeled in a different way than Internal and External Quality. Figure 6 illustrates the two-layer Quality in Use model composed of characteristics and quality measures.

Figure 5 : 3-layer model for internal and external quality. Adapted from (ISO/IEC, 2001a)

Figure 6: Quality in use model. Adapted from (ISO/IEC, 2001a)

Theoretically, Internal Quality, External Quality and Quality in Use are linked together with a predictive model. This is illustrated in Figure 7.

Figure 7: Relationships between the different aspects of quality. Adapted from (ISO/IEC, 2001a)

This prediction relationship states that user quality needs should first be established and specified using the Quality in Use model. From these requirements as well as other sources, External Quality requirements should be established using the External Quality model. Finally, the Internal Quality requirements should be constructed from the External Quality requirements and other sources. Once the requirements are established and software construction is under way, the quality model can be used to predict the overall quality. For example, measurement of Internal Quality can be useful in predicting External Quality. Likewise, measurement of External Quality can be useful in predicting Quality in Use.

The above paragraphs describe the ideal theoretical model that links these three aspects of quality. However, in reality, no model may claim to follow perfectly this prediction mechanism. Although the ISO/IEC 9126 model follows this approach closely, no claims are made as to the real predictive power of the model. While the links between Internal and External Quality seem rather obvious because the models are essentially the same, caution must be exercised. While the names of the characteristics and subcharacteristics are the same, the links between Internal and External Quality must be verified empirically. The same reasoning applies to the links between External Quality and Quality in Use. The new version of ISO/IEC 9126 is gaining momentum in the industry. Some corporate quality models, for example MITRE's SQAE (Martin & Shaffer, 1996), are beginning a migration from a model based on McCall's and Boehm's research to one based on ISO/IEC 9126 (Côté, Suryn, Martin & Laporte, 2004a; Côté, Suryn, Martin & Laporte, 2004b; Côté, Suryn, Laporte & Martin, 2005). This new version of ISO/IEC 9126 is thus seen as an improvement upon the older quality models. It is interesting to see how the three aspects of quality defined above can be directly linked to the perspectives of quality that were outlined previously. More specifically:

• ISO/IEC 9126-4, which defines Quality in Use, is directly related to the user and value-based perspectives. The definition of the user perspective of quality states that it is concerned with the appropriateness of a product for a given context of use. Quality in Use is defined as the capability of the software product to enable specified users to achieve specified goals in specified contexts of use. The relationship between the two is clear. Quality in Use and the value-based perspective of quality are linked essentially through the Satisfaction characteristic. This characteristic inherently recognises that quality can have a different meaning and/or value for different stakeholders. Satisfaction levels can thus be set according to those levels of perception. This has been demonstrated by the study reported in Siakas and Georgiadou (2005).
• ISO/IEC 9126-3, which defines Internal Quality, and ISO/IEC 9126-2, which defines External Quality, are directly related to both the manufacturing and product perspectives. The definitions of the quality characteristics Functionality and Reliability can be linked with the manufacturing perspective of quality. Reliability, Usability, Efficiency, Maintainability and Portability are all inherent characteristics of the product and a manifestation of the product perspective of quality.
Figure 8 : Relationships between ISO/IEC 9126 and the perspectives of quality

From the review of the different quality models, one might point out that none seem to address the transcendental perspective of quality. One might even ask the following pertinent question: Does ISO/IEC 9126 address the transcendental perspective of quality? Recall that the transcendental perspective of quality relates to quality as something that is recognised but not defined. At this point, the following hypothesis will be made: As the transcendental perspective of quality cannot be defined, it cannot be explicitly implemented in a software product. However, the transcendental aspect of quality will emerge when a holistic approach to quality engineering is taken. This model seems to recognise all the perspectives of quality as important contributors to the overall assessment of quality. It takes an incremental approach to software quality that begins with Quality in Use, something that is easy to grasp for non-technical stakeholders, and ends with Internal Quality, something more technically inclined stakeholders will feel more comfortable with. Furthermore, there is a comprehensive set of suggested measures that allow for the assessment of software quality. ISO/IEC 9126 is thus the only model that fulfills all the stated requirements for a model to be useful as a foundation to Software Quality Engineering.

Conclusion

This paper has defined three requirements that a quality model should meet to serve as a foundation to Software Quality Engineering:

• A quality model should support the 5 different perspectives of quality as defined by Kitchenham and Pfleeger (1996).
• A quality model should be usable from the top to the bottom of the lifecycle as defined by IEEE Std 1061-1998 (IEEE, 1998).
• A quality model should be usable from the bottom to the top of the lifecycle as defined by IEEE Std 1061-1998 (IEEE, 1998).

These criteria were applied to four quality models. It was found that the models proposed by McCall, Boehm and Dromey focus on the product perspective of quality to the detriment of other perspectives. Furthermore, they are primarily useful in a bottom up approach to quality that is not suitable for Software Quality Engineering. ISO/IEC 9126 is the only model that supports all the perspectives of quality (with the exception of the transcendental perspective, as noted). Furthermore, its predictive framework clearly supports both the top down and bottom up approaches.
This paper has focused on analysing the semantics of the different models with respect to the stated requirements. In theory, ISO/IEC 9126 seems well suited for Software Quality Engineering. Further research is needed to see if the measures associated with ISO/IEC 9126 make this model usable for Software Quality Engineering in practice.
References

Adey, C. A. & Hill, G. K. (2000). Quality / ISO 9000 as a Marketing Tool. http://www.smps.org/mrc/articles/0200qualityiso.pdf
Bazzana, G., Anderson, O. & Jokela, T. (1993). ISO 9126 and ISO 9000: Friends or foes? Presented at the Software Engineering Standards Symposium.
Biehl, R. E. (2001). Six Sigma for Software. IEEE Software, 21(2), 68-70.
Boddie, J. (2000). Do We Ever Really Scale Down? IEEE Software, 17(5), 79-81.
Boehm, B. W., Brown, J. R., Kaspar, J. R., Lipow, M. L. & MacCleod, G. (1978). Characteristics of Software Quality. New York: American Elsevier.
Boehm, B. W., Brown, J. R. & Lipow, M. L. (1976). Quantitative Evaluation of Software Quality. Proceedings of the 2nd International Conference on Software Engineering, San Francisco, California, United States, 592-605, IEEE Computer Society Press.
Bourque, P., Dupuis, R., Abran, A., Moore, J. W., Tripp, L. L. & Wolff, S. (2000). Fundamental Principles of Software Engineering – A Journey. Journal of Systems and Software.
Côté, M.-A., Suryn, W., Martin, R. A. & Laporte, C. Y. (2004a). Evolving a Corporate Software Quality Assessment Exercise: A Migration Path to ISO/IEC 9126. Software Quality Professional, 6(3), 4-17.
Côté, M.-A., Suryn, W., Martin, R. A. & Laporte, C. Y. (2004b). The analysis of the industrial applicability of software product quality ISO standards: the context of MITRE's Software Quality Assessment exercise. In Proceedings of the 12th International Software Quality Management & INSPIRE Conference (BSI) 2004, Canterbury, Kent, United Kingdom.
Côté, M.-A., Suryn, W., Laporte, C. Y. & Martin, R. A. (2005). The Evolution Path for Industrial Software Quality Evaluation Methods Applying ISO/IEC 9126:2001 Quality Model: Example of MITRE's SQAE Method. Software Quality Journal, 13, 17-30.
Crosby, P. B. (1979). Quality is Free: The Art of Making Quality Certain. New York: McGraw-Hill.
Diaz, M. & Sligo, J. (1997). How Software Process Improvement Helped Motorola. IEEE Software, 17(5), 75-81.
Dromey, R. G. (1995). A model for software product quality. IEEE Transactions on Software Engineering, 21, 146-162.
Dromey, R. G. (1996). Cornering the Chimera. IEEE Software, 13(1), 33-43.
Eickelman, N. (2003). An Insider's View of CMM Level 5. IEEE Software, 20(4), 79-81.
Georgiadou, E. (2003a). Software Process and Product Improvement: A Historical Perspective. International Journal of Cybernetics, 1(1), 172-197.
Georgiadou, E. (2003b). GEQUAMO – A Generic, Multilayered, Customisable, Software Quality Model. Software Quality Journal, 11(4), 313-323.
Haley, T. J. (1996). Software Process Improvement at Raytheon. IEEE Software, 13(6), 33-41.
Highsmith, J. (2002). Agile Software Development Ecosystems. Addison-Wesley Professional.
IEEE (1998). IEEE Std 1061-1998: IEEE Standard for a Software Quality Metrics Methodology.
ISO/IEC (1999a). ISO/IEC 14598-1: Software product evaluation – Part 1: General overview. Geneva, Switzerland: International Organization for Standardization.
ISO/IEC (1999b). ISO/IEC 9000:2000: Quality management systems – Fundamentals and vocabulary. Geneva, Switzerland: International Organization for Standardization.
ISO/IEC (2000). ISO/IEC 15288: System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization.
ISO/IEC (2001a). ISO/IEC 9126-1: Software Engineering – Software product quality – Part 1: Quality model. Geneva, Switzerland: International Organization for Standardization.
ISO/IEC (2001b). ISO/IEC DTR 9126-4: Software engineering – Software product quality – Part 4: Quality in use metrics. Geneva, Switzerland: International Organization for Standardization.
ISO/IEC (2003a). ISO/IEC TR 9126-2: Software Engineering – Software product quality – Part 2: External metrics. Geneva, Switzerland: International Organization for Standardization.
ISO/IEC (2003b). ISO/IEC TR 9126-3: Software engineering – Software product quality – Part 3: Internal metrics. Geneva, Switzerland: International Organization for Standardization.
Kitchenham, B. & Pfleeger, S. L. (1996). Software Quality: The Elusive Target. IEEE Software, 13(1), 12-21.
Laitinen, M. (2000). Scaling Down is Hard to Do. IEEE Software, 17(5), 78-80.
Leffingwell, D. & Widrig, D. (1999). Managing Software Requirements: A Unified Approach. Addison-Wesley Professional.
Martin, R. A. & Shaffer, L. (1996). Providing a Framework for Effective Software Quality Assessment. Bedford, Mass.: MITRE Corporation.
McCall, J. A., Richards, P. K. & Walters, G. F. (1977). Factors in Software Quality. Griffiss Air Force Base, N.Y.: Rome Air Development Center, Air Force Systems Command.
Pfleeger, S. L. (2001). Software Engineering: Theory and Practice (2nd ed.). Upper Saddle River, N.J.: Prentice Hall.
Pressman, R. S. (2001). Software Engineering: A Practitioner's Approach (5th ed.). Boston: McGraw-Hill.
Siaka, K. V., Berki, E., Georgiadou, E. & Sadler, C. (1997). The Complete Alphabet of Quality Software Systems: Conflicts and Compromises. 7th World Congress on Total Quality & Qualex 97, New Delhi, India, 17-19 February.
Siaka, K. V. & Georgiadou, E. (2005). PERFUMES: A Scent of Product Quality Characteristics. SQM 2005, March 2005, UK.
Suryn, W. (2003). Course notes SYS861. École de Technologie Supérieure, Montréal.
Voas, J. (2003). Assuring Software Quality Assurance. IEEE Software, 20(3), 48-49.

Toyota Plant in Japan

Production mix in the plant

Assembling with white gloves.

Easy access to tools

Visual alarm that indicates problems.

Training material used to explain the process and teach people.

5S

Chairs below table to save space.

A device for cleaning shoes. Before entering the production hall, everybody must clean their feet.

In the office area, the chairs stay beside the desks and are used only for resting; people work standing up (the aim is the genba approach). The furniture is elevated, and all of it is on wheels to facilitate moving.

5S – the identification on each chair matches the identification on the table, to indicate the right position of the chairs. It could look like excess, but it shows the 5S culture. This is a meeting room of the company's management.

5S – all material identified and in its place. The furniture is on wheels to facilitate moving.

5S. A training material folder to organize all procedures and OPLs. Most of them are made manually.

5S

Floor identification: each colour has a meaning. The floor is kept very clean.

A board filled in by hand. The checklist (beside it) is also made manually.

Skill matrix

Even here you find 5S.

Of interest: they are already making plans for the car of the future (a car for only one person).

Quality Criteria Relationship

 Maintainability and testability vs. efficiency (inverse)
– Optimized and compact code is not easy to maintain.

 Portability vs. efficiency (inverse)
– The use of optimized software or system utilities will lead to a decrease in portability.

 Flexibility, reusability and interoperability vs. efficiency (inverse)
– The generality required for a flexible system, the use of interface routines and the modularity desirable for reusability will all decrease efficiency.

 Flexibility and reusability vs. integrity (inverse)
– The general, flexible data structures required for flexible and reusable software increase the security and protection problem.

 Interoperability vs. integrity (inverse)
– A coupled system allows more avenues of access to more and different users.

 Reusability vs. reliability (inverse)
– Reusable software is required to be general; maintaining accuracy and error tolerance across all cases is difficult.

 Maintainability vs. flexibility (direct)
– Maintainable code arises from code that is well structured.

 Maintainability vs. reusability (direct)
– Well structured, easily maintainable code is easier to reuse in other programs, either as a library of routines or as code placed directly within another program.

 Portability vs. reusability (direct)
– Portable code is likely to be free of environment-specific features.

 Correctness vs. efficiency (neutral)
– The correctness of code, i.e. its conformance to specification, does not influence its efficiency.
