
Software Benchmarking - Boston SPIN.ppt
Software Benchmarking: What Works and What Doesn’t?

Capers Jones, Chief Scientist
Software Productivity Research, an Artemis company
6 Lincoln Knoll Drive, Burlington, Massachusetts 01803
Tel.: 781.273.0140   Fax: 781.273.5176
November 27, 2000

Copyright © 2000 by SPR. All Rights Reserved.


Basic Definitions of Terms

• Assessment: A formal analysis of software development practices against standard criteria.
• Baseline: A set of quality, productivity, and assessment data at a fixed point in time, to be used for measuring future progress.
• Benchmark: A formal comparison of one organization’s quality, productivity, and assessment data against similar data points from similar organizations.


Major Forms of Software Benchmarks

• Staff compensation and benefits benchmarks
• Staff turnover and morale benchmarks
• Software budgets and spending benchmarks
• Staffing and specialization benchmarks
• Process assessments (SEI, SPR, etc.)
• Productivity and cost benchmarks
• Quality and defect removal benchmarks
• Customer satisfaction benchmarks


TWELVE CRITERIA FOR BENCHMARK SUCCESS

The benchmark data should:
1. Benefit the executives who fund it
2. Benefit the managers and staff who use it
3. Generate positive ROI within 12 months
4. Meet normal corporate ROI criteria
5. Be as accurate as financial data
6. Explain why projects vary
7. Explain how much projects vary
8. Link assessment and quantitative results
9. Support multiple metrics
10. Support multiple kinds of software
11. Support multiple activities and deliverables
12. Lead to improvement in software results
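The baseline/benchmark distinction defined above can be made concrete with a small sketch. The numbers and metric names below are hypothetical, not SPR data: a baseline fixes an organization's metrics at a point in time, and later measurements are compared against it to show progress.

```python
# Illustrative sketch only: hypothetical numbers, not SPR benchmark data.
# A "baseline" fixes metrics at a point in time; later measurements are
# compared against it to quantify progress.

baseline = {"fp_per_staff_month": 6.0, "defects_per_fp": 5.0}   # year 0
current  = {"fp_per_staff_month": 7.5, "defects_per_fp": 4.0}   # year 1

for metric, base in baseline.items():
    change_pct = (current[metric] - base) / base * 100
    print(f"{metric}: {base} -> {current[metric]} ({change_pct:+.1f}%)")
```

A benchmark, by contrast, would compare the same metrics against data from similar organizations rather than against the organization's own past.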
SEVEN BENCHMARK HAZARDS

The benchmarks should not:
1. Conceal the names of projects and units
2. Show only overall data without any details
3. Omit non-coding activities
4. Omit “soft factors” that explain variances
5. Support only one metric, such as LOC
6. Omit quality and show only productivity
7. Be used to set ambiguous or abstract targets:
   – 10 to 1 productivity improvement
   – 10 to 1 quality improvement
   – 30% schedule improvement


REACTIONS TO SOFTWARE BENCHMARKS

Management Level            Benchmark Reaction
Board of Directors          Interested and supportive
CEO/President               Very interested
Senior Vice Presidents      Very interested
Software Vice Presidents    Interested but apprehensive
Software Directors          Apprehensive
Third-line Managers         Very apprehensive
Second-line Managers        Very apprehensive
First-line Managers         Very apprehensive
Supervisors                 Very apprehensive
Technical Staff             Somewhat apprehensive

Conclusion: Software middle management is most apprehensive.


[Diagram: the measurement approach, in three stages]
SIZING     – Size data: source code, function points
ASSESSING  – “Soft” attribute data: environment, tools, processes, audit trails
MEASURES   – “Hard” data: staffing, schedules, effort, costs, defects
             Normalized data: productivity, quality

[Slides 8 through 48 are missing from this extraction; only the tail of a slide on office environments survives:]
– … (> 6 square meters)
– Avoid small or crowded cubicles with 3 or more staff
– Adequate conference and classroom facilities
– Excellent internet and intranet communications
– Excellent communication with users and clients


SOFTWARE BENCHMARK AND IMPROVEMENT PLANS

DO:
• Think long range: 3 to 5 years
• Consider all factors:
  – Management
  – Process
  – Tools
  – Organization
  – Skills and training
  – Programming languages
  – Environment
• Plan expenses of up to $15,000 per staff member over 3 years
• Consider your corporate culture

DON’T:
• Expect immediate results
• Concentrate only on process, or any other “silver bullet”
• Expect major improvements for minor expenses
• Ignore resistance to change
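The measurement diagram above shows “hard” data (size, effort, defects) being turned into the normalized productivity and quality figures that benchmarks compare. A minimal sketch of that normalization, using hypothetical project numbers rather than SPR data (defect removal efficiency is included since the deck lists quality and defect removal benchmarks):

```python
# Illustrative sketch only: hypothetical project, not SPR benchmark data.
# "Hard" data is normalized into the productivity and quality metrics
# that benchmark comparisons actually use.

function_points     = 1000   # size, from function point analysis
staff_months        = 125    # total effort across all activities
total_defects       = 5000   # defects found before and after release
removed_pre_release = 4750   # defects removed before delivery

productivity = function_points / staff_months             # FP per staff month
defect_potential = total_defects / function_points        # defects per FP
removal_efficiency = removed_pre_release / total_defects  # fraction removed

print(f"Productivity:              {productivity:.1f} FP per staff month")
print(f"Defect potential:          {defect_potential:.1f} defects per FP")
print(f"Defect removal efficiency: {removal_efficiency:.0%}")
```

Normalizing by function points rather than LOC avoids hazard 5 above: LOC-based metrics cannot be compared across projects written in different languages.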
Software Benchmark Information Sources

• Software Assessments, Benchmarks, and Best Practices. Addison Wesley Longman, 2000 (Capers Jones)
• Measuring the Software Process: A Guide to Functional Measurement. Prentice Hall, 1995 (David Herron and David Garmus)
• Function Point Analysis. Prentice Hall, 1989 (Dr. Brian Dreger)
• http://www.IFPUG.org (International Function Point Users Group)
• http://www.SPR.com (Software Productivity Research web site)












