
Common System and Software Testing Pitfalls

Related Events & Pricing

Related event: Next Generation Testing Conference, London, 25 June 2015

Early Bird (until 15 May): £195 + VAT (£150 + VAT if booked with the NGT conference)
Standard price: £275 + VAT (£195 + VAT if booked with the NGT conference)

Background

Software and system testers repeatedly fall victim to the same pitfalls: "anti-patterns", or mistakes, that make testing far less effective and efficient than it ought to be. Drawing on his 36 years of software and systems engineering experience, Firesmith shows testers, technical managers, and other stakeholders how to avoid falling into these pitfalls, how to recognize when they have already fallen in, and how to escape while minimizing the negative consequences. Based on his recent book of the same name, this presentation/tutorial addresses his taxonomy of 127 pitfalls organized into 18 categories. Each pitfall is documented in terms of its name, ID, description, potential applicability, characteristic symptoms, potential negative consequences and causes, specific actionable recommendations for avoiding it or limiting its consequences, and related pitfalls.
 

This taxonomy of pitfalls can be used as:

- Training materials for testers and testing stakeholders
- Standard terminology regarding commonly occurring testing pitfalls
- Checklists for use when:
  - producing test plans and related documentation
  - evaluating contractor proposals
  - evaluating test plans and related documentation (quality control)
  - evaluating the as-performed test process (quality assurance)
  - identifying test-related risks and their mitigation approaches
- A categorization of pitfalls for metrics collection (a minimal illustrative sketch of one such pitfall record follows this list)
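To make the checklist and metrics uses above concrete, here is a minimal sketch of how one pitfall record might be captured in code. The field names follow the documentation template described in the abstract; the class itself and all example values are hypothetical illustrations, not content taken from the book.

# Illustrative sketch only: a simple record type for a testing pitfall, with
# fields matching the documentation template above (name, ID, description,
# applicability, symptoms, consequences, causes, recommendations, related
# pitfalls). The class and example values are hypothetical.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TestingPitfall:
    pitfall_id: str
    name: str
    description: str
    applicability: str
    symptoms: List[str] = field(default_factory=list)
    consequences: List[str] = field(default_factory=list)
    causes: List[str] = field(default_factory=list)
    recommendations: List[str] = field(default_factory=list)
    related_pitfalls: List[str] = field(default_factory=list)


if __name__ == "__main__":
    # Hypothetical entry, as it might appear in a review checklist or be
    # tallied for per-category pitfall metrics.
    example = TestingPitfall(
        pitfall_id="EXAMPLE-1",
        name="Example pitfall",
        description="Placeholder description of a commonly occurring testing mistake.",
        applicability="Projects that produce separate test documentation",
        symptoms=["Placeholder symptom observed during test planning"],
        recommendations=["Placeholder recommendation for avoiding the pitfall"],
    )
    print(example.pitfall_id, example.name)

Such records could be grouped by category to support the metrics-collection use listed above; this structure is only one possible representation.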

 

Speaker  

Donald Firesmith is a principal engineer in the Software Solutions Division of the Software Engineering Institute (SEI), a Federally Funded Research and Development Center (FFRDC), where he helps the United States military and other government agencies acquire large and complex software-reliant systems. He has 36 years of experience in commercial and governmental software and systems development across numerous application domains, ranging from software applications and management information systems to embedded aviation and space systems. His primary areas of expertise include requirements engineering, system and software architecture engineering, object-oriented development, testing, quality engineering, and process improvement, including situational method engineering.

Donald Firesmith has published dozens of technical articles, spoken at numerous international conferences, and served as program chair or program committee member for multiple conferences and workshops. He has taught several hundred courses in industry and numerous tutorials at conferences. These articles, presentations, and conference papers can be downloaded from his website: http://donald.firesmith.com. He is the developer of the OPEN Process Framework (OPF) Repository, http://www.opfro.org, the world's largest free, open-source website documenting over 1,100 reusable system/software development method components.

Donald Firesmith is the sole or principal author of the following books: Common System and Software Testing Pitfalls (Addison-Wesley, 2014); The Method Framework for Engineering System Architectures (MFESA) (CRC, 2008); The OPEN Process Framework (Addison-Wesley, 2001); OPEN Modeling Language (OML) Reference Manual (Cambridge University Books, 1998); Documenting a Complete Java Application using OPEN (Addison-Wesley, 1998); The Dictionary of Object Technology: The Definitive Desk Reference (Cambridge University Books, 1995); and Object-Oriented Requirements Analysis and Logical Design: A Software Engineering Approach (John Wiley & Sons, 1993).