Interests - Ideas

Feb, 2012

  • Peer reviews have been identified as one of the best practices in requirements engineering [Komssi et al., 2010].
  • Software inspections (Fagan inspections), the most formal peer review technique, have been found to be effective for discovering defects in documents, but many software companies practice inspections infrequently or not at all [Komssi et al., 2010].
  • In software, we generally leave inspections to companies that are quite advanced in their process maturity [O'Neill, 1997]. (Is this correct? What about small companies?)
  • Why do most companies skip inspections when there is so much to gain? [O'Neill, 1997]
  • Inspection is a static verification technique, and one of its main benefits is that it can be applied to any artifact produced during software development. It is probably one of the few methods that people actually agree helps improve software quality, although it is not always applied [Aurum et al., 2002].
  • One common problem is that the terminology used for different types of software review processes is often imprecise and leads to confusion. In particular, there is no universal agreement on what a software inspection process is and how it differs from other types of software review processes such as a walkthrough, a formal technical review or a management review [Aurum et al., 2002]. (This could be a problem if I want to search for related literature.)
Speculations - Prototypes

Oct, 2012

  • More experimental work should be done to check which of these factors are most important and whether avoiding them helps to improve the adoption of inspections. We are now running a survey to find out whether software companies in our country (Uruguay) are using software inspections (Fagan or another formal/informal method), how they are performing static analysis, and whether some of the factors found in the mapping study also apply to our country.
  • It would be interesting to know whether there is any relationship between the perceived factors causing the low adoption of the inspection process and the factors related to bad experiences or unreported failed experiences.
  • It would be interesting to find a relationship between the volume of work in what we called the "technical view" in our taxonomy and the causes of the problem under research. One possible way would be to discover whether attempts to solve the problems have focused on creating or modifying inspection techniques instead of trying to understand what happened with the existing ones. This suspicion may be supported by the lack of empirical data to validate theoretical proposals [Kollanus & Koskinen, 2009] [Shull et al., 2003].
Knowledge - Products - Facts

Oct, 2012

  • The results of this research are in this article, but these are the main conclusions:

  • The results of software inspections are generally known and accepted, and are present in many publications. For example, it is claimed that this method has up to 85% efficiency in defect removal, with an average of 65% [Jones, 2008]. It is also indicated that, combined with testing practices, it can reduce the number of defects detected by a factor of 10 [8].
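  • As a back-of-the-envelope illustration of how those percentages compose, the sketch below applies defect-removal efficiencies (DRE) in sequence. The initial defect count and the testing-stage DRE are assumed example values, not figures from the literature:

    ```python
    # Illustrative arithmetic: how defect-removal efficiency (DRE) compounds.
    # Figures from the notes above: inspections remove up to 85% of defects
    # (average 65%); combined with testing, the defect count can drop by a
    # factor of 10, i.e. a combined DRE of 90%.

    def remaining_defects(initial, *dre_stages):
        """Apply each stage's defect-removal efficiency in sequence."""
        defects = initial
        for dre in dre_stages:
            defects *= (1 - dre)
        return defects

    initial = 1000  # assumed example: defects injected during development

    # Average inspection (65% DRE) alone leaves about 350 defects; following
    # it with a testing stage of ~71.4% DRE (an assumed value chosen so the
    # combined DRE is 90%) leaves about 100, a factor-of-10 reduction.
    after_inspection = remaining_defects(initial, 0.65)   # ~350
    after_both = remaining_defects(initial, 0.65, 5 / 7)  # ~100
    ```

    The point of the sketch is that DREs multiply rather than add: a 65% stage followed by a ~71% stage yields 90% combined removal, not 136%.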

  • I asked Karl Wiegers about his definition of "Team Review" and whether it may be the same as the IEEE 1028 "Technical Review". He said: "I think you can equate my team reviews with the IEEE's technical reviews. The roles are not as well defined as in inspection, but the idea of a review leader (or moderator) and a scribe (or recorder) is common to both team reviews and technical reviews. Also, the inclusion of a meeting and the need for individual preparation prior to the meeting is common to both. There are other similarities, too."

  • Mapping study about software inspections adoption: evidence, factors and proposed solutions. See more here (Spanish), or here if you are looking for the complete list of analyzed articles. These are the main questions answered through this study:

    • Is there evidence for the low adoption of software inspection techniques? (see the evidence)
    • What factors are referred to as the main causes of low adoption? (list of factors)
    • What solutions have been proposed for this? (list of suggested solutions)
    • The following is a coding of the complete list of factors that made adoption difficult (previous bullet), with the frequency of each factor in parentheses:
      • Characteristics of the process, or perceived as typical of it (19)
      • Lack of knowledge to perform effective inspections, to avoid confusion between processes, and to train inspectors (9)
      • Inspections are considered expensive (upfront cost increase) (5)
      • Lack of adaptation and improvement of the process for the context where it is applied (4)
      • Lack of tools for managing, supporting and analyzing the process and its results (4)
      • Lack of time allotted for planning inspections (4)
      • Lack of monitoring and recording of process execution and results (3)
      • Bad past experiences and unreported failed experiences (3)
      • Missing resources or intensive resource consumption (2)
      • Resistance to change (2)
      • Distributed development (1)
      • Role of the facilitator (1)
      • Others (7)