They came up with a questionnaire to collect that data. Then there were discussions with people from other countries who had heard about the project. They wanted to use the questionnaire to collect data in their own countries and, to cut a long story short, they decided to create an international organization and a common repository. Over the years the number of participating member countries has grown to twelve. We have a small permanent staff who collect data, analyze it, and publish papers and books, including the third edition of Practical Software Project Estimation, which has just been published by McGraw-Hill. We also make our data available at a reasonable cost. So in summary, the ISBSG is a not-for-profit organisation that exists simply to try to help the IT industry improve its performance.
John - How large is the repository? I don't know how you would measure the size. Number of projects?
John - One of the things I am really curious about is this: since you are collecting data from all these countries, if you were to normalize all the factors that influence project performance except the country where the project was done, would you see any differences? Or is a software project the same all over the world?
John - Oh I see. So outsource your projects to Outer Mongolia where they will be done better?
John - I can see offshore projects being divided into two categories: one where the sourcing company has its own division offshore and the two have worked together on several projects, and a second where the sourcing company enters into a contract with another firm, such as Tata or Infosys, to perform the project. I would imagine that the performance of the first type of project would be better than the second.
Peter - You may be right, but in our report we haven’t differentiated between the two types of offshore development: in-house and outsourced. However, I do think your idea has some validity, because one of the significant drivers of project performance is communication. You would expect communication to be better within a single organisation even if the development team is offshore. The ISBSG does collect what we call ‘soft factors’; for some offshore projects, communication was listed as a problem.
The general impact of "outsourcing" a project (as distinct from offshoring), even when one division of a company does the work for another, is a 20% decrease in productivity rate; defect densities are worse by about 50%, while speed of delivery is similar.
John - I used to work at IBM during the period when the company was moving to build a significant offshore capability and offer clients projects with mixed onshore/offshore delivery. We struggled with early deliveries and adapted our practices, infrastructure, and tools to get better. I suspect that in your repository if you captured the level of experience a team has had with doing offshore work, both in sourcing and delivering, you would find that the more experienced teams have better results.
Something to remember about the statistics I am giving you is that the ISBSG does not keep data on failed projects. We are given data on projects that have been delivered into production. So our data is biased. It is also biased because it is not a random survey of all projects but is data volunteered by organizations that probably have a bias towards getting better at Software Engineering. The mere fact that the organization is aware of the ISBSG is an indicator of its maturity. So our data probably represents the upper end of the industry.
John - Does this give users of the repository false expectations on what might be possible for their organization?
John - So, let's move on to some Patterns of Success. We have already touched on many factors that lead to successful outcomes, but tell me what you think the top few contributing factors are for a project to be successful.
Peter - The things that stand out are the size of the development team and the project complexity. The more complex a project, the higher the risk of not delivering, and the larger the team, the lower the productivity. Another significant contributor to success is an organization pursuing process improvement via CMMI. Even organizations that have reached only CMMI level two demonstrate higher productivity. You might think, "Hold on, what about all the bureaucratic overhead to comply with CMMI?" It turns out these organizations have only slightly slower speed of delivery, somewhat higher productivity, and much better defect rates.
John - Why do you think that complying with a CMMI model makes a difference?
Peter - I think it reflects on something you said earlier... level of maturity. These people have looked at what they were doing and have decided that they could get better and are making an investment in doing so. This focus of the organization on process improvement instills a work attitude that influences day to day behaviors.
John - If all these factors are in place for a given project, will that project have an outcome that is an order of magnitude better than average? What does the distribution curve look like as we keep piling on improvements?
Peter - I have not yet mentioned the programming language, which is a major influence on productivity. All of these improvements have an initial negative impact at the time of introduction. For example, if a company takes on a modern programming language, productivity is terrible at first, and then over two years of use it improves until it is twice the old baseline. So if it used to take 18 hours per function point, it now takes 9. These are big improvements.
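The arithmetic behind that claim is worth making explicit: productivity is the inverse of the PDR (project delivery rate, in hours per function point), so halving the PDR doubles productivity. A minimal sketch, using only the figures quoted in the conversation:

```python
def productivity_fp_per_hour(pdr_hours_per_fp: float) -> float:
    """Convert a PDR (hours per function point) into function points per hour."""
    return 1.0 / pdr_hours_per_fp

# Figures quoted in the interview: 18 h/FP improving to 9 h/FP over two years.
old_pdr, new_pdr = 18.0, 9.0
improvement = productivity_fp_per_hour(new_pdr) / productivity_fp_per_hour(old_pdr)
print(f"Productivity improvement factor: {improvement:.1f}x")  # 2.0x
```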
John - And that assumes that each developer's individual productivity is equal to everyone else's, while we know from historical data that there is a fourfold difference from worst to best individual productivity. As a hiring manager I was always looking for top talent to bring into the company and create small teams of superstars. However, to use a sports analogy, this could sometimes be like an all-star game: a team of superstars might underperform another team whose individuals fit together better and have more experience working as a group. Does this kind of phenomenon ever show up in the reports in your repository?
Peter - We can't collect the experience levels of the individuals that make up the teams reported to our repository. We do collect some experience data on the project manager.
John - On to the second topic of the interview, Failures to Launch. You have not collected data in the repository on failures. I think it would be a fascinating addition... to be able to submit an anonymous report on a failed project and the root causes of the failure.
John - Sure
Peter - We have done a lot of research on what works and what does not (a report called ‘Techniques and Tools – their impact on projects’). We have collected a lot of projects where iterative development is used; Rapid Application Development and Agile Development are examples. Agile projects show a 30% improvement in productivity, and speed of delivery is improved by about 30% as well. We don't have enough data yet to comment on quality improvements.
Another method that seems to work very well is Joint Application Development. Productivity is 10% above average and speed of delivery is 20% above average.
Some things don't seem to make much difference. For example, Object Oriented Development does not seem to improve productivity or speed of delivery over the average.
I don't know if people are still using CASE tools, but they had a positive impact on project performance, particularly much lower defect rates. Given the early rush to CASE tools and the investment in learning to use them, I am not sure people got the dramatic improvements they had hoped for when the tools were so popular.
John - I have reviewed one of the ISBSG questionnaires and among a lot of questions you ask if the project is using a particular method such as Agile or TSP or RAD etc. As these reports flow into the repository over time have you seen any trends forming? An increase or decrease in use of a specific method?
Peter - One of the very interesting findings we have at the ISBSG is that over the last fifteen years of collecting data we have seen no improvement in average productivity across the industry. All the improvements I mentioned earlier have not been adopted universally and are offset by projects continuing to develop in a chaotic manner. So there has been no overall improvement in the way we develop applications. I asked the ISBSG Analyst to look at the average productivity over time, grouped into five-year periods. This is the result (for all software development – New and Enhancements), shown as the median number of hours taken to produce a function point of software:
Period      Median PDR (hours per FP)
1989-1993   7.6
1994-1998   6.7
1999-2003   11.0
2004-2008   12.6
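For readers unfamiliar with the metric, each figure above is the median of per-project PDRs (work effort in hours divided by delivered function points) within a five-year period. Below is a minimal sketch of that calculation; it is not the ISBSG's actual method, and the project records are hypothetical ones chosen to reproduce the medians in the table:

```python
from statistics import median

# Hypothetical records: (completion year, work effort in hours, size in function points)
projects = [
    (1990, 1520, 200), (1992, 760, 100), (1996, 670, 100),
    (2001, 1100, 100), (2006, 1260, 100), (2007, 2520, 200),
]

def period(year: int) -> str:
    """Bucket a completion year into a five-year period starting at 1989."""
    start = 1989 + 5 * ((year - 1989) // 5)
    return f"{start}-{start + 4}"

# Group each project's PDR (hours per FP) by period, then take the median.
by_period: dict[str, list[float]] = {}
for year, hours, fp in projects:
    by_period.setdefault(period(year), []).append(hours / fp)

for p in sorted(by_period):
    print(p, round(median(by_period[p]), 1))
```

Note that the median, rather than the mean, keeps a few unusually inefficient projects from dominating the period's figure.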
John - Wow. That is a significant finding. Has quality improved or is it flat as well?
So much for silver bullets!
John - I guess that satisfies my need for a "Failure to Launch"... in this case the whole industry has been stagnant in improving project performance over the last decade.
Peter - Oh, that is a difficult one. What we are seeing is that the really successful projects tend to follow the fundamentals: they have small, stable teams, with an experienced project manager, working with languages and infrastructure they are familiar with. The projects that follow these patterns consistently outperform any project trying something new. This may disappoint you, but all the work we have done looking for real silver bullets tells us that there are none.
John - Thank you very much for your time and your insights.