Planning a Performance Test
Overview
Here is a breakdown of the approach I use to plan performance testing projects.
The stages in a performance test are:
- A. Analysis and Test Design
- B. Scripting
- C. Data preparation
- D. Execution
- E. Analysis of Results and Reporting
For LoadRunner projects, allow two to three days per script for stages A, B and C (wrapped up together), so for five scripts the projected A-C effort would be 10-15 days. The variation in effort is a function of the complexity of the authentication and state management (use of page IDs, etc.) in the HTML. Another factor is the amount of error handling required in the scripts: the more error cases that need to be coded around, the longer it takes. This is usually impossible to predict in advance, so a minimal allowance for it is built into the base effort of 2-3 days per script.
For small projects (say ~5 scripts) I allow three days for Execution and two days for Reporting; this is the entry level. I would scale the effort for these phases up with the complexity of the performance test (which typically shows up as the number of scripts) and the size of the project.
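As a back-of-the-envelope sketch of the two rules above (the function name and the complexity multiplier are my own illustration, not a formal model; the 2-3 days per script and the 3 + 2 day execution/reporting baseline are the figures quoted):

    def estimate_days(num_scripts, complexity=1.0):
        # Stages A-C: 2-3 days per script (low and high bounds).
        script_low, script_high = 2 * num_scripts, 3 * num_scripts
        # Entry-level Execution (3 days) plus Reporting (2 days),
        # scaled up for larger or more complex tests; the multiplier
        # itself is an assumption, not a fixed rule.
        exec_and_report = (3 + 2) * complexity
        return script_low + exec_and_report, script_high + exec_and_report

    # Five scripts at entry level: (15.0, 20.0) days total,
    # i.e. the 10-15 day A-C range plus the five baseline days.
    print(estimate_days(5))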
For other technologies (e.g. SOAP/web services data injection via TIBCO) I estimate effort based on the size and complexity of the environment: the number of component systems, the number of data paths between them, and the number of external interfaces. I use previous performance testing projects (such as P<P/EBDM Portal) as a basis for scoping out the 'other' (non-LoadRunner) components of a test.
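One way to make that analogy-based scoping concrete is to scale a reference project's effort by a simple size metric; the metric (an unweighted sum) and the linear scaling are entirely my assumptions, and the numbers below are hypothetical:

    def scoped_effort(components, data_paths, interfaces,
                      reference_days, reference_size):
        # Size metric: count of component systems, data paths and
        # external interfaces (the unweighted sum is an assumption).
        size = components + data_paths + interfaces
        # Scale the reference project's effort linearly by relative size.
        return reference_days * size / reference_size

    # Hypothetical: a reference project of size 10 that took 20 days
    # suggests ~30 days for a new environment of size 15.
    print(scoped_effort(components=6, data_paths=5, interfaces=4,
                        reference_days=20, reference_size=10))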