Planning a Performance Test
Overview
Here is a breakdown of an approach I use to plan performance testing projects.
The stages in a performance test are:
1. Analysis and Test Design
2. Scripting
3. Data preparation
4. Execution
5. Analysis of Results and Reporting
For LoadRunner projects, allow two to three days per script for stages 1, 2 and 3 (bundled together), so for five scripts the projected stage 1-3 effort would be 10-15 days. The variation in effort required is a function of the complexity of the authentication and state management (use of page IDs, etc.) in the HTML. Another factor is the amount of error handling required in the scripts: the more error cases that need to be coded around, the longer it takes. This is usually impossible to predict in advance, so only a minimal allowance is made for it in the base effort of 2-3 days per script.
For small projects (say ~5 scripts) I allow three days for Execution and two days for Reporting. This is the entry level; I would scale the effort for these phases up depending on the complexity of the performance test (which typically shows up as the number of scripts) and the size of the project.
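To make the arithmetic concrete, the rule of thumb above can be written as a short calculation. The sketch below (Python) is only indicative, assuming a simple linear model; the function name and parameter defaults are illustrative choices for this example, not a formal estimating model.

 # Rough effort estimate based on the rules of thumb above (illustrative only).
 def estimate_effort_days(num_scripts,
                          days_per_script_low=2,
                          days_per_script_high=3,
                          execution_days=3,
                          reporting_days=2):
     """Return a (low, high) range of effort in days.
 
     Stages 1-3 (analysis/design, scripting, data preparation) are costed
     per script; Execution and Reporting use the small-project baselines
     of 3 and 2 days respectively.
     """
     low = num_scripts * days_per_script_low + execution_days + reporting_days
     high = num_scripts * days_per_script_high + execution_days + reporting_days
     return low, high
 
 # Example: a small project with five scripts.
 # Stages 1-3 come to 10-15 days, plus 3 days Execution and 2 days Reporting.
 print(estimate_effort_days(5))   # -> (15, 20)

For anything other than a small project, the Execution and Reporting baselines would be scaled up as described above rather than left at 3 and 2 days.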
For other technologies (e.g. SOAP/Web Services data injection via TIBCO) I estimate effort based on the size and complexity of the environment, the number of component systems, the number of data paths between them and the number of external interfaces. I use previous performance testing projects as a basis for scoping out the 'other' (non-LoadRunner) components of a test.