Performance testing is typically done to help identify bottlenecks in a system, establish a baseline for future testing, or support a performance tuning effort. Some performance tests are used to determine compliance with performance goals and requirements, and/or to collect other performance-related data that helps stakeholders make informed decisions about the overall quality of the application being tested. In addition, the results from performance testing and analysis can help you estimate the hardware configuration and scale required to support the application(s) when you "go live" to production. The following steps describe performance testing best practices.

1. Identify the Test Environment

Identify the physical test environment and the production environment, which includes the hardware, software, and network configurations. Having a thorough understanding of the entire test environment at the outset enables more efficient test design and planning and helps you identify testing challenges early in the project. In some situations, this process must be revisited periodically throughout the project’s life cycle.

2. Identify Performance Acceptance Criteria

Identify the response time, throughput, and resource utilization goals and constraints. In general, response time is a user concern, throughput is a business concern, and resource utilization is a system concern. Additionally, identify project success criteria that may not be captured by those goals and constraints; for example, using performance tests to evaluate what combination of configuration settings will result in the most desirable performance characteristics.
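One way to make such criteria actionable is to encode them as data that the results pipeline can check automatically. The sketch below is illustrative: the metric names and limits are assumptions for the example, not values from any particular project.

```python
# Performance acceptance criteria expressed as machine-checkable limits.
# The metric names and numbers are illustrative assumptions.
ACCEPTANCE_CRITERIA = {
    "response_time_p95_ms": 800,   # user concern: 95th-percentile response time
    "throughput_rps": 50,          # business concern: requests per second
    "cpu_utilization_pct": 70,     # system concern: average CPU utilization
}

def evaluate(measured: dict) -> list:
    """Return a list of (metric, measured, limit) tuples for violated criteria."""
    violations = []
    for metric, limit in ACCEPTANCE_CRITERIA.items():
        value = measured.get(metric)
        if value is None:
            continue  # metric not collected in this run
        # Throughput is a floor (higher is better); the others are ceilings.
        if metric == "throughput_rps":
            if value < limit:
                violations.append((metric, value, limit))
        elif value > limit:
            violations.append((metric, value, limit))
    return violations
```

Capturing the criteria this way keeps them visible in version control and lets every test run report pass/fail against the same limits.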

3. Plan and Design Tests

Identify key scenarios and determine the variability among representative users, such as unique login credentials and search terms. The team must also decide how to simulate that variability, define the test data, and establish the metrics to be collected. Then consolidate this information into one or more models of system usage to be implemented, executed, and analyzed.
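Simulating that variability usually means generating per-virtual-user test data rather than replaying one hard-coded account. A minimal sketch, assuming hypothetical field names and search terms:

```python
# Generate distinct test data for each simulated user so that logins and
# searches vary across the load run. Field names and terms are illustrative.
import random

SEARCH_TERMS = ["laptop", "headphones", "monitor", "keyboard", "webcam"]

def make_virtual_users(count: int, seed: int = 42) -> list:
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    users = []
    for i in range(count):
        users.append({
            "username": f"loaduser{i:04d}",   # unique credentials per user
            "password": f"Secret!{i:04d}",
            "search_term": rng.choice(SEARCH_TERMS),
        })
    return users
```

Seeding the generator is a deliberate choice: it keeps the data varied across users but identical across runs, so results from different builds remain comparable.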

4. Configure the Test Environment

Prepare the test environment, tools, and resources necessary to execute each strategy as features and components become available for test. Ensure that the test environment is instrumented for resource monitoring as necessary.
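Instrumenting for resource monitoring can be as simple as a background sampler that records a metric at a fixed interval while the test runs. The sketch below is a generic pattern; the metric callable is a placeholder assumption standing in for whatever CPU, memory, or network counter your environment exposes.

```python
# A background sampler for resource monitoring during a test run.
# `read_metric` is a placeholder: in practice it would read a real
# CPU, memory, or network counter from the system under test.
import threading
import time

class ResourceMonitor:
    def __init__(self, read_metric, interval_s: float = 0.05):
        self._read_metric = read_metric      # callable returning a number
        self._interval_s = interval_s
        self._samples = []                   # (timestamp, value) pairs
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        while not self._stop.is_set():
            self._samples.append((time.monotonic(), self._read_metric()))
            self._stop.wait(self._interval_s)

    def start(self):
        self._thread.start()

    def stop(self) -> list:
        """Stop sampling and return the collected samples."""
        self._stop.set()
        self._thread.join()
        return self._samples
```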

5. Implement the Test Design

Develop the performance tests in accordance with the test design.

6. Execute the Test

Run and monitor your tests. Validate the tests, test data, and results collection.
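At its core, executing a load test means driving a target concurrently and recording each response time. A minimal sketch, with a stub target standing in for a real HTTP request:

```python
# Drive a target function concurrently and record per-request response
# times. The target here is a stub; a real test would call the system
# under test (e.g. an HTTP endpoint).
import time
from concurrent.futures import ThreadPoolExecutor

def run_load(target, requests: int, concurrency: int) -> list:
    """Execute `target` `requests` times across `concurrency` workers,
    returning one response time (in seconds) per request."""
    def timed_call(_):
        start = time.monotonic()
        target()
        return time.monotonic() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(timed_call, range(requests)))
```

Validating the run means checking exactly what this step describes: that every request executed, that the recorded times are plausible, and that the test data fed to `target` matched the design.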

7. Analyze Results, Report, and Retest

Consolidate and share results data. Analyze the data both individually and as a cross-functional team. Reprioritize the remaining tests and re-execute them as needed. When all of the metric values are within accepted limits, none of the set thresholds have been violated, and all of the desired information has been collected, you have finished testing that particular scenario on that particular configuration.
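Consolidating results typically means reducing raw response times to summary statistics and comparing them against the set thresholds. A minimal sketch, where the 800 ms limit is an illustrative assumption:

```python
# Reduce raw response times to summary statistics and check them against
# a threshold. The 800 ms p95 limit is an illustrative assumption.
import statistics

def summarize(response_times_ms: list) -> dict:
    ordered = sorted(response_times_ms)
    p95_index = max(0, int(len(ordered) * 0.95) - 1)
    return {
        "mean_ms": statistics.mean(ordered),
        "median_ms": statistics.median(ordered),
        "p95_ms": ordered[p95_index],
        "max_ms": ordered[-1],
    }

def within_limits(summary: dict, p95_limit_ms: float = 800) -> bool:
    return summary["p95_ms"] <= p95_limit_ms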

Conclusion

Performance testing is a critical part of the application development process. It is important for testing to be integrated throughout development, not just tacked on at the end as an afterthought. Testing should be viewed as an iterative cycle: develop, test, adjust or tune, then test again.

The steps described above are simply a guideline; each application will have its own unique needs and challenges when it comes to testing. Professional performance engineers, like the consultants at LoadStorm, have the knowledge and expertise to help any development team overcome these challenges and reach its performance goals.
