Central to our methodology for characterizing and improving application performance is the ability to emulate a realistic workload in a controlled environment. Trends in response time, throughput, and hardware utilization are observed as load is varied to determine application stability and scalability. Defining and constructing an automated workload is the first and most critical phase in any performance project: if the wrong business processes are selected or the emulation is inaccurate, the project will either fail to deliver the expected return on investment or, worse, set the wrong expectations for the application's performance characteristics. Once the ability to emulate a realistic workload is established, different types of analysis can be performed depending on project goals, as summarized in the table below.
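To make the load-variation idea concrete, the following sketch drives a hypothetical HTTP endpoint at a series of increasing concurrency levels and records throughput and response-time statistics at each step. The endpoint URL, step duration, and concurrency levels are illustrative assumptions only; a real workload would emulate the selected business processes rather than a single request, and error handling is omitted for brevity.

```python
"""Minimal load-sweep sketch: apply increasing levels of concurrent load and
observe how throughput and response time trend at each step."""
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/health"   # hypothetical endpoint
STEP_SECONDS = 30                              # duration of each load step
CONCURRENCY_LEVELS = [1, 5, 10, 25, 50]        # simulated virtual users


def one_request() -> float:
    """Issue a single request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start


def run_step(concurrency: int) -> None:
    """Hold a fixed concurrency level for STEP_SECONDS, then report
    throughput and response-time statistics for that step."""
    latencies: list[float] = []
    deadline = time.monotonic() + STEP_SECONDS

    def worker() -> None:
        while time.monotonic() < deadline:
            latencies.append(one_request())

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(concurrency):
            pool.submit(worker)
    # The 'with' block waits for all workers to finish before reporting.

    if not latencies:
        print(f"users={concurrency:3d}  no successful requests")
        return
    throughput = len(latencies) / STEP_SECONDS
    p95 = (statistics.quantiles(latencies, n=20)[-1]
           if len(latencies) >= 20 else max(latencies))
    print(f"users={concurrency:3d}  requests={len(latencies):6d}  "
          f"throughput={throughput:7.1f}/s  "
          f"avg={statistics.mean(latencies):.3f}s  p95={p95:.3f}s")


if __name__ == "__main__":
    for level in CONCURRENCY_LEVELS:
        run_step(level)
```

Plotting the per-step results against the concurrency level yields the response-time and throughput curves from which stability and scalability are judged.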
Type of Analysis | Methodology | Deliverables |
---|---|---|
Regression | Ideal for testing as part of the QA cycle, system tuning, database tuning, and application optimization. Not useful for setting expectations for production environments, due to the lack of sensitivity analysis and production-strength hardware. | Reports typically include response time, throughput, and hardware resource utilization vs. load, as well as application stability characteristics under stress. |
Benchmark | Useful for setting expectations for a specific production environment. A pass or fail result is reported against predefined requirements (a minimal automated pass/fail check is sketched after this table). | Documentation typically includes the environment, pass/fail criteria, response time, throughput, hardware utilization statistics, and optimizations performed to achieve the results. |
Performance Characterization | Used for infrastructure budgeting and capacity planning. | Characterization report and/or server sizing tools that set expectations on capacity, throughput, and responsiveness for various operating systems, databases, server configurations, data volumes, and hardware topologies. |
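Because the benchmark outcome is a pass or fail decision against predefined requirements, that final comparison lends itself to automation. The sketch below uses hypothetical thresholds and measured values to show one way such a check might be expressed; the actual criteria and measurements would come from the project's requirements document and the load-test results.

```python
"""Illustrative pass/fail evaluation for a benchmark run against
predefined requirements (all values below are hypothetical)."""
from dataclasses import dataclass


@dataclass
class Requirement:
    name: str
    measured: float
    limit: float
    higher_is_better: bool = False

    def passed(self) -> bool:
        # Throughput-style metrics must meet or exceed the limit;
        # response-time and utilization metrics must stay at or below it.
        return (self.measured >= self.limit if self.higher_is_better
                else self.measured <= self.limit)


results = [
    Requirement("p95 response time (s)", measured=1.8, limit=2.0),
    Requirement("Throughput (tx/s)", measured=240.0, limit=200.0,
                higher_is_better=True),
    Requirement("Peak CPU utilization (%)", measured=72.0, limit=80.0),
]

for r in results:
    status = "PASS" if r.passed() else "FAIL"
    print(f"{status}  {r.name}: measured={r.measured}, limit={r.limit}")

overall = "PASS" if all(r.passed() for r in results) else "FAIL"
print(f"Overall benchmark result: {overall}")
```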