The performance engineering discipline covers many different aspects of making sure your application is ready to be released into the wild and will provide an outstanding experience for your visitors. These include performance testing and production monitoring, both of which must account for the user experience. The results of a performance test must be a true indicator of the production experience.
During the performance testing process, many tools are used to drive the load and to monitor and measure the resources consumed during the test. A key outcome is to understand the user experience under load. The typical tools (LoadRunner, SOASTA, iTKO, Rational, etc.) provide the transaction response time. However, these tools are not used in production. Oftentimes the application itself is instrumented with logging that provides response times as well, but that data must be aggregated and reported on, usually in the middle of a performance issue.
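To illustrate the aggregation step described above, here is a minimal sketch of rolling per-transaction response times out of application log lines into summary statistics. The log format, transaction names, and field layout are hypothetical; real instrumentation will differ.

```python
from statistics import median, quantiles

# Hypothetical log format: "<timestamp> <transaction> <response_ms>"
LOG_LINES = [
    "2012-06-01T10:00:01 checkout 240",
    "2012-06-01T10:00:02 checkout 310",
    "2012-06-01T10:00:03 search 95",
    "2012-06-01T10:00:04 checkout 1250",
    "2012-06-01T10:00:05 search 110",
]

def aggregate(lines):
    """Group response times (in ms) by transaction name."""
    times = {}
    for line in lines:
        _, txn, ms = line.split()
        times.setdefault(txn, []).append(int(ms))
    return times

def report(times):
    """Summarize each transaction: sample count, median, and 90th percentile."""
    summary = {}
    for txn, samples in times.items():
        samples.sort()
        if len(samples) > 1:
            # quantiles(n=10) yields 9 cut points; the last is the 90th percentile.
            p90 = quantiles(samples, n=10, method="inclusive")[-1]
        else:
            p90 = samples[0]
        summary[txn] = {
            "count": len(samples),
            "median_ms": median(samples),
            "p90_ms": p90,
        }
    return summary
```

A report like this gives the performance team and the operations team the same per-transaction numbers, rather than each side computing response times its own way.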
The production and operations team often uses a different set of tools to monitor the overall health of the systems and applications. Sometimes the production team and the performance team report different information about the performance of the same application, because each may have a different definition of what a transaction is.
To help minimize this disconnect, I recommend using the same tools for performance testing and production monitoring. This approach also increases the number of people in the company who can use these tools. The final performance test reports, including user experience, will match the format and information provided by the production team. Both teams will use a common language to describe the workload and the user experience. This alignment helps increase the value the performance engineering team brings to the business.