Performance blog: November 2009

Monday, November 23, 2009

Little's Law

In continuation of my earlier post "Decoding Concepts of Performance Engineering," in this post I will try to derive Little's Law for better understanding.

Consider a steady-state system with a single user who sends requests at consistent intervals, where the response time of each request is observed to be 1 sec.

Number of Customers in the System = 1
Response Time = 1 sec (Service Demand)
Maximum Throughput = 1/Service Demand = 1 request/sec (based on the Utilization Law)

This implies that the average number of customers that must be in the system to achieve a throughput of 1 request/sec with an average response time of 1 sec is 1 user.

Similarly, assume the response time of the system is 0.1 sec:

Response Time = 0.1 sec
Maximum Throughput = 1/Service Demand = 10 requests/sec

So a single user can generate a maximum of 10 requests/sec at an average response time of 0.1 sec.

Little's Law

Average Number of Customers = Response Time * Throughput

Little's Law states that the average number of customers in a system is equal to the product of throughput and response time.
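
To sanity-check this, here is a minimal sketch in C that plugs the numbers from the 0.1 sec example into Little's Law (the values are illustrative):

#include <stdio.h>

int main(void)
{
    /* Little's Law: N = X * R
       N = average number of customers in the system
       X = throughput (requests/sec)
       R = average response time (sec) */
    double response_time = 0.1;               /* sec, from the example above */
    double throughput = 1.0 / response_time;  /* Utilization Law: 10 requests/sec */
    double customers = throughput * response_time;

    printf("Throughput: %.1f requests/sec\n", throughput);
    printf("Average customers in system: %.1f\n", customers);
    return 0;
}

Running it prints a throughput of 10.0 requests/sec and an average of 1.0 customers in the system, matching the single-user example above.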

Wednesday, November 18, 2009

[DBNETLIB][ConnectionOpen(Connect()).]SQL server does not exist or access denied.

When attempting to open a LoadRunner Analysis result file, it may sometimes throw the following error:

[DBNETLIB][ConnectionOpen(Connect()).]SQL server does not exist or access denied.

Analysis throws this error if SQL Server is not properly configured on the Analysis machine. To access the LoadRunner .lra file, revert from SQL Server/MSDE back to Access 2000.

Go to LoadRunner Analysis – Tools – Options – Database tab and select Access 2000 instead of SQL Server/MSDE.

I will try to write another post on configuring LoadRunner Analysis with SQL Server.

Sunday, November 15, 2009

Tutorial on Rational Performance Tester

A very good tutorial on Rational Performance Tester (RPT). This tutorial includes:
  • Rational Performance Tester: Architecture
  • Rational Performance Tester: Features
  • RPT and the Performance Testing Framework (PTF)
  • Rational Performance Tester: Test Development
  • Rational Performance Tester: Workload Design
  • Rational Performance Tester: Reports
    and more

http://docs.google.com/present/edit?id=0ATOftuUbGkRvZGR3bTkyZGRfNTRncTU2ZGdkYg&hl=en

Friday, November 13, 2009

Understanding Snapshot Attribute in LoadRunner

A snapshot is a graphical representation of the current step; it is an attribute of functions such as web_url and web_custom_request in LoadRunner.

A sample request is shown below:

web_custom_request("Sample_Request",
    "URL=http://… /Service",
    "Method=POST",
    "Resource=0",
    "RecContentType=text/xml",
    "Mode=HTML",
    "Snapshot=t1.inf", // Snapshot attribute is present
    "EncType=text/xml; charset=utf-8",
    "Body=",
    LAST);

When working in Tree view, VuGen displays the snapshot of the selected step in the right pane, as shown below. The snapshot shows the client window after the step was executed.
LoadRunner Snapshot

VuGen captures a base snapshot during recording and another one during replay. You compare the Record and Replay snapshots to determine the dynamic values that need to be correlated in order to run the script.
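
For example, once the snapshot comparison reveals a dynamic value, it can be captured with web_reg_save_param placed before the request that returns it. A minimal sketch, where the parameter name and the boundaries are made-up illustrations and must be taken from the actual server response:

// Hypothetical correlation: capture a session ID that the record/replay
// snapshot comparison showed to be dynamic.
web_reg_save_param("SessionID",
    "LB=sessionId=",
    "RB=&",
    "Ord=1",
    "Search=Body",
    LAST);

The captured value can then be referenced as {SessionID} in subsequent requests.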

If the Snapshot attribute is not placed in the request, the image file will not be generated during replay (refer to the snapshot below).

web_custom_request("Sample_Request",
    "URL=http://… /Service",
    "Method=POST",
    "Resource=0",
    "RecContentType=text/xml",
    "Mode=HTML",
    "EncType=text/xml; charset=utf-8",
    "Body=",
    LAST);

LoadRunner Snapshot

Wednesday, November 11, 2009

Approach to Oracle Apps Performance testing

During one of my engagements, I was asked to come up with a strategy for performance testing Oracle E-Business Suite. The client was a major producer of power transmission drives, components, and bearings.

The client's Oracle E-Business Suite is a complete set of business applications that enables the organization to efficiently manage customer interactions, manufacture products, ship orders, collect payments, and more.

Oracle Applications architecture is a framework for multi-tiered, distributed computing that supports Oracle Applications products. In this model, various services are distributed among multiple levels, or tiers. The tiers that make up Oracle Applications are the database tier, which manages the Oracle database; the application tier, which manages Oracle Applications and other tools; and the desktop tier, which provides the user interface display. Only the presentation layer of Oracle Applications runs on the desktop tier, in the form of a plug-in to a standard web browser.

Coming up with a performance strategy was a challenge because of the complex architecture, and this was the first engagement of its kind in my organization to provide support for Oracle E-Business Suite.

Test Approach

My approach was to identify the configuration for performance testing in the test environment; to do that, the present production configuration was analyzed.

In production, the application and database tiers reside on two separate Solaris boxes; 6 instances share the application tier and 7 instances share the DB tier. All the instances on the production boxes are Oracle Applications installations for different companies (independent entities) within the organization. All the hardware resources on each box (CPU, memory, and I/O) are shared across the instances, and the transactions performed in the various instances are independent of each other.

A test environment very similar to production was considered for the performance testing, but its application tier was shared by 14 instances and its database tier by 16 instances, which was higher than in production.

The first recommendation I gave was for the infrastructure team to match the number of instances in the test environment to production.

I put forward two approaches to the client for the performance test, along with their risks.

Approach 1

In the test environment, create the same number of instances as in production in both the application and DB tiers. Then analyze the workload model of every instance in both the application and database tiers, capture the important transactions for all instances, and create scripts to replicate them. Execute those transactions in the background so that they utilize the hardware resources in the test environment to a comparable extent. Then simulate the workload model for our instance, and capture and publish the performance metrics.

Risks

• Extremely difficult to understand the workload model for all the instances in production

• Time-consuming and costly

• Requires discussion with multiple stakeholders

Approach 2

The second approach was to ignore the multiple instances and their transactions running on the server, and instead to dedicate to the client's instance the maximum amount of hardware resources it can consume in the test environment, based on a thorough analysis of the existing production servers, such that system utilization does not reach the identified critical level. Based on this analysis, the CPU and memory available in the test environment should be constrained, and the infrastructure team should help dedicate CPUs and memory to the client's instance for the performance test.

The performance test should then be carried out for the transactions identified in the workload model, and the performance metrics captured and published.

Risks

• Real-world performance issues related to the multiple instances sharing the hardware may not be found

Conclusion

The two approaches for the performance test were discussed along with their pros and cons. The first approach was ideal, but given the available timeframe, simulating all the noise from the other instances was difficult to implement, and it required a lot of coordination with multiple stakeholders. After discussions with the infrastructure team, it was agreed to follow the second approach for the performance test.

Monday, November 9, 2009

How to reduce the number of threads per process in LoadRunner

In LoadRunner 9.x, please perform the following steps:


1. Go to the dat/protocols/QTWeb.lrp file under the LoadRunner installation directory.

2. By default, in the web protocol the number of threads that mdrv can spawn is 50. Change this to a lower value (e.g., 15) by adding or editing the following entry under the [VuGen] section (add the entry if it is not already present):

[VuGen]
MaxThreadPerProcess=15

3. Re-run the scenario.

Source: HP

Friday, November 6, 2009

User Defined Template feature in LoadRunner 9.5

When testing an application, we also need to test several business processes; to do so, we usually reuse the same parameter files, run-time settings, boilerplate templates, etc. In addition, we may want to reuse certain functions in each of the scripts we create. Until now, to do this we had to create a new script and then import and copy into it all the necessary files and settings from the existing scripts. With VuGen 9.5 we can avoid this by using a script template.


Create a script with the proper boilerplate, parameter files, run-time settings, etc. that are common to all the business processes, and then save the template. To do this, go to File menu – User Defined Template – Save as Template in VuGen.

Once the template is created, apply it: go to File menu – User Defined Template – Create Script from Template. A new script will be created with all the base data defined in the template, and we can now start recording our business process on top of it.
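
As an illustration, a template might also carry a small shared helper that every script created from it can reuse. The function name and message format below are made up for this sketch; only lr_output_message is a standard LoadRunner function:

// Hypothetical helper saved in the template so that every script
// created from it follows the same logging convention.
void log_step(char *step_name)
{
    lr_output_message("Executing step: %s", step_name);
}

Each script created from the template can then call log_step at the start of every business-process step.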