users during a single test run. We also coordinated with database administrators to clear caches before starting a test run.
E-Commerce Project
The e-commerce project that I worked on was for a company located in Dallas. The company wanted ProtoTest to help with performance testing of the system, and came to us because of our experience with OpenSTA. They wanted to learn how to do the testing themselves and to keep the tests when we were finished. The application was a rewrite of a web-based ordering system and was being developed by a third party. It was written in Java, ran on HP-UX servers, and relied on an Oracle database. Performance test goals for this project included an initial target of 50 concurrent users, with transactions completing in less than 10 seconds.
Initial Approach
We started the project by identifying the test scripts we would need to simulate common user activities on the site. The main thing users did was create and submit orders using one of two payment methods: credit card or electronic check. A separate script was created for each payment type. The average user's order had 60 different items on it. To simulate a fairly realistic load on the server, we set up our scripts to randomize the number of items on each order, from 30 to 100 items, and to randomly select which items went on each order. We applied some of the lessons we had learned from the online banking project to this effort, making sure we created reusable components to reduce rework between the scripts, and we made better use of validation within the test scripts to check for and log errors; both techniques are sketched below. A set of 30 test user accounts was set up for each payment type. Account reuse was not a concern because the same user could be logged in more than once at the same time, and each user was given a large credit limit in case there were problems with third-party payment processing. Because the client was in Dallas and we were in Denver, we decided to do the work remotely: all the test scripts were created and debugged from the ProtoTest headquarters in Denver. Test runs were scheduled so that system administrators and developers could monitor the servers manually, and we coordinated execution with one another through conference calls.
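The actual scripts were written in OpenSTA's own SCL scripting language. As an illustration only, the following Python sketch models the script logic described above: a randomized item count between 30 and 100, random item selection, and response validation with error logging. The catalog, account names, and the submit_order stub are all hypothetical stand-ins, not the project's real identifiers.

```python
# Illustrative Python model of the order-submission script logic.
# The real tests were OpenSTA SCL scripts; every name here is a
# hypothetical stand-in used only to show the structure.
import logging
import random
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("loadtest")

CATALOG = [f"SKU-{n:04d}" for n in range(1, 1001)]          # hypothetical item catalog
TEST_ACCOUNTS = [f"cc_user{n:02d}" for n in range(1, 31)]   # 30 accounts per payment type

def build_order():
    """Randomize the order the way the scripts did: 30-100 items,
    chosen at random, approximating the ~60-item average."""
    item_count = random.randint(30, 100)
    return random.sample(CATALOG, item_count)

def submit_order(user, items, payment_method):
    """Stand-in for the HTTP submission the script performed;
    returns a fake response body for the validation step below."""
    time.sleep(0.01)  # stands in for the real request round trip
    return "Order Confirmed"

def run_iteration(payment_method):
    user = random.choice(TEST_ACCOUNTS)
    items = build_order()
    start = time.perf_counter()
    body = submit_order(user, items, payment_method)
    elapsed = time.perf_counter() - start
    # Validation: check the response and log failures explicitly,
    # rather than letting errors pass silently.
    if "Order Confirmed" not in body:
        log.error("order failed for %s (%d items): %r", user, len(items), body[:80])
    else:
        log.info("order ok for %s: %d items in %.2fs", user, len(items), elapsed)

if __name__ == "__main__":
    run_iteration("credit_card")
```

In the real scripts, the credit card and electronic check variants shared reusable components playing the role of build_order and the validation step here, which is what reduced rework between the two scripts.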
Initial Results
The first few test runs showed very poor results across all the timer values. We ran several tests, none of which seemed to generate much processor or disk usage on the servers, yet response times increased significantly as the number of virtual users increased. The following chart shows some of the timer values for one of the order submission scripts. Even the timers for requests that were not very server-intensive show marked increases.