Thoughts On Performance Testing Worth Thinking About


Written by Ravi Vadugu | 12 December 2011

It's no surprise that most of us see performance testing as something that comes after integration testing, an altogether separate activity done in practice to tick a box. Well, it's probably more than that: done the wrong way, it could go to the extent of bringing down a site, or end up costing huge sums. Here are some thoughts from a discussion with our QA folks on common mistakes we have made in the past, and a few things that are very common in testing for all of us.


Discussion Highlights

  • Performance testing is a must for every piece of work in every release
  • Most companies do performance testing without a baseline
  • Performance testing for every release is a waste of time, and often just testing waste
  • Do performance testing to break the system, or to know when the system can be broken
  • Performance testing is a tools thing; why are you bothered? Just do it
  • It's something you test at the overall product architecture level, so do it once and save mini projects from repeating it
  • It's a developer's thing. Ah well, you got it wrong again: it's the testers' stuff
  • Best done by third parties; we don't really have time for it.

The What and Why

Performance testing is defined as the technical investigation done to determine or validate the speed, scalability, and/or stability characteristics of the product under test. Performance-related activities, such as testing and tuning, are concerned with achieving response times, throughput, and resource-utilization levels that meet the performance objectives for the application under test.

The various types of performance testing are explained clearly in one of our previous posts, drawn from a book written by Scott Barber (performance testing), which discusses the purpose and benefits of each type.

Technically, we do performance testing to prove:

  • Speed - the system is quick enough for end users to access without delays and frustration
  • Capacity - the infrastructure is capable of handling the load on the system
  • Scalability - the system is able to scale to the anticipated growth
  • Stability - the system's behaviour does not change under a heavy load of users accessing the site

While the above are technical reasons, as a business I would like to ensure customer satisfaction and save my brand from any defamation or embarrassment should the site be unable to cope with the number of users accessing the system.

Doing performance testing with a view to breaking the system is a myth, and certainly should not be the purpose of why you want to do it.
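
To make those four dimensions concrete, here is a minimal sketch of a home-grown load check using only the Python standard library. The target URL, user count and request count are illustrative assumptions, not part of the discussion, and a real project would use a proper tool (see The Tools Thing below).

```python
# A minimal sketch, not a real load-testing tool: it fires concurrent
# requests at a hypothetical URL and reports the speed and throughput
# figures discussed above. TARGET_URL, USERS and REQUESTS_PER_USER are
# illustrative assumptions; errors are not handled.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/"  # hypothetical system under test
USERS = 20                             # simulated concurrent users
REQUESTS_PER_USER = 10

def one_user(_):
    """One simulated user issuing requests in sequence; returns timings in seconds."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        urllib.request.urlopen(TARGET_URL).read()
        timings.append(time.perf_counter() - start)
    return timings

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=USERS) as pool:
    timings = sorted(t for user in pool.map(one_user, range(USERS)) for t in user)
elapsed = time.perf_counter() - start

print(f"requests: {len(timings)}, throughput: {len(timings) / elapsed:.1f} req/s")
print(f"average: {sum(timings) / len(timings) * 1000:.0f} ms, "
      f"95th percentile: {timings[int(0.95 * len(timings))] * 1000:.0f} ms")
```

Even a crude check like this surfaces the speed (response times) and capacity (throughput) numbers you would compare against a baseline.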

The How and When

Don't Jump Straight To Test
One of the common issues is to jump straight into the field and start performance testing, only to come unstuck when the implementation team asks, "Against what baseline have you tested this?" Most of us agreed this is a common problem: it takes months to convince people that they need a baseline, and a month further to establish one. A baseline is no good unless it contains average and peak-load figures sampled across various times of the day, various days of the month and various months of the year. Where baselines are done at all, they rarely go to this extent.

Try to assess industry averages as well to establish a comparison, and determine your desired baseline allowing for growth of, say, 20 to 30% a year. It is essential that you re-baseline periodically, or at least with every major release.
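
As a sketch of what establishing such a baseline might look like, the snippet below derives average and peak figures from load samples and projects next year's test target. The sample data and the 25% growth figure are illustrative assumptions.

```python
# A sketch of deriving a baseline from load samples, assuming you have
# (timestamp, requests-per-minute) pairs collected across the year.
# The sample data and the 25% growth figure are illustrative assumptions.
from collections import defaultdict
from datetime import datetime

samples = [  # (ISO timestamp, observed requests per minute) -- made up
    ("2011-03-14T09:00", 420), ("2011-03-14T20:00", 910),
    ("2011-07-02T12:00", 650), ("2011-11-28T20:00", 1480),
]

by_hour = defaultdict(list)
for ts, load in samples:
    by_hour[datetime.fromisoformat(ts).hour].append(load)

average = sum(load for _, load in samples) / len(samples)
peak = max(load for _, load in samples)
GROWTH = 1.25  # plan for, say, 25% growth a year

print(f"average load: {average:.0f} req/min, peak: {peak} req/min")
print(f"next year's peak target: test up to {peak * GROWTH:.0f} req/min")
for hour, loads in sorted(by_hour.items()):
    print(f"  {hour:02d}:00  avg {sum(loads) / len(loads):.0f}, peak {max(loads)}")
```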

Determine What To Test
Determine what type of performance testing you want to do; you don't need to do every type. This is a slight risk you can definitely take: just do what is required. Something like stress testing is not mandatory, and you could save it for later if you are short of time.

Don't Leave It To The End
The general practice is for performance testing to be initiated after integration testing. Don't leave it to the end. Like any other testing, performance testing is something you should consider right at the requirement-analysis stage, and you should factor in a dedicated design task to identify the best practices in design and development that minimise failure later and decrease the defect age of performance bugs. When you use products, applications and web servers, go through the performance issues and fixes that others have run into before development, so you understand the common defects and avoid them rather than find and fix them later. A functional defect can be fixed in minutes at little or no cost, but a performance defect has a terrible impact on costs; I have worked on projects where we spent £30K fixing just one performance defect.

Approach Is Very Much Necessary
Have an approach defined for performance testing. It's no different from any other testing or development, where you have a sequential process: analysis, design, development and execution. As discussed above, analyse your baseline and the common product issues and their resolutions, establish what you wish to performance test, design your approach, develop your test scripts, and execute those scripts multiple times.
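
As a small illustration of the execution step, here is a sketch of running the same script several times and comparing the median against the baseline. run_load_test() is a hypothetical hook standing in for whatever your chosen tool executes, and the baseline and tolerance figures are assumptions.

```python
# A sketch of the "execute multiple times" step: run the same script
# several times and compare the median to the baseline, since a single
# run can be skewed by noise. run_load_test() is a hypothetical hook;
# here it just simulates a measurement so the sketch runs end to end.
import random
import statistics

BASELINE_MS = 250   # assumed baseline average response time
TOLERANCE = 1.10    # assumed limit: fail on a regression of more than 10%
RUNS = 5

def run_load_test() -> float:
    """Stand-in for executing your real test script; returns avg response ms."""
    return random.gauss(255, 15)  # replace with a call to your tool

results = [run_load_test() for _ in range(RUNS)]
median = statistics.median(results)
if median > BASELINE_MS * TOLERANCE:
    print(f"FAIL: median {median:.0f} ms exceeds baseline {BASELINE_MS} ms + 10%")
else:
    print(f"PASS: median {median:.0f} ms is within tolerance of the baseline")
```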

When to do performance testing is somewhat subjective, based on the nature of the project and, most importantly, its criticality.
The discussion revealed some interesting views (a rough rule of thumb encoding them follows the list):

  • Any product or project being released for the first time, or any major release, has to be performance tested.
  • There is no real need to do performance testing for every release, specifically if there is an authoritative view that the risk is low or the changes being released are small enough.
  • Some of the views were also that a risk can definitely be taken to skip performance testing if there is no impact on critical business flows, even if major changes are being released.
  • Subject to the criticality of the project: financial, transaction-heavy and similar systems may need to be tested without fail, whereas systems like timesheets or ticket management need not be tested for every release.
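
As promised above, here is a rough rule of thumb that encodes those views. It is my reading of the discussion, not an official checklist, and all parameter names are illustrative.

```python
# A rough rule of thumb encoding the views above -- my reading of the
# discussion, not an official checklist. Parameter names are illustrative.
def needs_perf_test(first_release: bool,
                    major_release: bool,
                    impacts_critical_flows: bool,
                    business_critical_system: bool) -> bool:
    if first_release or major_release:
        return True   # first releases and major releases: always test
    if business_critical_system:
        return True   # financial, transaction-heavy and similar systems
    # otherwise the group felt the risk can be taken unless the change
    # touches critical business flows
    return impacts_critical_flows

# e.g. a minor timesheet-system release touching no critical flows
print(needs_perf_test(False, False, False, False))  # -> False
```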

The Who

By this point it's probably clear that it is neither the developers' job nor the testers' job as a whole. Performance testing work happens across the project, and the whole team is responsible for doing its part. Offloading it to a third party may be an option, but not necessarily one that is going to simplify your life. We ended up agreeing that whoever will maintain the application in the long run should ideally be responsible for getting involved in performance testing of applications.

The Tools Thing

While there are many performance testing tools available today, there are just as many challenges. With the changing landscape of software development, notably agile, cloud and mobile channels, performance testing tools have started to scale themselves to address such challenges. JMeter has been extended to test on the cloud and launched as BlazeMeter, for example; GCLoad is simplifying cloud load testing and is available free as well; and SOASTA is pioneering the democratization of performance testing on the cloud. These are some of the tools scaling up to the challenges.

I have made a collection of performance testing tools for quick reference.

Summary

  • Never jump into performance testing without establishing a baseline
  • Yes, it is a tools thing, but it's also about how well the team composes itself right from the start, with a view to avoiding performance bugs rather than finding and fixing them
  • Not all performance test types are required for your project, so choose what your application really needs to be tested against
  • "Tester" is the role, not "performance tester"
  • Explore the tools, which are being improved from time to time to scale up to the testing challenges
Ravi Vadugu

An IT professional with over 12 years of experience. Project management is what I do. Curious about and interested in upcoming technologies, trends, software methodologies (e.g. Agile) and software tools. I love sharing knowledge with the rest of the community.
