
Sometimes a Performance Problem Can’t Be Reproduced In a Test Bed

I attended the Greater Boston Computer Measurement Group’s fall conference a few weeks ago and was pleasantly surprised by the variety of topics discussed. Among the plethora of bullet points presented, however, one value proposition sticks out in my mind. During a presentation on using application analytics for performance management and capacity planning, the speaker stated: “Sometimes a performance problem can’t be reproduced in a test bed.” She then outlined her exhaustive research on how application analytics could be used to fill such gaps. Not surprisingly, most of the audience remained dubious at best. Let’s examine the truth of the original statement. It seems obvious to even the most ambitious of test engineers, yet I would argue it is forgotten more often than not, and that forgetting it can lead to significant costs.

Today’s IT landscape is characterized by two concurrent trends: the proliferation and widespread use of mobile devices, and the adoption of cloud-based technologies. In this landscape, performance testing has never been more important. However, the complexity of such environments makes these tests significantly harder to execute. First, it is quite easy to get sidetracked as problems arise, or to get lost in the data. Second, it is incredibly difficult to replicate the end users’ experience. Third, stakeholders may not be on the same page about potential spikes in traffic: Marketing launches a brand new online campaign, say, without telling the IT folks. Bearing the second and third points in mind, I would argue it is nearly impossible to test for such occurrences (the sketch below illustrates why). In response to such challenges, a CIO of a major online book retailer recently told me they had stopped doing almost all of their performance testing. They calculated that the investment required to test their lower-priority applications well was far higher than the potential damage of those applications failing. Instead, they spent the money saved on better monitoring tools for their production applications!
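
To make the third point concrete, here is a minimal sketch (my own illustration, not anything presented at the conference) of a load generator that replays the kind of unannounced traffic surge a marketing campaign might cause. The target URL, request rates, and spike shape are all illustrative assumptions; the difficulty it exposes is that the “right” spike profile is unknowable before it happens in production.

```python
# Minimal sketch: baseline load followed by a sudden 20x spike.
# TARGET, rates, and durations are hypothetical assumptions.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://localhost:8080/"  # hypothetical endpoint under test

def hit(_):
    """Issue one GET and return (latency_seconds, error_or_None)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(TARGET, timeout=5) as resp:
            resp.read()
        return time.monotonic() - start, None
    except Exception as exc:
        return time.monotonic() - start, exc

def run_phase(name, requests_per_sec, seconds, workers=50):
    """Fire a fixed request rate for a while; report p95 latency and errors."""
    latencies, errors = [], 0
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(seconds):
            tick = time.monotonic()
            for lat, err in pool.map(hit, range(requests_per_sec)):
                latencies.append(lat)
                errors += err is not None
            # Sleep off the remainder of the one-second tick, if any.
            time.sleep(max(0.0, 1.0 - (time.monotonic() - tick)))
    latencies.sort()
    p95 = latencies[int(0.95 * (len(latencies) - 1))]
    print(f"{name}: {len(latencies)} reqs, p95={p95:.3f}s, errors={errors}")

if __name__ == "__main__":
    run_phase("baseline", requests_per_sec=10, seconds=10)
    run_phase("spike", requests_per_sec=200, seconds=10)  # unannounced 20x surge
```

Even a crude script like this can tell you how a system degrades under one assumed spike, but it cannot tell you which spike to assume, which is exactly why production monitoring ends up filling the gap.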

Effective testing does not necessarily identify all problems, let alone prevent them from occurring (to the ire of test engineers worldwide). To that expectation I say: good riddance! The goal of a testing campaign is to mitigate the likely challenges that may affect deployment. Wasting time testing for unlikely events, getting bogged down in data, or fixating on one problem not only slows progress but can hamper ROI. No matter how much testing you do, it is impossible to see the future before it happens. Just ask Target, Bank of America, RIM, etc.

What are your thoughts on this value proposition? What are the most challenging aspects of performance testing?


