Do you ever wonder how your website performs on end-user devices? How to plan for increasing website traffic? Or how to solve website problems that are seemingly invisible but impact your overall business? The answer to these questions is performance testing. While doing performance testing, it is also important to understand the term “concurrent,” which applies to several different aspects of performance testing and is often misunderstood, causing some significant measures to be overlooked.

Performance testing has been around for a long time, but it evolves with newer technologies every day. At its core, performance testing uses scripts to simulate real users interacting with a website. The interaction data is then captured and analyzed for insights into different aspects of website and application performance, such as response time, accessibility, reliability, uptime, resource usage, and scalability. Performance testing is done to make sure that the website meets its performance criteria in a stable state, and to learn how to improve and scale when the need arises. More importantly, it provides useful data about how the system performs under projected workloads. Performance testing will also uncover inconsistencies, inefficiencies, and usability issues when multiple requests are made concurrently.

 

Fundamental Performance Problems and Metrics

Let’s take a look at what performance problems you should rectify as a first step.

 

Load Time

Application load time is the time taken to fully load your website, application, or a single page before the user can perform any action. It is critical: with every second of delay, more users turn away from the website, resulting in revenue loss.
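The measurement itself can be sketched in a few lines. The snippet below is a minimal Python illustration, not how commercial tools do it; `measure_load_time` and the example URL are assumptions for the sake of the demo, and a real page load also includes rendering time that a raw fetch does not capture.

```python
import time
import urllib.request

def measure_load_time(fetch, *args, **kwargs):
    """Return (result, elapsed_seconds) for a single fetch call."""
    start = time.perf_counter()
    result = fetch(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed

# Example with a real URL (requires network access):
# page, seconds = measure_load_time(
#     lambda: urllib.request.urlopen("https://example.com", timeout=10).read()
# )
```

Running the same measurement repeatedly and looking at the distribution, rather than a single sample, gives a far more reliable picture of what users actually experience.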

 

Response Time

Response time refers to how long the server takes to respond to any user activity or transaction. Longer response times mean a frustrating experience for the user.

 

Resource Utilization and Bottlenecks

Your website or application should use resources efficiently and keep resource allocation manageable when there is high traffic or demand. Any resource, such as CPU, memory, or network, can become a bottleneck in many scenarios, bogging down the entire application.
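Memory is one bottleneck you can probe directly from code. The sketch below uses Python's standard `tracemalloc` module to compare the peak heap allocation of two workloads; the function names are hypothetical, and this tracks only Python-level allocations, not total process memory.

```python
import tracemalloc

def peak_memory_of(func, *args):
    """Run func and report its peak Python heap allocation in bytes."""
    tracemalloc.start()
    try:
        func(*args)
        _, peak = tracemalloc.get_traced_memory()
    finally:
        tracemalloc.stop()
    return peak

# A deliberately memory-hungry candidate for comparison:
def build_big_list(n):
    return [str(i) * 10 for i in range(n)]
```

Comparing the same workload at different input sizes this way is a quick check on whether memory use grows faster than you expect.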

 

Scalability

Your website or application should be able to handle the expected traffic during regular demand or during a special event. If it fails to sustain high demand, you have a scalability problem that should be analyzed and fixed using load testing.

 

Real-world Scenarios

Apart from these fundamental problems, there are many business-specific use cases that are directly related to performance. For example, if you have a trading application, improving website speed is not a one-time task: you need to proactively reduce response time, because even a few milliseconds can make or break an opportunity, causing financial loss either to your business or to your users.

The following list consists of some basic parameters that you need to measure, monitor, and analyze during performance testing:

  • Response time
  • CPU interrupts per second
  • Network output queue length
  • Network bytes total per second
  • Throughput
  • Maximum active sessions
  • Hit ratios
  • Database locks
  • Garbage collection
  • CPU usage
  • Memory usage
  • Disk I/O
  • Network bandwidth
  • Memory pages per second
  • Page faults per second
  • Concurrent HTTP benchmarks
  • Concurrent users
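As a small illustration of how several of these parameters are summarized, the Python sketch below reduces a list of response-time samples to the percentile figures most load-test reports use; the function name is an assumption for this example.

```python
import statistics

def summarize_response_times(samples_ms):
    """Summarize response times (in ms) the way load-test reports do:
    the average alone hides outliers, so include p95 and p99."""
    cuts = statistics.quantiles(samples_ms, n=100)  # 99 percentile cut points
    return {
        "mean": statistics.fmean(samples_ms),
        "p50": statistics.median(samples_ms),
        "p95": cuts[94],
        "p99": cuts[98],
        "max": max(samples_ms),
    }
```

For example, with 95 samples at 100 ms and 5 at 1,000 ms, the mean is a reassuring 145 ms while the p99 is 1,000 ms, which is exactly why tail percentiles belong in every report.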

 

Concurrent HTTP Connections vs. Concurrent Browsers vs. Concurrent Users

 

Concurrent HTTP

Concurrent HTTP refers to the number of HTTP requests in flight at any point in time. For example, if there are 10,000 users with valid sessions and 100 of them are requesting the same resource over HTTP at a given moment, then we have 100 concurrent HTTP requests.
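A rough way to reproduce this situation in a test script is to release a fixed number of requests at once and time each one. The Python sketch below is illustrative only: `fake_request` is a stand-in for a real HTTP call (e.g., `urllib.request.urlopen`), so the numbers it produces reflect the simulated delay, not a real server.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def fire_concurrent_requests(request_fn, concurrency):
    """Issue `concurrency` requests at (roughly) the same time and
    collect the per-request latencies."""
    def timed_call(i):
        start = time.perf_counter()
        request_fn(i)
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(timed_call, range(concurrency)))

# Placeholder standing in for a real HTTP call; it simulates
# 50 ms of server processing time.
def fake_request(_):
    time.sleep(0.05)
```

Ramping `concurrency` upward while watching the latency distribution is the basic shape of a concurrent HTTP test.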

 

Concurrent Browsers

Concurrent browsers refer to the number of browsers with valid sessions at any point in time. They can send any number of requests to the server at any point in time.

 

Concurrent Users

Concurrent users refers to the users with valid sessions on the server who are performing the same task at any point in time.

Usually, people confuse concurrent users with simultaneous users, as the two terms are used interchangeably, but in performance testing they have different meanings. Let’s take a look at an example:

Suppose there are 1,000 different users with a valid session on the server. Each of these users is performing a different operation, such as sign-in, checkout, or messaging. These are called simultaneous users, which is essentially the number of users with valid sessions on the server. Now, it might happen that 100 of these 1,000 users are performing a checkout operation at the same point in time. These 100 users would be concurrent users. Concurrent users are usually far fewer than simultaneous users, and such peaks occur infrequently.
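The distinction is easy to compute from session logs. Assuming, hypothetically, that each user's activity is recorded as a (start, end) time interval, the Python sketch below finds the peak concurrency for one activity, while the simultaneous user count is simply the number of intervals.

```python
def max_concurrent(intervals):
    """Given (start, end) times of one activity (e.g., checkout) per user,
    return the peak number of users doing it at the same moment."""
    events = []
    for start, end in intervals:
        events.append((start, 1))   # user begins the activity
        events.append((end, -1))    # user finishes
    active = peak = 0
    # Sorting puts (t, -1) before (t, +1), so an activity ending exactly
    # when another begins does not count as overlapping.
    for _, delta in sorted(events):
        active += delta
        peak = max(peak, active)
    return peak
```

With three sessions at (0, 10), (5, 15), and (20, 30), the simultaneous count is 3 but the concurrent peak is only 2, mirroring the 1,000-versus-100 example above.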

 

Load Testing: Speed, Scalability, and Stability

Load testing is one of the most important types of performance testing; it tests the website or application under a high traffic load. The data gathered from this testing is analyzed and projected to figure out the problems that may occur when a high number of real users access your website. It helps remove bottlenecks and optimize transactions, along with planning for the future scalability of your website or application infrastructure. Let’s look at some basic load testing types, how they differ, and why they matter.

 

HTTP Load Test

HTTP load testing is usually done to identify how many concurrent HTTP requests the server can handle. It can also be framed as the maximum number of requests per second. On a granular level, there might be different types of requests, such as reads, writes, compute-intensive operations, etc. Finding the limit for each specific request type gives you more insight into what optimization and resource planning you need to do. For example, the number of requests per second a server sustains can be high for read HTTP requests but will probably be a lot lower for compute-intensive requests.
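One way to approximate this measurement is to push a fixed batch of requests through a worker pool and divide by the elapsed time. The Python sketch below is a simplified illustration: `cheap_read` and `heavy_compute` are stand-ins (using `time.sleep`) for real read and compute-intensive requests, so it demonstrates the comparison rather than actual server limits.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def measure_rps(request_fn, total_requests, concurrency):
    """Drive `total_requests` calls through `concurrency` workers and
    return the achieved requests per second."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(request_fn, range(total_requests)))
    elapsed = time.perf_counter() - start
    return total_requests / elapsed

def cheap_read(_):       # stands in for a fast read request (~10 ms)
    time.sleep(0.01)

def heavy_compute(_):    # stands in for a compute-intensive request (~50 ms)
    time.sleep(0.05)
```

Repeating the measurement at increasing concurrency levels, per request type, shows where throughput stops growing, which is the saturation point the section above is describing.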

 

Web Page Load Test

A web page load test measures the loading time of a single page. For example, if you have an e-commerce website, you want to check individual product page load time, cart page load time, and checkout page load time to improve the customer experience. If your product page loads fine but you ignore optimization on the cart page, it will definitely result in lost sales.

 

Web Application Load Test

A web application load test measures the first load of your web application. It is different from a page load test, which you run for every individual page. When a web application starts, it pulls in different resources, initiates a few site-wide services, calls third-party services, and so on before it finally loads. Optimizing web application load time should be your first focus to prevent churn.

 

Final Thoughts: Concurrent HTTP vs. Concurrent Browsers vs. Concurrent Users

Load testing is a necessity that helps developers and architects with optimization and resource planning. For web applications that expect peak traffic, it becomes even more important. Apart from load testing, it is also important to regularly monitor your website or application for accessibility, speed, and the uptime of third-party services. Don’t forget to load test and monitor your website or application from different geo-locations to improve it further for users, as they might have performance issues specific to their location. Using a solution like LoadView allows you to easily load test all your web pages, applications, web services, servers, and APIs with hundreds to thousands of concurrent HTTP connections or browsers.

Try the LoadView free trial and receive $20 in load testing credits. Or schedule a live demo with one of our performance engineers for a full walkthrough of the LoadView platform to see all the features and benefits the platform has to offer!