Explain some load testing metrics

  • Throughput – calculated as requests per unit of time. The time is measured from the start of the first sample to the end of the last sample, including any intervals between samples, since it is meant to represent the load on the server. Throughput = (number of requests) / (total time). A worked example appears after this list.
  • Connect Time – measures the time it took to establish the connection, including the SSL handshake.
  • Response time – the elapsed time from the moment a given request is sent to the server until the moment the last bit of information has returned to the client.
  • Average response time – to get the average response time, sum the response times of all samplers and divide by the number of samplers. A sampler here means a request or hit.
  • Min – the minimal response time across all samplers; in other words, the fastest response.
  • Max – opposite of Min, the slowest response.
  • Median – is a number which divides the samples into two equal halves. Half of the samples are smaller than the median, and half are larger.
  • Error % – the percentage of requests that returned an error HTTP response code.
  • Elapsed time – measures the time from just before sending the request to just after the last chunk of the response has been received.
  • Latency – measures the time from just before sending the request to just after the first chunk of the response has been received (see the timing sketch after this list).
  • 90% Line (90th Percentile) – the elapsed time below which 90% of the samples fall.
  • Standard Deviation – a measure of the variability of the response times; a standard statistical measure.
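
As a worked example of the formulas above, here is a minimal Python sketch that computes throughput, average, min, max, median, the 90% line, standard deviation, and error % from a list of sampler response times. The sample values, test duration, and error count are made up for illustration, and the 90% line uses a simple nearest-rank approximation rather than any particular tool's exact algorithm.

```python
import statistics

# Hypothetical response times (ms) collected from 10 samplers during a test run
response_times = [120, 135, 150, 110, 480, 140, 125, 160, 130, 145]
total_test_time_s = 5.0   # seconds from start of first sample to end of last sample
error_count = 1           # samplers that returned an error HTTP response code

throughput = len(response_times) / total_test_time_s     # requests per second
average = sum(response_times) / len(response_times)      # average response time
fastest = min(response_times)                            # Min
slowest = max(response_times)                            # Max
median = statistics.median(response_times)               # half faster, half slower
# 90% Line: nearest-rank approximation of the value 90% of samples fall below
p90 = sorted(response_times)[int(0.9 * len(response_times)) - 1]
std_dev = statistics.pstdev(response_times)              # Standard Deviation
error_pct = 100.0 * error_count / len(response_times)    # Error %

print(f"Throughput: {throughput:.1f} req/s, Avg: {average:.1f} ms, "
      f"Min: {fastest} ms, Max: {slowest} ms, Median: {median} ms, "
      f"90% Line: {p90} ms, StdDev: {std_dev:.1f} ms, Errors: {error_pct:.0f}%")
```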
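
To make the difference between Connect Time, Latency, and Elapsed time concrete, here is a rough timing sketch using Python's standard http.client. The host name is just a placeholder, and the measurements are approximate; real load testing tools instrument these points at a lower level.

```python
import http.client
import time

HOST = "example.com"  # placeholder target; any reachable HTTPS server will do

start = time.perf_counter()
conn = http.client.HTTPSConnection(HOST, timeout=10)
conn.connect()                                    # TCP connect + SSL handshake
connect_time = time.perf_counter() - start        # Connect Time

send_start = time.perf_counter()
conn.request("GET", "/")
response = conn.getresponse()
response.read(1)                                  # first chunk of the response body
latency = time.perf_counter() - send_start        # Latency (time to first chunk)

response.read()                                   # drain the rest of the body
elapsed_time = time.perf_counter() - send_start   # Elapsed time (last chunk received)
conn.close()

print(f"Connect: {connect_time * 1000:.0f} ms, "
      f"Latency: {latency * 1000:.0f} ms, Elapsed: {elapsed_time * 1000:.0f} ms")
```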