What is Performance Testing?
It is a process of determining the speed, responsiveness, and stability of a computer, software application, network or device under a workload.
Why is performance testing so important nowadays?
With so many people connected to the Internet, most software applications must be able to support heavy user traffic.
An organization can also use performance testing as a diagnostic aid to locate computing or communications bottlenecks within a system.
What is Apache JMeter?
There are many tools to do performance testing, but the most famous framework is JMeter. It is an open source software, a 100% pure Java application designed to load test functional behavior and measure performance.
Some of its features are:
- Ability to load and performance test many different application/server/protocol types (HTTP(S), SOAP/REST web services, databases via JDBC, LDAP, shell scripts, FTP, TCP)
- IDE to do recordings
- CLI mode
- Full multithreading framework
- Dynamic HTML report
Some concepts about JMeter:
- Test Plan: The container of the JMeter script, which holds all the test cases. It allows running thread groups consecutively (one at a time).
- WorkBench: A temporary area for experimenting with test elements that are not part of the Test Plan (removed in JMeter 4.0, where everything lives in the Test Plan).
- Thread Group: Defines the number of virtual users (like browser instances) that will run the test case. Thread groups always process their elements in order. The controls for a thread group allow you to:
- Set the number of threads (users)
- Set the ramp-up period (the time JMeter takes to start all the threads)
- Set the number of times to execute the test
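As an illustration of how the ramp-up period spreads out thread starts (JMeter starts threads evenly across the ramp-up window), here is a small Python sketch; the function name and the example numbers are invented for illustration:

```python
def thread_start_offsets(num_threads, ramp_up_seconds):
    """Start offset (in seconds) of each thread.

    JMeter spaces thread starts evenly across the ramp-up period:
    with 10 threads and a 100 s ramp-up, a new thread starts
    roughly every 10 s.
    """
    interval = ramp_up_seconds / num_threads
    return [i * interval for i in range(num_threads)]

print(thread_start_offsets(10, 100))
# [0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0, 90.0]
```

Note that all threads being started is not the same as all requests being finished; once started, each thread keeps looping through its samplers.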
- setUp and tearDown Thread Groups: Run before or after all the regular thread groups, to prepare or clean up test data
- Logic Controllers: Define the order in which the requests in a thread are processed.
- Simple Controller: It just groups the requests
- Loop Controller: Iterates over the requests inside it N times
- Random Controller: Picks one of the requests inside it at random and executes only that one
- Random Order Controller: Runs all the requests inside it, but in a random order
- Runtime Controller: Runs the requests inside it for a given amount of time
- Interleave Controller: Executes only one of the requests inside it on each loop iteration, alternating between them
- Once Only Controller: Executes the requests inside it only once per thread, even inside a loop
- Module Controller: Jumps to or calls other parts of the test plan (to reuse requests)
- Transaction Controller: It organizes different segments of your test and determines how those segments will appear in a report.
- Config Elements: Where you manage the config values, like cookies, CSV data, etc.
- HTTP Request Defaults: Defines default values that the HTTP Request elements use. This avoids data duplication and makes test scripts easier to maintain.
- HTTP Cache Manager: Adds caching functionality to HTTP requests within its scope, to simulate browser cache behavior.
- HTTP Cookie Manager: Stores and sends cookies just like a web browser (useful for login sessions).
- User Defined Variables: Used to define test data as name/value pairs
- Timers: Used to simulate real user pacing and think time.
- Constant Timer: Waits N milliseconds before each request
- Gaussian Random Timer: Adds a random delay that follows a Gaussian (normal) distribution around a constant offset
- Synchronizing Timer: Blocks threads until a given number X of threads have reached a specific point, then releases them all at once
- Constant Throughput Timer: Paces the samplers under its influence so that the total number of samples per unit of time stays as close as possible to a given constant.
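Two of these timers boil down to simple arithmetic, sketched below in Python (the helper names are invented): the Constant Throughput Timer's "Target throughput" field is expressed in samples per minute, and the Gaussian Random Timer adds a normally distributed deviation to a constant offset:

```python
import random

def constant_throughput_value(target_rps):
    # The timer's "Target throughput" field is in samples per MINUTE,
    # so a goal of 5 requests/second becomes 300.
    return target_rps * 60

def gaussian_delay_ms(constant_offset_ms, deviation_ms):
    # Gaussian Random Timer: constant offset plus a normally
    # distributed random deviation (mean 0, std dev = deviation_ms).
    return constant_offset_ms + random.gauss(0, deviation_ms)

print(constant_throughput_value(5))  # 300
```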
- Pre and Post Processors: Actions to run before or after each sampler.
- HTML Link Parser: A preprocessor that parses the HTML response to extract data (links, form values) and use it dynamically in subsequent requests.
- Result Status Action Handler: Lets you stop the thread or the whole test if the relevant sampler fails.
- Regular Expression Extractor: A post processor that extracts information from the response into a variable.
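To make the Regular Expression Extractor concrete, here is a plain-Python sketch of what it does (the response body, pattern, and variable name are invented for illustration): the first capture group of the pattern becomes the value stored in the JMeter variable:

```python
import re

# A hypothetical HTML response containing a CSRF token.
response_body = '<input type="hidden" name="csrf" value="a1b2c3">'

# The same pattern you would type into the extractor's
# "Regular Expression" field; group 1 is the extracted value.
match = re.search(r'name="csrf" value="([^"]+)"', response_body)
token = match.group(1) if match else None
print(token)  # a1b2c3
```

In JMeter, the extracted value would then be referenced in later requests as `${token}`.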
- Samplers: The request that you want to do.
- HTTP Request: It lets you send an HTTP/HTTPS request to a web server.
- Debug Sampler: Used to debug the script, especially to verify that all the JMeter variables hold the right data.
- BeanShell Sampler: This sampler allows you to write a sampler using Java (or from a .jar).
- Assertions: Check whether the actual results match the expected results.
- Size Assertion: Compares the response size in bytes against an expected value
- Duration Assertion: Checks that the request completes within a given time
- Response Assertion: Adds pattern strings to be compared against various fields of the server response
- Listeners: Generate reports from the executed test cases
Some examples: View Results Tree, Aggregate Report and Graph (average response time, throughput, i.e. how many requests the server handles per unit of time), Simple Data Writer (it just logs the results to a flat file), Assertion Results
Custom Graphs: jp@gc - Active Threads Over Time, jp@gc - Response Times Over Time, jp@gc - Composite Graph (combine graphs)
Run JMeter Non-GUI Mode:
$ jmeter -n -JHOST=google.com -t examples/GoogleSimpleRequest.jmx
$ jmeter -n -JHOST=google.com -t examples/GoogleSimpleRequest.jmx -l jmeter.jtl -e -o results
Note: The .jtl file has all the needed information to generate reports on any listener.
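The -JHOST=google.com flag in the commands above sets a JMeter property named HOST; for it to have any effect, the test plan must read that property somewhere, typically with the __P function (using it in the HTTP Request's "Server Name or IP" field is an assumption about how this sample plan is built):

```
${__P(HOST,localhost)}
```

The second argument (localhost here) is the default value used when the property is not set on the command line.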
JMeter Distributed Testing

Each remote machine must be running jmeter-server. The -r flag runs the test on all remote servers listed in the remote_hosts property of jmeter.properties, while -R runs it on the servers given on the command line:

$ jmeter -n -JHOST=google.com -t examples/GoogleSimpleRequest.jmx -l jmeter.jtl -e -o results -r
$ jmeter -n -JHOST=google.com -t examples/GoogleSimpleRequest.jmx -l jmeter.jtl -e -o results -R 127.0.0.1,192.168.0.7:1099
Best practices:
- Delay thread creation until needed (a Thread Group option), so threads are only created when their ramp-up time arrives
- Avoid GUI mode for large loads; use non-GUI (CLI) mode instead
- Avoid the View Results Tree and View Results in Table listeners during load tests; use them only for debugging
- Use Loop Controllers instead of duplicating the same requests
- Use dynamic test data from .csv files
- For saving listener output, use the .csv format. Disable the .xml format because it produces much larger files and takes longer to write and parse.
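One way to apply the last point is through user.properties (or an equivalent -J flag); the property below is part of JMeter's standard result-saving configuration, though the exact set of properties you tune will depend on your reporting needs:

```
# user.properties: save listener/results output as CSV instead of XML
jmeter.save.saveservice.output_format=csv
```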
Useful plugins in JMeter:
- jpgc - Standard Set
- Custom Thread Groups
- Selenium/WebDriver Support