Complete Guide to Performance Testing in Java
Performance testing ensures applications meet speed, scalability, and stability requirements under various load conditions. This guide covers tools, techniques, and best practices for performance testing Java applications.
Table of Contents
- Performance Testing Fundamentals
- JMeter for Load Testing
- Microbenchmarking with JMH
- Application Profiling
- Database Performance Testing
- Spring Boot Performance Testing
- Best Practices
Performance Testing Fundamentals
Types of Performance Testing
// Load Testing - Normal expected load
// Stress Testing - Beyond normal capacity
// Spike Testing - Sudden load increases
// Volume Testing - Large amounts of data
// Endurance Testing - Extended periods
public class PerformanceTestExample {
@Test
void loadTest() {
// Test with expected concurrent users (e.g., 100 users)
simulateUsers(100, Duration.ofMinutes(10));
}
@Test
void stressTest() {
// Test beyond normal capacity (e.g., 500 users)
simulateUsers(500, Duration.ofMinutes(5));
}
@Test
void spikeTest() {
// Sudden increase from 10 to 200 users
simulateSpike(10, 200, Duration.ofSeconds(30));
}
}
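The simulateUsers and simulateSpike calls above are placeholder helpers, not a library API. A minimal sketch of simulateUsers, assuming the operation under test is wrapped in a hypothetical callUserEndpoint() method, could look like this:
// Sketch of a simulateUsers helper: one task per virtual user,
// each looping against the target until the duration elapses.
private void simulateUsers(int users, Duration duration) throws InterruptedException {
    ExecutorService executor = Executors.newFixedThreadPool(users);
    long deadline = System.nanoTime() + duration.toNanos();
    for (int i = 0; i < users; i++) {
        executor.submit(() -> {
            while (System.nanoTime() < deadline) {
                callUserEndpoint(); // hypothetical: the operation being load tested
            }
        });
    }
    executor.shutdown();
    executor.awaitTermination(duration.toSeconds() + 30, TimeUnit.SECONDS);
}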
Key Performance Metrics
public class PerformanceMetrics {
// Response Time Metrics
public static class ResponseTime {
private long averageMs;
private long p95Ms; // 95th percentile
private long p99Ms; // 99th percentile
private long maxMs;
// Getters and setters
}
// Throughput Metrics
public static class Throughput {
private double requestsPerSecond;
private double transactionsPerSecond;
private long totalRequests;
// Getters and setters
}
// Resource Utilization
public static class ResourceUsage {
private double cpuPercent;
private long memoryUsedMB;
private double diskIOps;
private long networkBytesPerSec;
// Getters and setters
}
}
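The percentile fields are derived from the raw samples; a simple nearest-rank computation over a list of response times might look like this sketch:
// Nearest-rank percentile over response-time samples (e.g. 95 for p95, 99 for p99).
public static long percentile(List<Long> samplesMs, double pct) {
    List<Long> sorted = new ArrayList<>(samplesMs);
    Collections.sort(sorted);
    int index = (int) Math.ceil(pct / 100.0 * sorted.size()) - 1;
    return sorted.get(Math.max(index, 0));
}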
JMeter for Load Testing
JMeter Test Plan Structure
<!-- Basic JMeter test plan for REST API -->
<?xml version="1.0" encoding="UTF-8"?>
<jmeterTestPlan version="1.2">
<hashTree>
<TestPlan testname="User API Load Test">
<elementProp name="TestPlan.arguments" elementType="Arguments" guiclass="ArgumentsPanel"/>
<boolProp name="TestPlan.functional_mode">false</boolProp>
<boolProp name="TestPlan.serialize_threadgroups">false</boolProp>
</TestPlan>
<hashTree>
<!-- Thread Group (Virtual Users) -->
<ThreadGroup testname="User Load">
<stringProp name="ThreadGroup.on_sample_error">continue</stringProp>
<elementProp name="ThreadGroup.main_controller" elementType="LoopController">
<boolProp name="LoopController.continue_forever">false</boolProp>
<stringProp name="LoopController.loops">10</stringProp>
</elementProp>
<stringProp name="ThreadGroup.num_threads">50</stringProp>
<stringProp name="ThreadGroup.ramp_time">30</stringProp>
</ThreadGroup>
<hashTree>
<!-- HTTP Request -->
<HTTPSamplerProxy testname="Get Users">
<elementProp name="HTTPsampler.Arguments" elementType="Arguments">
<collectionProp name="Arguments.arguments"/>
</elementProp>
<stringProp name="HTTPSampler.domain">localhost</stringProp>
<stringProp name="HTTPSampler.port">8080</stringProp>
<stringProp name="HTTPSampler.path">/api/users</stringProp>
<stringProp name="HTTPSampler.method">GET</stringProp>
</HTTPSamplerProxy>
<!-- Response Assertions -->
<ResponseAssertion testname="Status Code 200">
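<!-- Note: "Asserion.test_strings" (sic) is the property name JMeter itself writes; do not "correct" the spelling -->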
<collectionProp name="Asserion.test_strings">
<stringProp>200</stringProp>
</collectionProp>
<stringProp name="Assertion.test_field">Assertion.response_code</stringProp>
</ResponseAssertion>
</hashTree>
</hashTree>
</hashTree>
</jmeterTestPlan>
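Saved as a .jmx file, this plan runs in non-GUI mode with jmeter -n -t user-load-test.jmx -l results.jtl (the file name here is just an example). Non-GUI mode is the recommended way to generate real load; the GUI is for building and debugging plans.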
Programmatic JMeter Tests
// Using JMeter programmatically
public class JMeterTestRunner {
public void runLoadTest() {
// Create Test Plan
TestPlan testPlan = new TestPlan("User API Load Test");
// Create Thread Group
ThreadGroup threadGroup = new ThreadGroup(); // org.apache.jmeter.threads.ThreadGroup, not java.lang.ThreadGroup
threadGroup.setName("User Load");
threadGroup.setNumThreads(50);
threadGroup.setRampUp(30);
threadGroup.setSamplerController(createLoopController(10));
// Create HTTP Request
HTTPSamplerProxy httpSampler = new HTTPSamplerProxy();
httpSampler.setDomain("localhost");
httpSampler.setPort(8080);
httpSampler.setPath("/api/users");
httpSampler.setMethod("GET");
// Build test tree
HashTree testPlanTree = new HashTree();
testPlanTree.add(testPlan);
HashTree threadGroupHashTree = testPlanTree.add(testPlan, threadGroup);
threadGroupHashTree.add(httpSampler);
// Run test
StandardJMeterEngine jmeter = new StandardJMeterEngine();
jmeter.configure(testPlanTree);
jmeter.run();
}
private LoopController createLoopController(int loops) {
LoopController loopController = new LoopController();
loopController.setLoops(loops);
loopController.setFirst(true);
loopController.initialize();
return loopController;
}
}
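StandardJMeterEngine expects JMeter's properties to be loaded before the test runs, so programmatic tests typically begin with an initialization step along these lines (the install path is an assumption; point it at your local JMeter home):
// Assumed local JMeter installation path -- adjust as needed.
String jmeterHome = "/opt/apache-jmeter-5.6";
JMeterUtils.setJMeterHome(jmeterHome);
JMeterUtils.loadJMeterProperties(jmeterHome + "/bin/jmeter.properties");
JMeterUtils.initLocale();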
Microbenchmarking with JMH
JMH Setup and Basic Benchmarks
<!-- Maven dependency for JMH -->
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-core</artifactId>
<version>1.37</version>
</dependency>
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-generator-annprocess</artifactId>
<version>1.37</version>
</dependency>
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
@State(Scope.Benchmark)
@Fork(value = 2, jvmArgs = {"-Xms2G", "-Xmx2G"})
@Warmup(iterations = 3)
@Measurement(iterations = 5)
public class StringConcatenationBenchmark {
private static final int N = 1000;
@Benchmark
public String stringConcatenation() {
String result = "";
for (int i = 0; i < N; i++) {
result += "test";
}
return result;
}
@Benchmark
public String stringBuilder() {
StringBuilder sb = new StringBuilder();
for (int i = 0; i < N; i++) {
sb.append("test");
}
return sb.toString();
}
@Benchmark
public String stringBuffer() {
StringBuffer sb = new StringBuffer();
for (int i = 0; i < N; i++) {
sb.append("test");
}
return sb.toString();
}
public static void main(String[] args) throws Exception {
Options opt = new OptionsBuilder()
.include(StringConcatenationBenchmark.class.getSimpleName())
.build();
new Runner(opt).run();
}
}
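Returning the built string from each benchmark method is deliberate: it prevents the JIT compiler from treating the loop as dead code and optimizing it away. When a benchmark produces several values, JMH's Blackhole serves the same purpose, as in this variant:
// Alternative to returning the result: sink it into a Blackhole
// (org.openjdk.jmh.infra.Blackhole) so the work cannot be eliminated.
@Benchmark
public void stringBuilderWithBlackhole(Blackhole bh) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < N; i++) {
        sb.append("test");
    }
    bh.consume(sb.toString());
}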
Advanced JMH Features
@State(Scope.Benchmark)
public class CollectionBenchmark {
@Param({"10", "100", "1000", "10000"})
private int size;
private List<String> arrayList;
private List<String> linkedList;
private List<String> data;
@Setup
public void setup() {
data = IntStream.range(0, size)
.mapToObj(String::valueOf)
.collect(toList());
arrayList = new ArrayList<>(data);   // pre-populated so the get() benchmarks below have elements to read
linkedList = new LinkedList<>(data);
}
@Benchmark
public List<String> arrayListAdd() {
List<String> list = new ArrayList<>();
for (String item : data) {
list.add(item);
}
return list;
}
@Benchmark
public List<String> linkedListAdd() {
List<String> list = new LinkedList<>();
for (String item : data) {
list.add(item);
}
return list;
}
@Benchmark
@BenchmarkMode(Mode.AverageTime)
public String arrayListGet() {
return arrayList.get(size / 2);
}
@Benchmark
@BenchmarkMode(Mode.AverageTime)
public String linkedListGet() {
return linkedList.get(size / 2);
}
}
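The @Param values can be narrowed at run time through the Options API rather than by editing the annotation; for example, to benchmark only two of the declared sizes:
// Restrict the parameterized run to two of the declared sizes.
Options opt = new OptionsBuilder()
        .include(CollectionBenchmark.class.getSimpleName())
        .param("size", "100", "10000")
        .build();
new Runner(opt).run();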
Application Profiling
Memory Profiling
public class MemoryProfiler {
public static void analyzeMemoryUsage() {
MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
MemoryUsage heapUsage = memoryBean.getHeapMemoryUsage();
MemoryUsage nonHeapUsage = memoryBean.getNonHeapMemoryUsage();
System.out.println("=== Heap Memory ===");
printMemoryUsage("Heap", heapUsage);
System.out.println("=== Non-Heap Memory ===");
printMemoryUsage("Non-Heap", nonHeapUsage);
// Memory pools
List<MemoryPoolMXBean> memoryPools = ManagementFactory.getMemoryPoolMXBeans();
for (MemoryPoolMXBean pool : memoryPools) {
System.out.println("Pool: " + pool.getName());
printMemoryUsage(pool.getName(), pool.getUsage());
}
}
private static void printMemoryUsage(String name, MemoryUsage usage) {
    if (usage != null) {
        long max = usage.getMax(); // -1 means the maximum is undefined for this pool
        System.out.printf("%s - Used: %d MB, Max: %s, Usage: %s%n",
                name,
                usage.getUsed() / 1024 / 1024,
                max < 0 ? "undefined" : (max / 1024 / 1024) + " MB",
                max < 0 ? "n/a" : String.format("%.2f%%", (double) usage.getUsed() / max * 100));
    }
}
}
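Garbage-collection pressure matters as much as raw heap usage; the same java.lang.management API reports per-collector counts and accumulated pause time:
// Report collection counts and total collection time for each garbage collector.
public static void analyzeGarbageCollection() {
    for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
        System.out.printf("GC %s - collections: %d, total time: %d ms%n",
                gc.getName(), gc.getCollectionCount(), gc.getCollectionTime());
    }
}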
CPU Profiling
public class CPUProfiler {
private static final ThreadMXBean threadBean = ManagementFactory.getThreadMXBean();
public static void profileMethod(Runnable method, String methodName) {
if (!threadBean.isCurrentThreadCpuTimeSupported()) {
System.out.println("CPU time measurement not supported");
return;
}
long startCpuTime = threadBean.getCurrentThreadCpuTime();
long startUserTime = threadBean.getCurrentThreadUserTime();
long startWallTime = System.nanoTime();
method.run();
long endCpuTime = threadBean.getCurrentThreadCpuTime();
long endUserTime = threadBean.getCurrentThreadUserTime();
long endWallTime = System.nanoTime();
System.out.printf("=== %s Performance ===\n", methodName);
System.out.printf("CPU Time: %.2f ms\n", (endCpuTime - startCpuTime) / 1_000_000.0);
System.out.printf("User Time: %.2f ms\n", (endUserTime - startUserTime) / 1_000_000.0);
System.out.printf("Wall Time: %.2f ms\n", (endWallTime - startWallTime) / 1_000_000.0);
}
}
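Usage is a matter of wrapping the workload in a Runnable; the sort below is just an illustrative workload:
// Example: profile sorting five million random ints.
CPUProfiler.profileMethod(
        () -> Arrays.sort(new Random(42).ints(5_000_000).toArray()),
        "sort 5M ints");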
Database Performance Testing
Connection Pool Testing
@SpringBootTest
public class DatabasePerformanceTest {
@Autowired
private DataSource dataSource;
@Test
void testConnectionPoolPerformance() throws SQLException {
int numThreads = 50;
int operationsPerThread = 100;
ExecutorService executor = Executors.newFixedThreadPool(numThreads);
CountDownLatch latch = new CountDownLatch(numThreads);
long startTime = System.currentTimeMillis();
for (int i = 0; i < numThreads; i++) {
executor.submit(() -> {
try {
for (int j = 0; j < operationsPerThread; j++) {
try (Connection conn = dataSource.getConnection();
PreparedStatement stmt = conn.prepareStatement("SELECT 1")) {
ResultSet rs = stmt.executeQuery();
rs.next();
}
}
} catch (SQLException e) {
e.printStackTrace();
} finally {
latch.countDown();
}
});
}
try {
latch.await();
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
long endTime = System.currentTimeMillis();
long totalOperations = numThreads * operationsPerThread;
double opsPerSecond = totalOperations / ((endTime - startTime) / 1000.0);
System.out.printf("Total operations: %d\n", totalOperations);
System.out.printf("Time taken: %d ms\n", endTime - startTime);
System.out.printf("Operations per second: %.2f\n", opsPerSecond);
executor.shutdown();
}
}
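When the pool is HikariCP (Spring Boot's default), its MXBean shows whether threads were starved for connections during the run. A sketch, assuming the DataSource can be unwrapped to a HikariDataSource:
// Inspect HikariCP pool state after the load (assumes HikariCP is the pool in use).
HikariDataSource hikari = dataSource.unwrap(HikariDataSource.class);
HikariPoolMXBean pool = hikari.getHikariPoolMXBean();
System.out.printf("Active: %d, Idle: %d, Threads awaiting connection: %d%n",
        pool.getActiveConnections(),
        pool.getIdleConnections(),
        pool.getThreadsAwaitingConnection());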
Query Performance Testing
@Test
void testQueryPerformance() {
// Test different query strategies
long startTime, endTime;
// Strategy 1: Individual queries
startTime = System.currentTimeMillis();
for (int i = 1; i <= 1000; i++) {
userRepository.findById((long) i);
}
endTime = System.currentTimeMillis();
System.out.println("Individual queries: " + (endTime - startTime) + " ms");
// Strategy 2: Batch query
startTime = System.currentTimeMillis();
List<Long> ids = LongStream.rangeClosed(1, 1000).boxed().collect(toList());
userRepository.findByIdIn(ids);
endTime = System.currentTimeMillis();
System.out.println("Batch query: " + (endTime - startTime) + " ms");
}
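The batch strategy assumes a derived query on the repository; with Spring Data JPA that is a single method declaration:
// Assumed repository method backing the batch strategy above.
public interface UserRepository extends JpaRepository<User, Long> {
    List<User> findByIdIn(List<Long> ids);
}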
Spring Boot Performance Testing
REST Endpoint Performance Testing
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class WebPerformanceTest {
@Autowired
private TestRestTemplate restTemplate;
@LocalServerPort
private int port;
@Test
void testEndpointPerformance() throws InterruptedException {
String url = "http://localhost:" + port + "/api/users";
int numRequests = 1000;
int numThreads = 10;
ExecutorService executor = Executors.newFixedThreadPool(numThreads);
CountDownLatch latch = new CountDownLatch(numRequests);
List<Long> responseTimes = Collections.synchronizedList(new ArrayList<>());
long startTime = System.currentTimeMillis();
for (int i = 0; i < numRequests; i++) {
executor.submit(() -> {
try {
long requestStart = System.currentTimeMillis();
ResponseEntity<String> response = restTemplate.getForEntity(url, String.class);
long requestEnd = System.currentTimeMillis();
if (response.getStatusCode().is2xxSuccessful()) {
responseTimes.add(requestEnd - requestStart);
}
} finally {
latch.countDown();
}
});
}
latch.await();
executor.shutdown();
long endTime = System.currentTimeMillis();
// Calculate statistics
responseTimes.sort(Long::compareTo);
double avgResponseTime = responseTimes.stream().mapToLong(Long::longValue).average().orElse(0);
long p95ResponseTime = responseTimes.get((int) (responseTimes.size() * 0.95));
double throughput = numRequests / ((endTime - startTime) / 1000.0);
System.out.printf("Total requests: %d\n", numRequests);
System.out.printf("Average response time: %.2f ms\n", avgResponseTime);
System.out.printf("95th percentile: %d ms\n", p95ResponseTime);
System.out.printf("Throughput: %.2f requests/second\n", throughput);
}
}
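One caveat with this style of test: the first requests pay for JIT compilation, connection setup, and lazy initialization, so consider firing a batch of untimed warm-up requests before startTime is captured; otherwise the average and the 95th percentile will be skewed high.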
Best Practices
Performance Testing Guidelines
public class PerformanceTestingBestPractices {
// 1. Establish Performance Baselines
@Test
void establishBaseline() {
// Measure current performance before optimization
long baseline = measurePerformance(() -> {
// Current implementation
processData();
});
// Store baseline for comparison
storeBaseline("processData", baseline);
}
// 2. Test in Production-like Environment
@Test
@Tag("performance") // @Profile does not activate test profiles; tag the test for selective execution (or use @ActiveProfiles on the class)
void testWithProductionConfig() {
// Use production-like data volumes
// Use production-like hardware specs
// Use production-like network conditions
}
// 3. Identify Performance Bottlenecks
@Test
void identifyBottlenecks() {
PerformanceProfiler profiler = new PerformanceProfiler();
profiler.start();
performBusinessOperation();
PerformanceReport report = profiler.stop();
// Analyze where time is spent
System.out.println("Database time: " + report.getDatabaseTime());
System.out.println("CPU time: " + report.getCpuTime());
System.out.println("Network time: " + report.getNetworkTime());
}
// 4. Monitor Resource Utilization
@Test
void monitorResources() {
ResourceMonitor monitor = new ResourceMonitor();
monitor.startMonitoring();
runLoadTest();
ResourceUsage usage = monitor.stopMonitoring();
assertThat(usage.getMaxCpuUsage()).isLessThan(80.0);
assertThat(usage.getMaxMemoryUsage()).isLessThan(85.0);
}
// 5. Performance Regression Testing
@Test
void preventPerformanceRegression() {
long currentPerformance = measurePerformance(() -> {
optimizedProcessData();
});
long baseline = getStoredBaseline("processData");
// Ensure performance hasn't degraded by more than 10%
double degradation = (double) (currentPerformance - baseline) / baseline * 100;
assertThat(degradation).isLessThan(10.0);
}
}
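measurePerformance, storeBaseline, and getStoredBaseline are placeholders for project-specific helpers. A minimal measurePerformance can simply time a Runnable (for anything finer-grained than milliseconds, prefer a JMH benchmark):
// Minimal sketch of the measurePerformance helper referenced above.
private long measurePerformance(Runnable workload) {
    long start = System.nanoTime();
    workload.run();
    return (System.nanoTime() - start) / 1_000_000; // elapsed time in milliseconds
}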
Common Performance Anti-Patterns
public class PerformanceAntiPatterns {
// ANTI-PATTERN 1: N+1 Query Problem
public void badLoadUsers() {
List<User> users = userRepository.findAll(); // 1 query
for (User user : users) {
user.getOrders().size(); // N queries (lazy loading)
}
}
// BETTER: Use fetch joins or batch fetching
public void goodLoadUsers() {
List<User> users = userRepository.findAllWithOrders(); // 1 query with join
}
// ANTI-PATTERN 2: Inefficient String Concatenation
public String badStringConcatenation(List<String> items) {
String result = "";
for (String item : items) {
result += item + ","; // Creates new string each time
}
return result;
}
// BETTER: Use StringBuilder
public String goodStringConcatenation(List<String> items) {
StringBuilder sb = new StringBuilder();
for (String item : items) {
sb.append(item).append(",");
}
return sb.toString();
}
// ANTI-PATTERN 3: Synchronous Processing in Loops
public void badProcessItems(List<Item> items) {
for (Item item : items) {
processItemSynchronously(item); // Blocks for each item
}
}
// BETTER: Parallel or asynchronous processing
public void goodProcessItems(List<Item> items) {
items.parallelStream()
.forEach(this::processItemSynchronously);
}
}
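findAllWithOrders() is assumed to be a custom repository method; one way to implement it with Spring Data JPA is a JPQL fetch join (an @EntityGraph on findAll would achieve the same effect):
// Assumed implementation of findAllWithOrders() using a fetch join.
@Query("SELECT DISTINCT u FROM User u LEFT JOIN FETCH u.orders")
List<User> findAllWithOrders();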
Summary
Performance testing ensures Java applications meet performance requirements:
Key Testing Types:
- Load Testing: Normal expected traffic
- Stress Testing: Beyond normal capacity
- Spike Testing: Sudden traffic increases
- Endurance Testing: Extended periods
Essential Tools:
- JMeter: Load and stress testing
- JMH: Microbenchmarking
- Application Profilers: CPU and memory analysis
- APM Tools: Production monitoring
Critical Metrics:
- Response Time: Average, P95, P99
- Throughput: Requests/transactions per second
- Resource Usage: CPU, memory, disk, network
- Error Rate: Failed requests percentage
Best Practices:
- Establish performance baselines
- Test in production-like environments
- Identify and address bottlenecks
- Monitor resource utilization
- Implement performance regression testing
- Focus on real-world scenarios
Common Pitfalls:
- Testing only happy path scenarios
- Using unrealistic data volumes
- Ignoring resource constraints
- Not testing under concurrent load
Performance testing is crucial for delivering scalable, responsive Java applications that meet user expectations under real-world conditions.