Master TestNG for Advanced Java Testing
TestNG: Advanced Java Testing Framework
TestNG (Test Next Generation) is a powerful testing framework that revolutionizes how we approach testing in Java applications. While JUnit has long been the standard for Java testing, TestNG introduces advanced capabilities that make it particularly suitable for complex testing scenarios, enterprise applications, and comprehensive test automation suites.
Why TestNG Matters for Modern Java Development
Testing is a cornerstone of reliable software development, and the choice of testing framework significantly impacts both developer productivity and code quality. TestNG was designed to address limitations found in traditional testing frameworks by providing:
Enhanced Test Organization: Unlike simpler frameworks, TestNG allows you to organize tests into logical groups, define dependencies between tests, and control execution flow with unprecedented precision.
Enterprise-Ready Features: Large applications require sophisticated testing strategies. TestNG provides built-in support for parallel execution, data-driven testing, and comprehensive reporting that scales with your application's complexity.
Flexible Configuration: Rather than being constrained by rigid annotation patterns, TestNG offers XML-based configuration, programmatic setup, and annotation-driven approaches that adapt to your team's workflow.
Table of Contents
- Introduction to TestNG
- TestNG vs JUnit
- Basic Test Structure
- Annotations
- Test Configuration
- Data Providers
- Test Groups
- Parallel Execution
- Test Dependencies
- Reporting
- Best Practices
Introduction to TestNG
TestNG emerged from the need for a more sophisticated testing framework that could handle the complexities of modern enterprise applications. While traditional testing frameworks work well for simple unit tests, they often fall short when dealing with integration tests, end-to-end scenarios, or large test suites that require careful orchestration.
Understanding TestNG's Design Philosophy
TestNG was built with the philosophy that testing frameworks should adapt to your testing needs, not the other way around. This means providing multiple ways to configure and execute tests, supporting various testing methodologies, and scaling from simple unit tests to complex test automation frameworks.
The framework introduces concepts that weren't available in earlier testing tools:
Test Method Dependencies: In real-world scenarios, some tests naturally depend on others. For example, you might need to create a user account before testing login functionality. TestNG allows you to express these dependencies explicitly.
Flexible Test Execution: Rather than running all tests in a predetermined order, TestNG lets you group related tests, run specific subsets, and control execution flow based on your testing strategy.
Built-in Parallel Execution: Modern applications run on multi-core systems, and your tests should too. TestNG provides native support for parallel test execution without external tools or plugins; enabling it takes only a couple of attributes in the suite configuration.
Key Features and Capabilities
TestNG's feature set addresses common pain points in enterprise testing:
- Flexible Test Configuration: Choose between XML-based configuration for complex test suites or annotation-driven configuration for simpler scenarios
- Data-Driven Testing: Built-in support for parameterized tests with multiple data sources, including CSV files, databases, and programmatic data providers
- Parallel Execution: Run tests in parallel at the method, class, or suite level to reduce overall execution time
- Test Dependencies: Define explicit dependencies between test methods or groups to ensure proper test execution order
- Grouping: Organize tests into logical groups for selective execution - run only smoke tests, regression tests, or any custom grouping you define
- Rich Assertions: Comprehensive assertion library with detailed error messages and flexible comparison options
- Detailed Reporting: Built-in HTML and XML reports that provide insights into test execution, failures, and performance metrics
Setting Up TestNG in Your Project
Before diving into TestNG's features, you'll need to add it to your project dependencies. TestNG is distributed as a Maven artifact, making it easy to integrate into any Java project using Maven, Gradle, or other build tools.
Maven Dependencies
The core TestNG dependency provides everything you need to write and execute tests:
<dependencies>
<!-- TestNG Core -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>7.8.0</version>
<scope>test</scope>
</dependency>
<!-- Optional: For advanced assertions -->
<dependency>
<groupId>org.assertj</groupId>
<artifactId>assertj-core</artifactId>
<version>3.24.2</version>
<scope>test</scope>
</dependency>
</dependencies>
Understanding the Dependencies:
TestNG Core: This is the main dependency that includes all TestNG annotations, assertions, and execution capabilities. The test scope ensures it's only available during test compilation and execution, not in your production code.
AssertJ (Optional): While TestNG includes its own assertion library, AssertJ provides a more fluent and expressive assertion API. Many teams prefer AssertJ's readable syntax for complex assertions, making test failures easier to understand.
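If you adopt AssertJ, assertions read as fluent chains. Here is a minimal sketch of a TestNG test using AssertJ; the class name and data are purely illustrative:
import static org.assertj.core.api.Assertions.assertThat;
import org.testng.annotations.Test;
public class AssertJStyleTest {
    @Test
    public void emailHasExpectedShape() {
        String email = "john.doe@example.com";
        // A single chained assertion produces one descriptive failure message
        assertThat(email)
            .isNotNull()
            .contains("@")
            .endsWith("example.com");
    }
}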
TestNG vs JUnit: Choosing the Right Framework
Both TestNG and JUnit are excellent testing frameworks, but they serve different needs and project requirements. Understanding their strengths and differences helps you make an informed decision for your specific testing scenarios.
Understanding the Architectural Differences
JUnit's Philosophy: JUnit follows a minimalist approach, providing essential testing features with the option to extend functionality through third-party libraries. JUnit 5 modernized the framework significantly, introducing a modular architecture and improved extensibility.
TestNG's Philosophy: TestNG was designed from the ground up to handle complex testing scenarios. Rather than relying on external libraries, it provides comprehensive built-in features for enterprise-level testing requirements.
Feature Comparison
Feature | TestNG | JUnit 5 | Impact on Your Testing |
---|---|---|---|
Annotations | Rich set with parameters | Modern annotation set | TestNG offers more configuration options directly in annotations |
Parameterized Tests | @DataProvider | @ParameterizedTest | TestNG's data providers are more flexible for complex data scenarios |
Test Dependencies | @Test(dependsOnMethods) | Not built-in | Critical for integration testing where test order matters |
Parallel Execution | Built-in XML configuration | Requires configuration | TestNG makes parallel testing much easier to set up |
Grouping | @Test(groups) | @Tag | TestNG's grouping is more powerful for organizing test suites |
Test Suites | XML configuration | @Suite | TestNG's XML approach offers more flexibility for complex suites |
Reporting | Built-in HTML reports | Requires extensions | TestNG provides immediate reporting without additional setup |
When to Choose TestNG
Understanding your project's testing requirements is crucial for framework selection:
Complex Test Scenarios: If your application requires sophisticated test orchestration, where some tests must run before others, or where you need to group tests into complex hierarchies, TestNG's dependency and grouping features provide significant advantages.
Data-Driven Testing: When you need to run the same test with multiple sets of data from various sources (databases, CSV files, APIs), TestNG's @DataProvider annotation offers more flexibility than JUnit's parameterized tests.
Enterprise Testing: Large organizations often have complex testing requirements including smoke tests, regression tests, and integration tests that need to be executed selectively. TestNG's grouping and XML configuration make this much easier to manage.
Parallel Execution Requirements: If reducing test execution time is critical, TestNG's built-in parallel execution capabilities require minimal configuration compared to JUnit's approach.
Integration Testing: When test order matters (such as creating test data, running tests, then cleaning up), TestNG's dependency features make this natural and explicit.
Basic Test Structure
Understanding how to structure TestNG tests properly is fundamental to writing maintainable and reliable test suites. TestNG follows a well-defined lifecycle that ensures your tests run in a predictable environment with proper setup and cleanup.
Anatomy of a TestNG Test Class
Let's examine a complete test class to understand how TestNG organizes and executes tests:
import org.testng.Assert;
import org.testng.annotations.*;
public class CalculatorTest {
private Calculator calculator;
@BeforeClass
public void setUpClass() {
System.out.println("Setting up Calculator test class");
}
@BeforeMethod
public void setUp() {
calculator = new Calculator();
}
@Test
public void testAddition() {
// Given
int a = 5;
int b = 3;
// When
int result = calculator.add(a, b);
// Then
Assert.assertEquals(result, 8, "5 + 3 should equal 8");
}
@Test
public void testDivision() {
// Test division by zero
Assert.expectThrows(ArithmeticException.class, () -> {
calculator.divide(10, 0);
});
}
@AfterMethod
public void tearDown() {
calculator = null;
}
@AfterClass
public void tearDownClass() {
System.out.println("Tearing down Calculator test class");
}
}
Breaking Down the Test Structure:
Class-Level Setup (@BeforeClass): This method runs once before any test methods in the class execute. It's ideal for expensive setup operations like database connections, file system preparation, or loading configuration. In our example, it's used for logging, but in real applications, you might initialize shared resources here.
Method-Level Setup (@BeforeMethod): This runs before each individual test method. Here we create a fresh Calculator instance for each test, ensuring test isolation. This pattern prevents tests from affecting each other through shared state.
Test Methods (@Test): These contain your actual test logic. Notice the "Given-When-Then" structure:
- Given: Set up test data and conditions
- When: Execute the code being tested
- Then: Verify the results with assertions
Method-Level Cleanup (@AfterMethod): Runs after each test method completes, regardless of whether the test passed or failed. Used for cleaning up resources created during the test.
Class-Level Cleanup (@AfterClass): Runs once after all test methods in the class complete. Perfect for cleaning up shared resources like database connections or temporary files.
Enhancing Tests with Descriptions and Priorities
TestNG allows you to add metadata to your tests that improves both documentation and execution control. Two particularly useful features are test descriptions and execution priorities.
public class UserServiceTest {
@Test(description = "Verify user creation with valid data", priority = 1)
public void testCreateUserWithValidData() {
UserService userService = new UserService();
User user = userService.createUser("john.doe@example.com", "John", "Doe");
Assert.assertNotNull(user);
Assert.assertEquals(user.getEmail(), "john.doe@example.com");
Assert.assertEquals(user.getFirstName(), "John");
Assert.assertEquals(user.getLastName(), "Doe");
}
@Test(description = "Verify user creation fails with invalid email", priority = 2)
public void testCreateUserWithInvalidEmail() {
UserService userService = new UserService();
Assert.expectThrows(InvalidEmailException.class, () -> {
userService.createUser("invalid-email", "John", "Doe");
});
}
}
Understanding Test Metadata:
Description Attribute: The description parameter serves multiple purposes:
- Documentation: Provides human-readable explanations that appear in test reports
- Maintenance: Helps team members understand test intent without reading implementation details
- Reporting: Descriptions appear in HTML reports, making test results more meaningful to stakeholders
Priority Attribute: The priority parameter controls test execution order:
- Lower numbers execute first: Priority 1 runs before Priority 2
- Default priority is 0: Tests without explicit priority run first
- Useful for setup scenarios: Run foundational tests before dependent tests
- Debugging aid: Execute critical tests first to fail fast when issues exist
This approach is particularly valuable in integration testing scenarios where certain tests should run before others, even when explicit dependencies aren't necessary.
Annotations
Lifecycle Annotations
public class LifecycleAnnotationsExample {
@BeforeSuite
public void beforeSuite() {
System.out.println("@BeforeSuite - Executed once before all tests in the suite");
}
@BeforeTest
public void beforeTest() {
System.out.println("@BeforeTest - Executed before each <test> tag in XML");
}
@BeforeClass
public void beforeClass() {
System.out.println("@BeforeClass - Executed once before all test methods in class");
}
@BeforeMethod
public void beforeMethod() {
System.out.println("@BeforeMethod - Executed before each test method");
}
@Test
public void testMethod1() {
System.out.println("Test Method 1");
}
@Test
public void testMethod2() {
System.out.println("Test Method 2");
}
@AfterMethod
public void afterMethod() {
System.out.println("@AfterMethod - Executed after each test method");
}
@AfterClass
public void afterClass() {
System.out.println("@AfterClass - Executed once after all test methods in class");
}
@AfterTest
public void afterTest() {
System.out.println("@AfterTest - Executed after each <test> tag in XML");
}
@AfterSuite
public void afterSuite() {
System.out.println("@AfterSuite - Executed once after all tests in the suite");
}
}
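For a single run of this class (one suite containing one <test> tag and these two test methods), the console messages should appear in roughly this order; note that the two test methods themselves may execute in either order unless priorities or dependencies are set:
@BeforeSuite
@BeforeTest
@BeforeClass
@BeforeMethod -> Test Method 1 -> @AfterMethod
@BeforeMethod -> Test Method 2 -> @AfterMethod
@AfterClass
@AfterTest
@AfterSuite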
Test Configuration Annotations
public class TestConfigurationExample {
@Test(enabled = false)
public void disabledTest() {
// This test will be skipped
Assert.fail("This test should not run");
}
@Test(timeOut = 2000)
public void timeoutTest() throws InterruptedException {
// Test must complete within 2 seconds
Thread.sleep(1000);
Assert.assertTrue(true);
}
@Test(expectedExceptions = IllegalArgumentException.class)
public void expectedExceptionTest() {
throw new IllegalArgumentException("Expected exception");
}
@Test(expectedExceptions = {IllegalArgumentException.class, NullPointerException.class})
public void multipleExpectedExceptionsTest() {
// Test passes if any of the expected exceptions is thrown
throw new IllegalArgumentException("One of expected exceptions");
}
@Test(invocationCount = 5)
public void repeatedTest() {
// This test will run 5 times
System.out.println("Repeated test execution");
Assert.assertTrue(true);
}
@Test(threadPoolSize = 3, invocationCount = 10)
public void parallelRepeatedTest() {
// Run 10 times using 3 threads
System.out.println("Thread: " + Thread.currentThread().getName());
Assert.assertTrue(true);
}
}
Test Configuration
XML Configuration (testng.xml)
<?xml version="1.0" encoding="UTF-8"?>
<suite name="TestSuite" parallel="methods" thread-count="3">
<parameter name="browser" value="chrome"/>
<parameter name="environment" value="staging"/>
<test name="SmokeTests">
<groups>
<run>
<include name="smoke"/>
</run>
</groups>
<classes>
<class name="com.example.tests.LoginTest"/>
<class name="com.example.tests.UserRegistrationTest"/>
</classes>
</test>
<test name="RegressionTests">
<groups>
<run>
<include name="regression"/>
<exclude name="slow"/>
</run>
</groups>
<packages>
<package name="com.example.tests.regression"/>
</packages>
</test>
</suite>
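The browser and environment values declared with <parameter> above can be injected into configuration or test methods via @Parameters. A minimal sketch, assuming the same parameter names as in the XML (the setup body is illustrative):
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Optional;
import org.testng.annotations.Parameters;
public class ParameterizedSetup {
    @Parameters({"browser", "environment"})
    @BeforeClass
    public void configure(@Optional("chrome") String browser,
                          @Optional("staging") String environment) {
        // Values come from the <parameter> elements in testng.xml;
        // @Optional supplies defaults when the class runs without the suite file.
        System.out.println("Browser: " + browser + ", environment: " + environment);
    }
}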
Programmatic Configuration
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.testng.TestNG;
import org.testng.xml.XmlClass;
import org.testng.xml.XmlSuite;
import org.testng.xml.XmlTest;
public class ProgrammaticTestConfiguration {
public static void main(String[] args) {
// Create suite
XmlSuite suite = new XmlSuite();
suite.setName("ProgrammaticSuite");
suite.setParallel(XmlSuite.ParallelMode.METHODS);
suite.setThreadCount(3);
// Create test
XmlTest test = new XmlTest(suite);
test.setName("SmokeTest");
// Add classes
List<XmlClass> classes = new ArrayList<>();
classes.add(new XmlClass("com.example.tests.LoginTest"));
classes.add(new XmlClass("com.example.tests.UserTest"));
test.setXmlClasses(classes);
// Run tests
TestNG testng = new TestNG();
testng.setXmlSuites(Arrays.asList(suite));
testng.run();
}
}
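For quick local runs that don't need a suite structure, TestNG can also be pointed straight at test classes. A minimal sketch, assuming the CalculatorTest class from earlier is on the test classpath:
import org.testng.TestNG;
public class QuickRunner {
    public static void main(String[] args) {
        TestNG testng = new TestNG();
        // Runs the listed classes with default suite settings; no XML required
        testng.setTestClasses(new Class[] { CalculatorTest.class });
        testng.run();
    }
}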
Data Providers
Basic Data Provider
public class DataProviderExample {
@DataProvider(name = "loginData")
public Object[][] loginDataProvider() {
return new Object[][] {
{"[email protected]", "password1", true},
{"[email protected]", "password2", true},
{"[email protected]", "wrongpass", false},
{"", "password", false},
{"[email protected]", "", false}
};
}
@Test(dataProvider = "loginData")
public void testLogin(String email, String password, boolean expectedResult) {
LoginService loginService = new LoginService();
boolean result = loginService.login(email, password);
Assert.assertEquals(result, expectedResult);
}
@DataProvider(name = "numbers")
public Object[][] numberProvider() {
return new Object[][] {
{1, 2, 3},
{5, 7, 12},
{10, 15, 25},
{-1, 1, 0}
};
}
@Test(dataProvider = "numbers")
public void testAddition(int a, int b, int expected) {
Calculator calculator = new Calculator();
int result = calculator.add(a, b);
Assert.assertEquals(result, expected);
}
}
Advanced Data Providers
import java.io.BufferedReader;
import java.io.IOException;
import java.lang.reflect.Method;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
public class AdvancedDataProviderExample {
@DataProvider(name = "userTestData", parallel = true)
public Object[][] userDataProvider() {
return new Object[][] {
{new User("John", "Doe", 30)},
{new User("Jane", "Smith", 25)},
{new User("Bob", "Johnson", 35)}
};
}
@Test(dataProvider = "userTestData")
public void testUserValidation(User user) {
Assert.assertNotNull(user.getFirstName());
Assert.assertNotNull(user.getLastName());
Assert.assertTrue(user.getAge() > 0);
}
@DataProvider(name = "csvData")
public Iterator<Object[]> csvDataProvider() throws IOException {
List<Object[]> data = new ArrayList<>();
try (BufferedReader reader = Files.newBufferedReader(Paths.get("test-data.csv"))) {
String line;
while ((line = reader.readLine()) != null) {
String[] values = line.split(",");
data.add(values);
}
}
return data.iterator();
}
@Test(dataProvider = "csvData")
public void testWithCsvData(String name, String email, String age) {
Assert.assertNotNull(name);
Assert.assertTrue(email.contains("@"));
Assert.assertTrue(Integer.parseInt(age) > 0);
}
@DataProvider(name = "methodBasedData")
public Object[][] methodBasedDataProvider(Method method) {
// Different data based on test method
if (method.getName().equals("testPositiveNumbers")) {
return new Object[][] {{1}, {2}, {3}, {4}, {5}};
} else if (method.getName().equals("testNegativeNumbers")) {
return new Object[][] {{-1}, {-2}, {-3}, {-4}, {-5}};
}
return new Object[][] {{0}};
}
@Test(dataProvider = "methodBasedData")
public void testPositiveNumbers(int number) {
Assert.assertTrue(number > 0);
}
@Test(dataProvider = "methodBasedData")
public void testNegativeNumbers(int number) {
Assert.assertTrue(number < 0);
}
}
External Data Provider
public class ExternalDataProvider {
@DataProvider(name = "databaseUsers")
public Object[][] databaseUserProvider() {
// Fetch data from database
UserRepository repository = new UserRepository();
List<User> users = repository.findAllActiveUsers();
Object[][] data = new Object[users.size()][1];
for (int i = 0; i < users.size(); i++) {
data[i][0] = users.get(i);
}
return data;
}
@DataProvider(name = "apiTestData")
public Object[][] apiDataProvider() throws Exception {
// Fetch test data from API
RestTemplate restTemplate = new RestTemplate();
TestData[] testDataArray = restTemplate.getForObject(
"http://api.example.com/test-data",
TestData[].class
);
Object[][] data = new Object[testDataArray.length][1];
for (int i = 0; i < testDataArray.length; i++) {
data[i][0] = testDataArray[i];
}
return data;
}
@Test(dataProvider = "databaseUsers")
public void testUserProcessing(User user) {
UserProcessor processor = new UserProcessor();
ProcessingResult result = processor.process(user);
Assert.assertTrue(result.isSuccessful());
}
}
Test Groups
Basic Grouping
public class TestGroupsExample {
@Test(groups = {"smoke", "regression"})
public void loginTest() {
System.out.println("Login test - smoke and regression");
}
@Test(groups = {"smoke"})
public void quickValidationTest() {
System.out.println("Quick validation - smoke only");
}
@Test(groups = {"regression", "slow"})
public void comprehensiveTest() {
System.out.println("Comprehensive test - regression and slow");
}
@Test(groups = {"integration"})
public void databaseIntegrationTest() {
System.out.println("Database integration test");
}
@Test(groups = {"unit"})
public void unitTest() {
System.out.println("Unit test");
}
}
Group Dependencies
public class GroupDependenciesExample {
@Test(groups = {"setup"})
public void setupDatabase() {
System.out.println("Setting up database");
}
@Test(groups = {"setup"})
public void setupTestData() {
System.out.println("Setting up test data");
}
@Test(groups = {"functional"}, dependsOnGroups = {"setup"})
public void testUserCreation() {
System.out.println("Testing user creation");
}
@Test(groups = {"functional"}, dependsOnGroups = {"setup"})
public void testUserRetrieval() {
System.out.println("Testing user retrieval");
}
@Test(groups = {"cleanup"}, dependsOnGroups = {"functional"})
public void cleanupTestData() {
System.out.println("Cleaning up test data");
}
}
Running Specific Groups
<!-- In testng.xml -->
<test name="SmokeTests">
<groups>
<run>
<include name="smoke"/>
</run>
</groups>
<classes>
<class name="com.example.tests.TestGroupsExample"/>
</classes>
</test>
<test name="RegressionTests">
<groups>
<run>
<include name="regression"/>
<exclude name="slow"/>
</run>
</groups>
<classes>
<class name="com.example.tests.TestGroupsExample"/>
</classes>
</test>
Parallel Execution
Method-Level Parallelism
<suite name="ParallelSuite" parallel="methods" thread-count="5">
<test name="ParallelTest">
<classes>
<class name="com.example.tests.ParallelTest"/>
</classes>
</test>
</suite>
Class-Level Parallelism
<suite name="ParallelSuite" parallel="classes" thread-count="3">
<test name="ParallelTest">
<classes>
<class name="com.example.tests.TestClass1"/>
<class name="com.example.tests.TestClass2"/>
<class name="com.example.tests.TestClass3"/>
</classes>
</test>
</suite>
Thread-Safe Test Implementation
public class ThreadSafeTest {
// Use ThreadLocal for thread-safe data
private ThreadLocal<WebDriver> driver = new ThreadLocal<>();
private ThreadLocal<DatabaseConnection> dbConnection = new ThreadLocal<>();
@BeforeMethod
public void setUp() {
// Initialize thread-local resources
driver.set(new ChromeDriver());
dbConnection.set(new DatabaseConnection());
}
@Test(threadPoolSize = 3, invocationCount = 10)
public void parallelWebTest() {
WebDriver webDriver = driver.get();
webDriver.get("http://example.com");
String title = webDriver.getTitle();
Assert.assertNotNull(title);
System.out.println("Thread: " + Thread.currentThread().getName() +
", Title: " + title);
}
@Test(threadPoolSize = 2, invocationCount = 5)
public void parallelDatabaseTest() {
DatabaseConnection connection = dbConnection.get();
List<User> users = connection.findAllUsers();
Assert.assertNotNull(users);
Assert.assertTrue(users.size() > 0);
System.out.println("Thread: " + Thread.currentThread().getName() +
", Users found: " + users.size());
}
@AfterMethod
public void tearDown() {
// Clean up thread-local resources
if (driver.get() != null) {
driver.get().quit();
driver.remove();
}
if (dbConnection.get() != null) {
dbConnection.get().close();
dbConnection.remove();
}
}
}
Test Dependencies
Method Dependencies
public class TestDependenciesExample {
@Test
public void createUser() {
System.out.println("Creating user");
// User creation logic
Assert.assertTrue(true, "User created successfully");
}
@Test(dependsOnMethods = {"createUser"})
public void loginUser() {
System.out.println("Logging in user");
// Login logic that depends on user creation
Assert.assertTrue(true, "User logged in successfully");
}
@Test(dependsOnMethods = {"loginUser"})
public void updateUserProfile() {
System.out.println("Updating user profile");
// Profile update logic
Assert.assertTrue(true, "Profile updated successfully");
}
@Test(dependsOnMethods = {"createUser", "loginUser"})
public void deleteUser() {
System.out.println("Deleting user");
// User deletion logic
Assert.assertTrue(true, "User deleted successfully");
}
// alwaysRun = true makes this a soft dependency: it executes even if createUser fails
@Test(dependsOnMethods = {"createUser"}, alwaysRun = true)
public void cleanupTest() {
System.out.println("Cleanup - this always runs");
}
}
Soft Dependencies
public class SoftDependenciesExample {
@Test
public void independentTest1() {
System.out.println("Independent test 1");
Assert.assertTrue(true);
}
@Test
public void independentTest2() {
System.out.println("Independent test 2");
Assert.fail("This test fails");
}
// alwaysRun = true creates a soft dependency: this test still runs (after its
// dependencies) even if independentTest2 fails, instead of being skipped.
// By contrast, ignoreMissingDependencies only ignores dependencies that are
// not part of the current run at all.
@Test(dependsOnMethods = {"independentTest1", "independentTest2"},
alwaysRun = true)
public void dependentTest() {
System.out.println("Dependent test - runs despite failures");
Assert.assertTrue(true);
}
}
Reporting
Built-in Reports
TestNG automatically generates HTML and XML reports in the test-output directory after each run. Additional built-in reporters can be enabled as listeners in testng.xml:
<!-- Configure reporting in testng.xml -->
<suite name="TestSuite">
<listeners>
<listener class-name="org.testng.reporters.EmailableReporter"/>
<listener class-name="org.testng.reporters.FailedReporter"/>
</listeners>
<!-- Test configuration -->
</suite>
Custom Listeners
import org.testng.ITestListener;
import org.testng.ITestResult;
public class CustomTestListener implements ITestListener {
@Override
public void onTestStart(ITestResult result) {
System.out.println("Test started: " + result.getMethod().getMethodName());
}
@Override
public void onTestSuccess(ITestResult result) {
System.out.println("Test passed: " + result.getMethod().getMethodName());
}
@Override
public void onTestFailure(ITestResult result) {
System.out.println("Test failed: " + result.getMethod().getMethodName());
System.out.println("Failure reason: " + result.getThrowable().getMessage());
// Take screenshot for web tests
if (result.getInstance() instanceof WebTest) {
takeScreenshot(result.getMethod().getMethodName());
}
}
@Override
public void onTestSkipped(ITestResult result) {
System.out.println("Test skipped: " + result.getMethod().getMethodName());
}
private void takeScreenshot(String testName) {
// Screenshot logic
System.out.println("Screenshot taken for: " + testName);
}
}
// Use the listener
@Listeners(CustomTestListener.class)
public class WebTest {
@Test
public void testWebPage() {
// Test implementation
Assert.assertTrue(true);
}
}
ExtentReports Integration
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.reporter.ExtentSparkReporter;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
public class ExtentReportListener implements ITestListener {
private static ExtentReports extent;
private static ThreadLocal<ExtentTest> test = new ThreadLocal<>();
@Override
public void onStart(ITestContext context) {
ExtentSparkReporter reporter = new ExtentSparkReporter("extent-report.html");
extent = new ExtentReports();
extent.attachReporter(reporter);
}
@Override
public void onTestStart(ITestResult result) {
ExtentTest extentTest = extent.createTest(result.getMethod().getMethodName());
test.set(extentTest);
}
@Override
public void onTestSuccess(ITestResult result) {
test.get().pass("Test passed");
}
@Override
public void onTestFailure(ITestResult result) {
test.get().fail("Test failed: " + result.getThrowable().getMessage());
}
@Override
public void onFinish(ITestContext context) {
extent.flush();
}
}
Best Practices
Test Organization
// Good: Organized test class with clear structure
public class UserManagementTest {
private UserService userService;
private DatabaseHelper dbHelper;
@BeforeClass
public void setUpClass() {
dbHelper = new DatabaseHelper();
dbHelper.initializeTestDatabase();
}
@BeforeMethod
public void setUp() {
userService = new UserService();
dbHelper.clearUserTable();
}
@Test(groups = {"smoke", "user-creation"},
description = "Verify user creation with valid data")
public void testCreateUserWithValidData() {
// Given
UserCreateRequest request = new UserCreateRequest(
"[email protected]", "John", "Doe", 30
);
// When
User createdUser = userService.createUser(request);
// Then
Assert.assertNotNull(createdUser.getId());
Assert.assertEquals(createdUser.getEmail(), "john.doe@example.com");
Assert.assertEquals(createdUser.getFirstName(), "John");
Assert.assertEquals(createdUser.getLastName(), "Doe");
Assert.assertEquals(createdUser.getAge(), 30);
}
@Test(groups = {"regression", "user-validation"},
description = "Verify user creation fails with invalid email",
expectedExceptions = InvalidEmailException.class)
public void testCreateUserWithInvalidEmail() {
UserCreateRequest request = new UserCreateRequest(
"invalid-email", "John", "Doe", 30
);
userService.createUser(request);
}
@AfterClass
public void tearDownClass() {
dbHelper.cleanupTestDatabase();
}
}
Data-Driven Testing Best Practices
public class DataDrivenTestExample {
@DataProvider(name = "userValidationData")
public Object[][] userValidationDataProvider() {
return new Object[][] {
// email, firstName, lastName, age, expectedValid
{"[email protected]", "John", "Doe", 25, true},
{"[email protected]", "Jane", "Smith", 30, true},
{"", "John", "Doe", 25, false}, // Empty email
{"invalid-email", "John", "Doe", 25, false}, // Invalid email
{"[email protected]", "", "Doe", 25, false}, // Empty first name
{"[email protected]", "John", "", 25, false}, // Empty last name
{"[email protected]", "John", "Doe", -1, false}, // Invalid age
{"[email protected]", "John", "Doe", 151, false} // Invalid age
};
}
@Test(dataProvider = "userValidationData",
description = "Validate user data with various inputs")
public void testUserValidation(String email, String firstName, String lastName,
int age, boolean expectedValid) {
// Given
UserValidator validator = new UserValidator();
User user = new User(email, firstName, lastName, age);
// When
boolean isValid = validator.isValid(user);
// Then
Assert.assertEquals(isValid, expectedValid,
String.format("Validation failed for user: %s, %s, %s, %d",
email, firstName, lastName, age));
}
}
Parallel Testing Best Practices
public class ParallelTestBestPractices {
// Use ThreadLocal for thread-safe test data
private ThreadLocal<TestContext> testContext = new ThreadLocal<>();
@BeforeMethod
public void setUp() {
TestContext context = new TestContext();
context.setTestId(UUID.randomUUID().toString());
context.setStartTime(System.currentTimeMillis());
testContext.set(context);
}
@Test(threadPoolSize = 3, invocationCount = 9)
public void parallelTest() {
TestContext context = testContext.get();
String threadName = Thread.currentThread().getName();
System.out.printf("Test ID: %s, Thread: %s%n",
context.getTestId(), threadName);
// Simulate test work
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
Assert.assertTrue(true);
}
@AfterMethod
public void tearDown() {
TestContext context = testContext.get();
long duration = System.currentTimeMillis() - context.getStartTime();
System.out.printf("Test %s completed in %d ms%n",
context.getTestId(), duration);
testContext.remove();
}
}
Key Takeaways
- TestNG provides powerful features for advanced test scenarios
- XML configuration enables flexible test suite organization
- Data providers support comprehensive data-driven testing
- Built-in parallel execution improves test performance
- Test dependencies help organize complex test workflows
- Groups enable logical test organization and selective execution
- Rich reporting capabilities provide detailed test insights
- Thread-safe practices are essential for parallel testing
TestNG is particularly well-suited for enterprise testing scenarios where advanced features like test dependencies, parallel execution, and comprehensive reporting are crucial for maintaining large test suites.