This section covers TestNG Listeners and Generating Extent Reports. Understanding these concepts is crucial for enhancing your test automation framework: listeners let you run custom actions after test events, and Extent Reports turn those events into detailed, customizable reports.
Table of Contents
- 1. Introduction to TestNG Listeners
- 2. Implementing TestNG Listeners
- 3. Generating Extent Reports Using TestNG Listeners
- 4. Common Pitfalls and How to Avoid Them
- 5. Conclusion
1. Introduction to TestNG Listeners
• What Are TestNG Listeners?
TestNG Listeners are interfaces that allow you to modify the default TestNG behavior. By implementing these listeners, you can execute custom code at specific points during the test execution lifecycle. This is particularly useful for performing post-actions such as logging, reporting, or taking screenshots based on test outcomes.
• Use Cases for TestNG Listeners
- Logging Test Execution: Capture logs before and after test methods run.
- Custom Reporting: Generate detailed reports with additional information.
- Taking Screenshots on Failure: Automatically capture screenshots when tests fail.
- Retry Mechanism: Implement logic to retry failed tests.
- Resource Management: Allocate or release resources before and after tests.
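Before wiring a listener into TestNG, it helps to see the callback mechanism in isolation. The sketch below is not TestNG — the RunListener interface and the runTest harness are invented for illustration — but it shows the same idea: the framework, not the test, decides when each hook fires.

```java
import java.util.ArrayList;
import java.util.List;

// Illustration only: a stripped-down "listener" showing the callback idea
// that TestNG uses. runTest plays the role of TestNG, firing events around a test.
interface RunListener {
    void onTestStart(String name);
    void onTestSuccess(String name);
    void onTestFailure(String name);
}

public class ListenerConceptDemo {
    public static List<String> events = new ArrayList<>();

    public static void runTest(String name, Runnable body, RunListener listener) {
        listener.onTestStart(name);
        try {
            body.run();
            listener.onTestSuccess(name);
        } catch (AssertionError e) {
            listener.onTestFailure(name);
        }
    }

    public static void main(String[] args) {
        RunListener l = new RunListener() {
            public void onTestStart(String n)   { events.add("start:" + n); }
            public void onTestSuccess(String n) { events.add("pass:" + n); }
            public void onTestFailure(String n) { events.add("fail:" + n); }
        };
        runTest("testLogo", () -> {}, l);
        runTest("testAppURL", () -> { throw new AssertionError("bad URL"); }, l);
        System.out.println(events); // [start:testLogo, pass:testLogo, start:testAppURL, fail:testAppURL]
    }
}
```

Real TestNG listeners work the same way, except TestNG invokes the hooks for you and passes rich `ITestResult` objects instead of plain names.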
2. Implementing TestNG Listeners
• Step 1: Create Test Cases with Multiple Test Methods
Start by creating a test class containing multiple test methods. These methods will simulate various test outcomes: pass, fail, and skip.
Example: OrangeHRMTest.java
package day46;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class OrangeHRMTest {

    WebDriver driver;

    @BeforeClass
    public void setup() {
        System.setProperty("webdriver.chrome.driver", "path/to/chromedriver");
        driver = new ChromeDriver();
        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(java.time.Duration.ofSeconds(10));
        driver.get("https://www.orangehrm.com");
    }

    @Test(priority = 1)
    public void testLogo() {
        // Verify that the logo is displayed on the homepage
        boolean isLogoPresent = driver.findElement(By.id("logo")).isDisplayed();
        Assert.assertTrue(isLogoPresent, "Logo is not present on the homepage.");
    }

    @Test(priority = 2)
    public void testAppURL() {
        // Intentionally incorrect expected URL to demonstrate a failure
        String actualURL = driver.getCurrentUrl();
        String expectedURL = "https://www.orangehrm.com/home";
        Assert.assertEquals(actualURL, expectedURL, "Application URL does not match.");
    }

    @Test(priority = 3, dependsOnMethods = {"testAppURL"})
    public void testHomePageTitle() {
        String actualTitle = driver.getTitle();
        String expectedTitle = "OrangeHRM";
        Assert.assertEquals(actualTitle, expectedTitle, "Home page title does not match.");
    }

    @AfterClass
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}
Explanation:
- testLogo: Verifies the presence of the logo. Expected to pass.
- testAppURL: Checks the application URL. Intentionally set to fail by providing an incorrect expected URL.
- testHomePageTitle: Depends on testAppURL. Since testAppURL fails, this test will be skipped.
• Step 2: Create a Listener Class
Listeners enable you to perform actions based on test execution events. You can create a listener class by either:
- Implementing the ITestListener interface
- Extending the TestListenerAdapter class
Method 1: Implementing ITestListener Interface
Pros:
- Complete control over every lifecycle callback.
- Since TestNG 7, the interface provides default (empty) implementations, so you only override the methods you need.
Cons:
- On older TestNG versions (before 7), you had to implement every method of the interface, even the ones you didn't need.
Example: MyListener.java
package day46;

import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;

public class MyListener implements ITestListener {

    @Override
    public void onStart(ITestContext context) {
        System.out.println("Test Execution Started: " + context.getName());
    }

    @Override
    public void onFinish(ITestContext context) {
        System.out.println("Test Execution Finished: " + context.getName());
    }

    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("Test Started: " + result.getName());
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("Test Passed: " + result.getName());
    }

    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("Test Failed: " + result.getName());
        System.out.println("Error Message: " + result.getThrowable());
    }

    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("Test Skipped: " + result.getName());
    }

    // Other methods can be overridden as needed
}
Method 2: Extending TestListenerAdapter
Pros:
- Provides default implementations for all listener methods.
- Allows overriding only the methods you need.
Cons:
- TestListenerAdapter is deprecated in TestNG 7.x, since ITestListener now provides default methods itself.
- Extending it uses up your single superclass slot, which reduces flexibility if the listener needs to extend another class or host more complex logic.
Example: MyListenerAdapter.java
package day46;

import org.testng.ITestResult;
import org.testng.TestListenerAdapter;

public class MyListenerAdapter extends TestListenerAdapter {

    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("Test Started (Adapter): " + result.getName());
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("Test Passed (Adapter): " + result.getName());
    }

    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("Test Failed (Adapter): " + result.getName());
        System.out.println("Error Message: " + result.getThrowable());
    }

    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("Test Skipped (Adapter): " + result.getName());
    }

    // Override other methods as needed
}
• Step 3: Integrate Listener Class Using TestNG XML
To ensure that the listener is applied across all test classes, it’s recommended to integrate it via the TestNG XML file. This approach avoids the need to annotate each test class individually, especially beneficial when dealing with multiple test classes.
Example: extentReport.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="ExtentReportSuite" parallel="tests" thread-count="3">

    <!-- Define Listeners -->
    <listeners>
        <listener class-name="day46.MyListener"/>
    </listeners>

    <!-- Define Tests -->
    <test name="OrangeHRMTest">
        <classes>
            <class name="day46.OrangeHRMTest"/>
        </classes>
    </test>

</suite>
Explanation:
- <listeners> tag: Contains one or more <listener> tags specifying the listener classes.
- class-name attribute: Fully qualified name of the listener class.
- <test> tag: Defines individual tests, each containing one or more classes.
- parallel="tests" and thread-count="3": Configure parallel execution of tests (not directly related to listeners, but often used together).
Running the Test:
- Right-click on the extentReport.xml file in Eclipse.
- Select Run As > TestNG Suite.
Expected Console Output:
Test Execution Started: OrangeHRMTest
Test Started: testLogo
Test Passed: testLogo
Test Started: testAppURL
Test Failed: testAppURL
Error Message: java.lang.AssertionError: Application URL does not match.
Test Started: testHomePageTitle
Test Skipped: testHomePageTitle
Test Execution Finished: OrangeHRMTest
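If you run suites from Maven rather than from Eclipse, the maven-surefire-plugin can point at the same XML file, and the listener fires on `mvn test` exactly as it does in the IDE. A sketch — the plugin version shown is an assumption; use whatever version your project standardizes on:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>3.2.5</version>
            <configuration>
                <suiteXmlFiles>
                    <suiteXmlFile>extentReport.xml</suiteXmlFile>
                </suiteXmlFiles>
            </configuration>
        </plugin>
    </plugins>
</build>
```

Because the listener is declared inside the suite XML, no extra surefire configuration is needed to activate it.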
• Alternative Integration: Using @Listeners Annotation
If you prefer not to use the XML file for listener integration, you can use the @Listeners annotation directly in your test classes. However, this approach requires adding the annotation to every test class, which can be cumbersome for large projects.
Example: OrangeHRMTest.java with @Listeners Annotation
package day46;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;

@Listeners(day46.MyListener.class)
public class OrangeHRMTest {

    WebDriver driver;

    @BeforeClass
    public void setup() {
        System.setProperty("webdriver.chrome.driver", "path/to/chromedriver");
        driver = new ChromeDriver();
        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(java.time.Duration.ofSeconds(10));
        driver.get("https://www.orangehrm.com");
    }

    @Test(priority = 1)
    public void testLogo() {
        // Verify that the logo is displayed on the homepage
        boolean isLogoPresent = driver.findElement(By.id("logo")).isDisplayed();
        Assert.assertTrue(isLogoPresent, "Logo is not present on the homepage.");
    }

    @Test(priority = 2)
    public void testAppURL() {
        // Intentionally incorrect expected URL to demonstrate a failure
        String actualURL = driver.getCurrentUrl();
        String expectedURL = "https://www.orangehrm.com/home";
        Assert.assertEquals(actualURL, expectedURL, "Application URL does not match.");
    }

    @Test(priority = 3, dependsOnMethods = {"testAppURL"})
    public void testHomePageTitle() {
        String actualTitle = driver.getTitle();
        String expectedTitle = "OrangeHRM";
        Assert.assertEquals(actualTitle, expectedTitle, "Home page title does not match.");
    }

    @AfterClass
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}
Pros:
- Quick integration for individual test classes.
- Useful for small projects or specific test classes requiring unique listeners.
Cons:
- Requires annotating each test class, leading to redundancy.
- Not scalable for large projects with multiple test classes.
• Best Practices for Listener Integration
- Use XML-Based Integration: Prefer integrating listeners via the TestNG XML file to ensure consistency and scalability across all test classes.
- Single Listener Class: Maintain a single listener class for the entire project to manage report generation and post-actions uniformly.
- Avoid Multiple Listeners: Multiple listener classes can lead to conflicting behaviors and duplicate reports.
- Centralize Listener Logic: Implement all post-action logic within a centralized listener class to simplify maintenance and updates.
3. Generating Extent Reports Using TestNG Listeners
• What Are Extent Reports?
Extent Reports is a third-party reporting library for test automation frameworks. It produces reports that are more visually appealing, interactive, and comprehensive than TestNG's default reports, with features like:
- Detailed test case information.
- Interactive dashboards with charts and graphs.
- Attachments such as screenshots.
- Customizable themes and layouts.
• Setting Up Extent Reports
Adding Extent Reports Dependencies
To use Extent Reports in your project, you need to add the necessary dependencies. If you're using Maven, add the following to your pom.xml:
<dependencies>
    <!-- Existing dependencies -->

    <!-- Extent Reports Dependency -->
    <dependency>
        <groupId>com.aventstack</groupId>
        <artifactId>extentreports</artifactId>
        <version>5.1.5</version>
    </dependency>
</dependencies>
Note: Replace the version number with the latest version available from the Maven Repository.
Steps:
- Open pom.xml: Located at the root of your Maven project.
- Add the Extent Reports dependency: Insert the above <dependency> block within the <dependencies> section.
- Update the Maven project: Right-click on the project in Eclipse and select Maven > Update Project to download the dependencies.
Creating a Utility Class for Extent Reports
To manage Extent Reports efficiently, create a utility class that initializes and configures the reports. This class will be used by the listener to generate and manage reports.
Example: ExtentReportManager.java
package day46;

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.reporter.ExtentSparkReporter;

public class ExtentReportManager {

    private static ExtentReports extent;

    public static ExtentReports getInstance() {
        if (extent == null) {
            // Create ExtentSparkReporter
            ExtentSparkReporter spark = new ExtentSparkReporter(
                    System.getProperty("user.dir") + "/reports/" + getReportName() + ".html");
            spark.config().setDocumentTitle("Automation Report");
            spark.config().setReportName("Functional Testing");
            spark.config().setTheme(com.aventstack.extentreports.reporter.configuration.Theme.STANDARD);

            // Initialize ExtentReports and attach the reporter
            extent = new ExtentReports();
            extent.attachReporter(spark);

            // Set system info
            extent.setSystemInfo("OS", System.getProperty("os.name"));
            extent.setSystemInfo("Browser", "Chrome");
            extent.setSystemInfo("Tester", "Your Name");
        }
        return extent;
    }

    private static String getReportName() {
        DateTimeFormatter dtf = DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss");
        LocalDateTime now = LocalDateTime.now();
        return "AutomationReport_" + dtf.format(now);
    }
}
Explanation:
- Singleton pattern: Ensures only one instance of ExtentReports exists.
- Dynamic report naming: Appends a timestamp to the report name to prevent overwriting previous reports.
- Configuration:
  - Document title: Sets the title of the HTML report.
  - Report name: Names the report section.
  - Theme: Chooses between STANDARD and DARK.
- System info: Populates common details like OS, browser, and tester name.
Directory Structure:
Ensure that a reports folder exists at the project root to store the generated reports. If it doesn't exist, you can create it manually or modify the code to create it programmatically.
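If you prefer the programmatic route, a small java.nio sketch can create the folder at startup (the ReportDirectory class name is invented for this example; in the framework you would pass System.getProperty("user.dir") as the base directory):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ReportDirectory {

    // Creates the reports directory (and any missing parents) if needed;
    // Files.createDirectories is a no-op when the directory already exists.
    public static Path ensureReportsDir(String baseDir) throws IOException {
        Path dir = Paths.get(baseDir, "reports");
        Files.createDirectories(dir);
        return dir;
    }

    public static void main(String[] args) throws IOException {
        // The temp dir is used here only so the demo runs anywhere
        Path dir = ensureReportsDir(System.getProperty("java.io.tmpdir"));
        System.out.println("Reports directory ready: " + dir);
    }
}
```

A call to a helper like this at the top of getInstance() would remove the manual setup step entirely.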
• Integrating Extent Reports with Listener Class
Now, enhance the listener class to utilize Extent Reports for generating detailed reports based on test execution outcomes.
Updated Listener Class: MyListener.java
package day46;

import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;

import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;

public class MyListener implements ITestListener {

    ExtentReports extent = ExtentReportManager.getInstance();
    ExtentTest test;

    @Override
    public void onStart(ITestContext context) {
        System.out.println("Test Execution Started: " + context.getName());
    }

    @Override
    public void onFinish(ITestContext context) {
        extent.flush();
        System.out.println("Test Execution Finished: " + context.getName());
    }

    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("Test Started: " + result.getName());
        test = extent.createTest(result.getMethod().getMethodName());
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("Test Passed: " + result.getName());
        test.pass("Test Passed");
    }

    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("Test Failed: " + result.getName());
        test.fail(result.getThrowable());
        // Capture and attach screenshot (to be implemented)
    }

    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("Test Skipped: " + result.getName());
        test.skip(result.getThrowable());
    }

    // Other methods can be overridden as needed
}
Explanation:
- ExtentReports instance: Obtained from ExtentReportManager.
- ExtentTest instance: Represents individual test cases in the report.
- Overridden methods:
  - onTestStart: Creates a new test entry in the report.
  - onTestSuccess: Logs the test as passed.
  - onTestFailure: Logs the test as failed and attaches the error.
  - onTestSkipped: Logs the test as skipped and attaches the reason.
Enhancements:
- Dynamic Report Naming: Prevents overwriting by using timestamps.
- System Info Population: Adds contextual information to the report.
• Customizing Extent Reports
Dynamic Report Naming with Timestamps
To maintain a history of reports and prevent overwriting, include a timestamp in the report name. This is already implemented in the ExtentReportManager class via the getReportName() method.
Code Snippet:
private static String getReportName() {
    DateTimeFormatter dtf = DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss");
    LocalDateTime now = LocalDateTime.now();
    return "AutomationReport_" + dtf.format(now);
}
Benefit:
- Historical Tracking: Allows you to compare reports over time.
- Prevent Data Loss: Ensures that each report is unique and previous reports are retained.
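You can sanity-check the naming scheme in isolation. This standalone sketch (class and method names invented for the demo) takes the timestamp as a parameter and fixes it, so the result is predictable:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class ReportNameDemo {

    // Same pattern as getReportName() in ExtentReportManager
    static final DateTimeFormatter DTF = DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss");

    // Taking the timestamp as a parameter (instead of calling now()) makes
    // the behavior testable and repeatable.
    public static String reportName(LocalDateTime when) {
        return "AutomationReport_" + DTF.format(when);
    }

    public static void main(String[] args) {
        LocalDateTime t = LocalDateTime.of(2024, 5, 17, 14, 30, 5);
        System.out.println(reportName(t)); // AutomationReport_2024-05-17_14-30-05
    }
}
```

A useful side effect of the yyyy-MM-dd_HH-mm-ss ordering is that the report files sort chronologically by filename.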
Capturing Screenshots on Test Failures
Enhance the listener to capture screenshots when tests fail and attach them to the Extent Report.
Steps:
1. Create a Utility Method to Capture Screenshots:
Example: ScreenshotUtil.java
package day46;

import java.io.File;
import java.io.IOException;

import org.apache.commons.io.FileUtils;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;

public class ScreenshotUtil {

    public static String captureScreenshot(WebDriver driver, String screenshotName) {
        String destination = System.getProperty("user.dir") + "/screenshots/" + screenshotName + ".png";
        TakesScreenshot ts = (TakesScreenshot) driver;
        File src = ts.getScreenshotAs(OutputType.FILE);
        File dest = new File(destination);
        try {
            // FileUtils.copyFile creates the destination directory if it does not exist
            FileUtils.copyFile(src, dest);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return destination;
    }
}
Explanation:
- captureScreenshot Method:
- Captures the screenshot and saves it to the specified location.
- Returns the path of the saved screenshot for attachment.
2. Modify the Listener to Use the Screenshot Utility:
Updated MyListener.java
package day46;

import org.openqa.selenium.WebDriver;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;

import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.Status;

public class MyListener implements ITestListener {

    ExtentReports extent = ExtentReportManager.getInstance();
    ExtentTest test;

    @Override
    public void onStart(ITestContext context) {
        System.out.println("Test Execution Started: " + context.getName());
    }

    @Override
    public void onFinish(ITestContext context) {
        extent.flush();
        System.out.println("Test Execution Finished: " + context.getName());
    }

    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("Test Started: " + result.getName());
        test = extent.createTest(result.getMethod().getMethodName());
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("Test Passed: " + result.getName());
        test.log(Status.PASS, "Test Passed");
    }

    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("Test Failed: " + result.getName());
        test.log(Status.FAIL, result.getThrowable());

        // Capture the screenshot and attach it to the report entry.
        Object currentClass = result.getInstance();
        WebDriver driver = ((OrangeHRMTest) currentClass).driver;
        String screenshotPath = ScreenshotUtil.captureScreenshot(driver, result.getName());
        // In ExtentReports 5.x this call does not declare a checked exception;
        // on 4.x it threw IOException and needed a try/catch.
        test.addScreenCaptureFromPath(screenshotPath);
    }

    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("Test Skipped: " + result.getName());
        test.log(Status.SKIP, result.getThrowable());
    }

    // Other methods can be overridden as needed
}
Explanation:
- Capturing screenshots on failure:
  - Retrieves the WebDriver instance from the test class.
  - Captures the screenshot using ScreenshotUtil.
  - Attaches the screenshot to the Extent Report.
- Status logging: Uses the Status enum for more detailed logging.
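The cast to OrangeHRMTest is the fragile step: it breaks as soon as a second test class runs under the same listener. A plain-Java sketch of the getter-based alternative suggested in the pitfalls section — the DriverAware interface and the String stand-in for WebDriver are illustrative assumptions, not TestNG or Selenium APIs:

```java
// Any test class that exposes its driver implements this interface,
// so the listener never needs to know concrete test-class names.
interface DriverAware {
    String getDriver(); // stands in for WebDriver in this illustration
}

public class DriverLookupDemo {

    // The listener-side lookup: ask via the interface, fall back to null.
    public static String driverFrom(Object testInstance) {
        if (testInstance instanceof DriverAware) {
            return ((DriverAware) testInstance).getDriver();
        }
        return null; // no driver available; skip the screenshot
    }

    public static void main(String[] args) {
        DriverAware fakeTest = () -> "chrome-session-42";
        System.out.println(driverFrom(fakeTest));     // chrome-session-42
        System.out.println(driverFrom(new Object())); // null
    }
}
```

A null check before calling captureScreenshot then guards against the NullPointerException the pitfalls section warns about.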
Enhancing Report Details
Adding Test Case IDs and Class Names
Instead of logging individual test methods, it’s more manageable to log test case IDs and class names, especially when dealing with a large number of tests.
Modification in Listener:
- Update Test Creation with Class Name and Test Case ID:
@Override
public void onTestStart(ITestResult result) {
    System.out.println("Test Started: " + result.getName());
    String testName = result.getMethod().getMethodName();
    String className = result.getMethod().getRealClass().getSimpleName();
    test = extent.createTest(className + " :: " + testName);
}
Explanation:
- Combines the class name with the test method name for better identification in the report.
Assign Test Case IDs (Optional):
@Override
public void onTestStart(ITestResult result) {
    System.out.println("Test Started: " + result.getName());
    String testCaseID = "TC_" + result.getMethod().getMethodName(); // Example ID
    String className = result.getMethod().getRealClass().getSimpleName();
    test = extent.createTest(className + " :: " + testCaseID);
}
Benefit:
- Enhances traceability between test cases and reports.
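Since the label is plain string concatenation, it can be factored into a small helper so the format stays consistent across listeners (names invented for the demo):

```java
public class TestCaseIdDemo {

    // Builds the "ClassName :: TC_method" label used in the report entries above.
    public static String reportLabel(String className, String methodName) {
        return className + " :: TC_" + methodName;
    }

    public static void main(String[] args) {
        System.out.println(reportLabel("OrangeHRMTest", "testLogo")); // OrangeHRMTest :: TC_testLogo
    }
}
```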
4. Common Pitfalls and How to Avoid Them
- Overwriting Reports:
  - Issue: Hardcoding report names leads to overwriting previous reports.
  - Solution: Implement dynamic report naming using timestamps, as shown in the ExtentReportManager class.
- Missing Screenshots on Failures:
  - Issue: Failing to capture screenshots makes debugging difficult.
  - Solution: Integrate screenshot capture in the onTestFailure method of the listener.
- Multiple Listener Integrations:
  - Issue: Adding multiple listeners can cause duplicate logging and reports.
  - Solution: Use a single, centralized listener class for consistency.
- Not Managing WebDriver Instances Properly:
  - Issue: Incorrectly retrieving WebDriver instances can lead to NullPointerException.
  - Solution: Ensure that the WebDriver instance is accessible to the listener, possibly by using getter methods or thread-safe mechanisms.
- Ignoring Listener Configuration in XML:
  - Issue: Forgetting to integrate the listener in the TestNG XML leads to listeners not being triggered.
  - Solution: Always verify that the listener is correctly referenced in the XML file.
- Not Flushing the Extent Reports:
  - Issue: Failing to call extent.flush() prevents the report from being written.
  - Solution: Ensure that extent.flush() is called in the onFinish method.
5. Conclusion
TestNG Listeners and Extent Reports significantly enhance your test automation framework by enabling dynamic post-actions and generating comprehensive, user-friendly reports. By implementing listeners, you can automate tasks like logging, reporting, and screenshot capturing based on test outcomes. Integrating Extent Reports with listeners provides a robust reporting mechanism that offers detailed insights into your test executions.
Key Takeaways:
- TestNG Listeners (ITestListener):
- Allow execution of custom code at various points in the test lifecycle.
- Facilitate actions like logging, reporting, and resource management.
- Implementing Listeners:
  - Can be done by implementing the ITestListener interface or extending the TestListenerAdapter class.
  - Integration via TestNG XML is preferred for scalability and maintainability.
- Extent Reports:
- Offer advanced, customizable, and visually appealing reports.
- Support features like dynamic naming, system information, and screenshot attachments.
- Best Practices:
- Use a centralized listener class to manage post-actions uniformly.
- Avoid hardcoding report names and system info; implement dynamic retrieval.
- Ensure proper directory structures for reports and screenshots.
- Limit the number of listener classes to prevent conflicts and redundancy.
Next Steps:
- Enhance Extent Reports:
- Implement dynamic system information retrieval.
- Add functionality to capture and attach screenshots on test failures.
- Integrate with Continuous Integration (CI) Tools:
- Automate test executions and report generation using tools like Jenkins, GitLab CI, or CircleCI.
- Explore Advanced Reporting Features:
- Customize report themes and layouts.
- Incorporate interactive elements and detailed logs.
- Implement Additional Listeners:
  - Explore other TestNG listeners like IReporter, ISuiteListener, etc., for more granular control.
Happy Testing! 🚀