Working with Shadow DOM Elements using Webdriver

When we try to find Shadow DOM elements using ordinary Selenium locators, WebDriver throws a NoSuchElementException. To access these elements, we need to use the JavascriptExecutor's executeScript() method. If you look at the DOM structure, every element that hosts a Shadow DOM also exposes a shadowRoot property, which gives access to the underlying elements.

Before looking at the example, let us first see what the DOM and the Shadow DOM are.
DOM - It is a programming interface that treats an HTML, XHTML, or XML document as a tree structure wherein each node is an object representing a part of the document.

Shadow DOM provides encapsulation for the JavaScript, CSS, and templating in a Web Component. Shadow DOM is just normal DOM with two differences: how it is created and used, and how it behaves in relation to the rest of the page. Shadow DOM separates content from presentation, thereby eliminating naming conflicts and improving code expression.

We will look at an example using Chrome's downloads page. As the image below shows, to get the header text (the h1 tag) we need to navigate through 3 nested shadow root elements.

Selenium webdriver to access Shadow Elements

In order to do this, we first get the WebElement of the shadow root using the method below. The command returns a WebElement from the executeScript() call.

public WebElement expandRootElement(WebElement element) {
	WebElement ele = (WebElement) ((JavascriptExecutor) driver)
			.executeScript("return arguments[0].shadowRoot", element);
	return ele;
}

Now, using this root WebElement, we find the next shadow host, expand its root, and proceed until we reach the final shadow root. In our example, we navigate down to shadow root 3 to access the header tag h1.
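Instead of calling expandRootElement() once per level, the whole traversal can also be expressed as a single JavaScript expression passed to executeScript(). The sketch below builds such an expression from a list of CSS selectors; buildShadowPiercingScript is a hypothetical helper name (not part of Selenium), and the commented-out usage assumes the same driver setup as the example above.

```java
import java.util.List;

public class ShadowScriptBuilder {

	// Builds a JS expression that starts from arguments[0].shadowRoot and,
	// for each selector, descends into the matching child's shadow root.
	public static String buildShadowPiercingScript(List<String> cssSelectors) {
		StringBuilder script = new StringBuilder("return arguments[0].shadowRoot");
		for (String selector : cssSelectors) {
			script.append(".querySelector('").append(selector).append("').shadowRoot");
		}
		return script.toString();
	}

	public static void main(String[] args) {
		String js = buildShadowPiercingScript(List.of("downloads-toolbar", "cr-toolbar"));
		System.out.println(js);
		// Prints:
		// return arguments[0].shadowRoot.querySelector('downloads-toolbar').shadowRoot.querySelector('cr-toolbar').shadowRoot
		//
		// The generated script could then be run in one executeScript() call
		// (hypothetical usage, assuming the driver from the example above):
		// WebElement shadowRoot3 = (WebElement) ((JavascriptExecutor) driver)
		//     .executeScript(js, driver.findElement(By.tagName("downloads-manager")));
	}
}
```

This keeps only one round trip between the test and the browser, at the cost of a less debuggable failure when one of the intermediate selectors does not match.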

Here is the complete example of accessing Shadow DOM elements (Polymer elements) through Selenium WebDriver's JavascriptExecutor in Java.

package com.easy;

import org.testng.Assert;
import org.openqa.selenium.By;
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.annotations.AfterTest;
import org.testng.annotations.BeforeTest;
import org.testng.annotations.Test;

public class ShadowDOMExample {

	WebDriver driver;
	String driverPath = "F:/Jars/chromedriver/";

	@BeforeTest
	public void setUp() {
		System.out.println("Opening chrome browser");
		System.setProperty("webdriver.chrome.driver", driverPath + "chromedriver.exe");
		driver = new ChromeDriver();
	}

	@Test
	public void testGetText_FromShadowDOMElements() {
		System.out.println("Open Chrome downloads");
		driver.get("chrome://downloads/");

		System.out.println("Validate downloads page header text");
		WebElement root1 = driver.findElement(By.tagName("downloads-manager"));

		// Get shadow root element
		WebElement shadowRoot1 = expandRootElement(root1);

		WebElement root2 = shadowRoot1.findElement(By.cssSelector("downloads-toolbar"));
		WebElement shadowRoot2 = expandRootElement(root2);

		WebElement root3 = shadowRoot2.findElement(By.cssSelector("cr-toolbar"));
		WebElement shadowRoot3 = expandRootElement(root3);

		String actualHeading = shadowRoot3.findElement(By.cssSelector("div[id=leftContent]>h1")).getText();
		// Verify header title
		Assert.assertEquals(actualHeading, "Downloads");

	}

	// Returns the shadow root of the given element as a WebElement
	public WebElement expandRootElement(WebElement element) {
		WebElement ele = (WebElement) ((JavascriptExecutor) driver)
				.executeScript("return arguments[0].shadowRoot", element);
		return ele;
	}

	@AfterTest
	public void tearDown() {
		driver.quit();
	}
}

The Chrome browser has native support for Shadow DOM from version 49. So Chrome renders the Shadow DOM when the application is built with Polymer web components.

When accessing Shadow DOM elements with Selenium in Chrome, traditional locators throw a NoSuchElementException, and we have to use the JavascriptExecutor's executeScript() method instead. This is not the case with Firefox: because Shadow DOM is not currently supported there by default, all Selenium locators work as usual.

The image below shows which browsers have native support for Shadow DOM and which have support in development.

Shadow DOM supported browsers

Firefox does not support Shadow DOM by default (as of Firefox v52), but it can be enabled. To enable Web Components in Firefox, type about:config in the address bar to open the configuration page and dismiss any warning that appears. Then search for the preference called dom.webcomponents.enabled and set it to true. Be cautious before making any changes on the config page.
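The same preference can be set programmatically in a Selenium test instead of editing about:config by hand. This is a configuration sketch, assuming selenium-java with GeckoDriver on the classpath; it has not been verified against every Firefox release.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.firefox.FirefoxOptions;
import org.openqa.selenium.firefox.FirefoxProfile;

public class FirefoxShadowDomSetup {

	public static WebDriver createDriver() {
		// Same effect as flipping dom.webcomponents.enabled in about:config
		FirefoxProfile profile = new FirefoxProfile();
		profile.setPreference("dom.webcomponents.enabled", true);

		FirefoxOptions options = new FirefoxOptions();
		options.setProfile(profile);
		return new FirefoxDriver(options);
	}
}
```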

You can check the status here. Firefox internally uses a polyfill, a fallback written in JavaScript that allows functionality you expect in modern browsers to work in older ones; a typical example is supporting canvas (an HTML5 feature) in older browsers.

Let us try to execute the test below in the Firefox browser (executed on Firefox v49.0.1) against a website that was developed with Polymer web components and has Shadow DOM elements.

Application URL: https://shop.polymer-project.org/

	// driver (WebDriver) and wait (WebDriverWait) are fields on the test class
	@Test
	public void testClickMenuItem_And_ValidatePageHeader() throws InterruptedException {
		System.out.println("Open Online Shop");
		driver.get("https://shop.polymer-project.org/");
		
		wait = new WebDriverWait(driver, 5);
		By byMenu = By.linkText("Ladies Outerwear");
		wait.until(ExpectedConditions.visibilityOfElementLocated(byMenu));
		driver.findElement(byMenu).click();
		
		//Validate Page title
		By byHeading = By.cssSelector("header>h1");
		wait.until(ExpectedConditions.presenceOfElementLocated(byHeading));
		
		String getActualHeaderText = driver.findElement(byHeading).getText();
		Assert.assertEquals(getActualHeaderText, "Ladies Outerwear");
	}

The above test executes without any issues in the Firefox browser using GeckoDriver.

But if executed in Chrome, it throws a NoSuchElementException, because Chrome renders the Shadow DOM natively by default and the elements are not reachable with ordinary locators. Try executing the above test in both Firefox and Chrome to see the difference.
