I've been trying to scrape some information off this real estate website with Selenium. However, when I access the website I need to accept cookies to continue. This only happens when the bot accesses the website, not when I browse it manually. When I try to find the corresponding element, either by XPath or by id (as I find it when I inspect the page manually), I get the following error:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="uc-btn-accept-banner"]"}
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

PATH = "/usr/bin/chromedriver"
driver = webdriver.Chrome(PATH)
driver.get("https://www.immoweb.be/en/search/house/for-sale?countries=BE&page=1&orderBy=relevance")
driver.find_element_by_xpath('//*[@id="uc-btn-accept-banner"]').click()

Does anyone know how to get around this? Why can't I find the element?
Below is an image of the accept cookies popup.
This is the HTML corresponding to the "Keep browsing" button. The XPath is as above.
<button aria-label="Keep browsing" id="uc-btn-accept-banner" class="uc-btn-new uc-btn-accept">Keep browsing <span id="uc-optin-timer-display"></span></button> 
Set a breakpoint (or pause the script) right after the driver.get() call. Now you can inspect the page directly in Chrome to discover how to access the popup. Except for the alert = driver.switch_to.alert part, your code works.