How to Scrape WebElements: Periodic Table Data Guide
Extract precise chemical element data from WebElements. Scrape atomic weights, physical properties, and discovery history for research and AI applications.
About WebElements
Learn what WebElements offers and what valuable data can be extracted from it.
WebElements is a premier online periodic table maintained by Mark Winter at the University of Sheffield. Launched in 1993, it was the first periodic table on the World Wide Web and has since become a high-authority resource for students, academics, and professional chemists. The site offers deep, structured data on every known chemical element, from standard atomic weights to complex electronic configurations.
The value of scraping WebElements lies in its high-quality, peer-reviewed scientific data. For developers building educational tools, researchers conducting trend analysis across the periodic table, or materials scientists training machine learning models, WebElements provides a reliable and technically rich source of truth that is difficult to aggregate manually.

Why Scrape WebElements?
Discover the business value and use cases for extracting data from WebElements.
- Collection of high-quality scientific data for educational tool development.
- Aggregating element properties for materials science research and machine learning models.
- Automated population of laboratory inventory systems with chemical specifications.
- Historical analysis of element discoveries and scientific advancement.
- Creation of comprehensive chemical property datasets for academic publications.
Scraping Challenges
Technical challenges you may encounter when scraping WebElements.
- Data is spread across multiple sub-pages per element (e.g., /history, /compounds); a crawling sketch follows below.
- Older table-based HTML layouts require precise selection logic.
- Domain-name confusion with Selenium's 'WebElement' class when searching for support.
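To work around the first challenge, crawl each element's main page together with its sub-pages. The sketch below is a minimal illustration, assuming sub-pages follow the /<element>/<section>/ pattern mentioned above (e.g., /gold/history/); the fetch_element_pages helper and the section list are hypothetical, and requests plus beautifulsoup4 are assumed to be installed.

import time
import requests
from bs4 import BeautifulSoup

BASE = 'https://www.webelements.com'
# Sub-pages assumed to follow the /<element>/<section>/ pattern described above
SECTIONS = ['', 'history', 'compounds']

def fetch_element_pages(element):
    """Fetch the main page and selected sub-pages for one element."""
    pages = {}
    for section in SECTIONS:
        url = f'{BASE}/{element}/{section}'.rstrip('/') + '/'
        response = requests.get(url, headers={'User-Agent': 'Mozilla/5.0'})
        response.raise_for_status()
        pages[section or 'overview'] = BeautifulSoup(response.text, 'html.parser')
        time.sleep(1)  # pause between sub-page requests
    return pages

pages = fetch_element_pages('gold')
print({name: soup.title.get_text(strip=True) for name, soup in pages.items()})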
Scrape WebElements with AI
No coding required. Extract data in minutes with AI-powered automation.
How It Works
1. Describe What You Need: Tell the AI what data you want to extract from WebElements. Just type it in plain language — no coding or selectors needed.
2. AI Extracts the Data: Our artificial intelligence navigates WebElements, handles dynamic content, and extracts exactly what you asked for.
3. Get Your Data: Receive clean, structured data ready to export as CSV, JSON, or send directly to your apps and workflows.
Why Use AI for Scraping
AI makes it easy to scrape WebElements without writing any code: describe the data you want in plain language and the platform extracts it automatically. Key advantages:
- No-code navigation through hierarchical element structures.
- Handles extraction of complex scientific tables automatically.
- Cloud execution allows for full-dataset extraction without local downtime.
- Easy export to CSV/JSON for direct use in scientific analysis tools.
- Scheduled monitoring can detect updates to confirmed element data.
No-Code Web Scrapers for WebElements
Point-and-click alternatives to AI-powered scraping
Several no-code tools like Browse.ai, Octoparse, Axiom, and ParseHub can help you scrape WebElements. These tools use visual interfaces to select elements, but they come with trade-offs compared to AI-powered solutions.
Typical Workflow with No-Code Tools
- Install browser extension or sign up for the platform
- Navigate to the target website and open the tool
- Point-and-click to select data elements you want to extract
- Configure CSS selectors for each data field
- Set up pagination rules to scrape multiple pages
- Handle CAPTCHAs (often requires manual solving)
- Configure scheduling for automated runs
- Export data to CSV, JSON, or connect via API
Common Challenges
- Learning curve: Understanding selectors and extraction logic takes time
- Selectors break: Website changes can break your entire workflow
- Dynamic content issues: JavaScript-heavy sites often require complex workarounds
- CAPTCHA limitations: Most tools require manual intervention for CAPTCHAs
- IP blocking: Aggressive scraping can get your IP banned (see the throttling sketch below)
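These rate-limit and IP-blocking problems apply just as much to hand-written scripts as to no-code tools. If you write your own code (see the next section), pacing requests and backing off on transient failures goes a long way. Below is a minimal sketch; the polite_get helper is hypothetical and the delays and retry counts are illustrative rather than tuned values.

import random
import time
import requests

def polite_get(url, max_retries=3, base_delay=2.0):
    """GET with a fixed crawl delay and exponential backoff on transient failures."""
    headers = {'User-Agent': 'Mozilla/5.0 (compatible; research-scraper)'}
    for attempt in range(max_retries):
        try:
            response = requests.get(url, headers=headers, timeout=15)
            if response.status_code in (429, 503):
                # The server is rate-limiting: back off before retrying
                time.sleep(base_delay * (2 ** attempt) + random.random())
                continue
            response.raise_for_status()
            time.sleep(base_delay)  # pause between successful requests
            return response
        except requests.RequestException:
            time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError(f'Failed to fetch {url} after {max_retries} attempts')

html = polite_get('https://www.webelements.com/iron/').text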
How to Scrape WebElements with Code
Python + Requests

import requests
from bs4 import BeautifulSoup
import time

# Target URL for a specific element (e.g., Gold)
url = 'https://www.webelements.com/gold/'
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36'}

def scrape_element(element_url):
    try:
        response = requests.get(element_url, headers=headers)
        response.raise_for_status()
        soup = BeautifulSoup(response.text, 'html.parser')

        # Extract the element name from the H1 tag
        name = soup.find('h1').get_text().strip()

        # Extract the atomic number by locating its table header cell
        atomic_number = soup.find('th', string=lambda s: s and 'Atomic number' in s).find_next('td').text.strip()

        print(f'Element: {name}, Atomic Number: {atomic_number}')
    except Exception as e:
        print(f'An error occurred: {e}')

    # Follow the robots.txt crawl-delay recommendation
    time.sleep(1)

scrape_element(url)

When to Use
Best for static HTML pages where content is loaded server-side. The fastest and simplest approach when JavaScript rendering isn't required.
Advantages
- Fastest execution (no browser overhead)
- Lowest resource consumption
- Easy to parallelize with asyncio (see the sketch after the limitations list)
- Great for APIs and static pages
Limitations
- Cannot execute JavaScript
- Fails on SPAs and dynamic content
- May struggle with complex anti-bot systems
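Because plain HTTP requests are lightweight, per-element pages can also be fetched concurrently. The sketch below shows one way to do this with asyncio, assuming the aiohttp and beautifulsoup4 packages are installed and that element pages live at /<element-name>/ as in the examples on this page; the element slugs, concurrency limit, and one-second pause are illustrative.

import asyncio
import aiohttp
from bs4 import BeautifulSoup

ELEMENTS = ['hydrogen', 'helium', 'lithium', 'beryllium']  # sample element slugs

async def fetch_name(session, semaphore, slug):
    url = f'https://www.webelements.com/{slug}/'
    async with semaphore:                  # limit how many requests run at once
        async with session.get(url) as resp:
            html = await resp.text()
        await asyncio.sleep(1)             # stay polite between requests
    soup = BeautifulSoup(html, 'html.parser')
    return slug, soup.find('h1').get_text(strip=True)

async def main():
    semaphore = asyncio.Semaphore(2)
    headers = {'User-Agent': 'Mozilla/5.0'}
    async with aiohttp.ClientSession(headers=headers) as session:
        tasks = [fetch_name(session, semaphore, slug) for slug in ELEMENTS]
        for slug, name in await asyncio.gather(*tasks):
            print(slug, '->', name)

asyncio.run(main())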
Python + Playwright
from playwright.sync_api import sync_playwright

def run():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()

        # Elements are linked from the main periodic table
        page.goto('https://www.webelements.com/iron/')

        # Wait for the property table to be present
        page.wait_for_selector('table')

        element_data = {
            'name': page.inner_text('h1'),
            'density': page.locator('th:has-text("Density") + td').inner_text().strip()
        }
        print(element_data)
        browser.close()

run()

Python + Scrapy
import scrapy

class ElementsSpider(scrapy.Spider):
    name = 'elements'
    start_urls = ['https://www.webelements.com/']

    def parse(self, response):
        # Follow every element link in the periodic table
        for link in response.css('table a[title]::attr(href)'):
            yield response.follow(link, self.parse_element)

    def parse_element(self, response):
        yield {
            'name': response.css('h1::text').get().strip(),
            'symbol': response.xpath('//th[contains(text(), "Symbol")]/following-sibling::td/text()').get().strip(),
            'atomic_number': response.xpath('//th[contains(text(), "Atomic number")]/following-sibling::td/text()').get().strip(),
        }

Node.js + Puppeteer
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://www.webelements.com/silver/');

  const data = await page.evaluate(() => {
    const name = document.querySelector('h1').innerText;
    const meltingPoint = Array.from(document.querySelectorAll('th'))
      .find(el => el.textContent.includes('Melting point'))
      ?.nextElementSibling.innerText;
    return { name, meltingPoint };
  });

  console.log('Extracted Data:', data);
  await browser.close();
})();

What You Can Do With WebElements Data
Explore practical applications and insights from WebElements data.
Use Automatio to extract data from WebElements and build these applications without writing code.
- Materials Science AI Training: training machine learning models to predict the properties of new alloys based on elemental attributes.
  - Extract physical properties for all metallic elements.
  - Clean and normalize values like density and melting points.
  - Input the data into regression or predictive material models.
  - Verify predictions against existing experimental alloy data.
- Educational App Content: populating interactive periodic tables for chemistry students with peer-reviewed data.
  - Scrape atomic numbers, symbols, and element descriptions.
  - Extract historical context and discovery details.
  - Organize the data by periodic group and block.
  - Integrate into a user interface with visual crystal structures.
- Chemical Trend Analysis: visualizing periodic trends like ionization energy or atomic radius across periods and groups.
  - Gather property data for every element in numerical order.
  - Categorize elements into their respective groups.
  - Use graphing libraries to visualize trends (see the plotting sketch after this list).
  - Identify and analyze anomalous data points in specific blocks.
- Lab Inventory Management: auto-populating chemical management systems with physical safety and density data.
  - Map internal inventory list to WebElements entries.
  - Scrape density, storage hazards, and melting point data.
  - Update the centralized lab database via API.
  - Generate automated safety warnings for high-risk elements.
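As a worked example of the trend-analysis steps above, the sketch below plots one property against atomic number. It assumes you have already exported scraped data to a CSV file named elements.csv with illustrative column names atomic_number and atomic_radius_pm, and that matplotlib is installed.

import csv
import matplotlib.pyplot as plt

# Assumes a CSV produced by one of the scrapers above; the file name and
# column names (atomic_number, atomic_radius_pm) are illustrative.
atomic_numbers, radii = [], []
with open('elements.csv', newline='') as f:
    for row in csv.DictReader(f):
        if row.get('atomic_radius_pm'):
            atomic_numbers.append(int(row['atomic_number']))
            radii.append(float(row['atomic_radius_pm']))

plt.plot(atomic_numbers, radii, marker='o')
plt.xlabel('Atomic number')
plt.ylabel('Atomic radius (pm)')
plt.title('Periodic trend: atomic radius vs. atomic number')
plt.show()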
Supercharge your workflow with AI Automation
Automatio combines the power of AI agents, web automation, and smart integrations to help you accomplish more in less time.
Pro Tips for Scraping WebElements
Expert advice for successfully extracting data from WebElements.
- Respect the Crawl-delay: 1 directive specified in the site's robots.txt file.
- Use the atomic number as your primary key for database consistency.
- Crawl the 'history' and 'compounds' sub-pages for a complete dataset per element.
- Focus on table-based selectors, as the site structure is highly traditional and stable.
- Verify data against IUPAC standards if used for critical research.
- Store numeric values like density or melting points as floats for easier analysis (see the parsing sketch below).
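A minimal sketch of the last two tips: key records by atomic number and coerce property strings to floats before analysis. The parse_numeric helper is hypothetical, and the example density strings are only illustrative of the kind of values the site publishes.

import re

def parse_numeric(value_text):
    """Extract the leading number from a property string such as '19.3 g cm-3'."""
    match = re.search(r'[-+]?\d+(?:\.\d+)?', value_text.replace(',', ''))
    return float(match.group()) if match else None

# Records keyed by atomic number, as recommended above (values are illustrative)
elements = {
    79: {'name': 'Gold', 'density_g_cm3': parse_numeric('19.3 g cm-3')},
    26: {'name': 'Iron', 'density_g_cm3': parse_numeric('7.87 g cm-3')},
}
print(elements[79]['density_g_cm3'])  # 19.3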
Testimonials
What Our Users Say
Join thousands of satisfied users who have transformed their workflow
Jonathan Kogan
Co-Founder/CEO, rpatools.io
Automatio is one of the most used for RPA Tools both internally and externally. It saves us countless hours of work and we realized this could do the same for other startups and so we choose Automatio for most of our automation needs.
Mohammed Ibrahim
CEO, qannas.pro
I have used many tools over the past 5 years, Automatio is the Jack of All trades.. !! it could be your scraping bot in the morning and then it becomes your VA by the noon and in the evening it does your automations.. its amazing!
Ben Bressington
CTO, AiChatSolutions
Automatio is fantastic and simple to use to extract data from any website. This allowed me to replace a developer and do tasks myself as they only take a few minutes to setup and forget about it. Automatio is a game changer!
Sarah Chen
Head of Growth, ScaleUp Labs
We've tried dozens of automation tools, but Automatio stands out for its flexibility and ease of use. Our team productivity increased by 40% within the first month of adoption.
David Park
Founder, DataDriven.io
The AI-powered features in Automatio are incredible. It understands context and adapts to changes in websites automatically. No more broken scrapers!
Emily Rodriguez
Marketing Director, GrowthMetrics
Automatio transformed our lead generation process. What used to take our team days now happens automatically in minutes. The ROI is incredible.
Related Web Scraping
- How to Scrape GitHub | The Ultimate 2025 Technical Guide
- How to Scrape RethinkEd: A Technical Data Extraction Guide
- How to Scrape Britannica: Educational Data Web Scraper
- How to Scrape Wikipedia: The Ultimate Web Scraping Guide
- How to Scrape Pollen.com: Local Allergy Data Extraction Guide
- How to Scrape Weather.com: A Guide to Weather Data Extraction
- How to Scrape Worldometers for Real-Time Global Statistics
- How to Scrape American Museum of Natural History (AMNH)
Frequently Asked Questions About WebElements
Find answers to common questions about WebElements