
SEO Glossary

Master the language of search engine optimization. 35 essential SEO terms defined clearly and concisely.

A

Alt Text

Alt text (alternative text) is an HTML attribute added to image tags that describes the content of an image. Search engines cannot reliably interpret images on their own, so alt text helps them understand what an image depicts. Well-written alt text also improves accessibility for visually impaired users who rely on screen readers. It should be descriptive, concise, and include relevant keywords where natural.
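A minimal example (the filename and description are illustrative placeholders):

```html
<!-- Descriptive, concise alt text; filename and wording are placeholders -->
<img src="golden-retriever-puppy.jpg"
     alt="Golden retriever puppy playing fetch in a grassy park">
```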

Anchor Text

Anchor text is the clickable, visible text of a hyperlink. When a link points from one page to another, the anchor text signals to search engines what the linked page is about. Exact-match anchor text (matching the target keyword exactly) can influence rankings but should be used naturally to avoid over-optimization penalties. Common types include branded, generic, partial-match, and exact-match anchor text.
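The common anchor text types can be sketched like this (URLs and link phrasing are placeholders):

```html
<a href="https://example.com/">Example Brand</a>              <!-- branded -->
<a href="https://example.com/shoes/">click here</a>           <!-- generic -->
<a href="https://example.com/shoes/">comfortable shoes</a>    <!-- partial-match -->
<a href="https://example.com/shoes/">running shoes</a>        <!-- exact-match -->
```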

B

Bounce Rate

Bounce rate is the percentage of visitors who land on a page and leave without interacting further (clicking another link, filling a form, etc.). A high bounce rate can indicate that content is not meeting user expectations, or that visitors found what they needed immediately. Context matters — informational pages naturally have higher bounce rates than conversion-focused pages. In GA4, Google initially replaced bounce rate with 'engagement rate'; it has since been reintroduced, redefined as the percentage of sessions that were not engaged.

C

Canonical URL

A canonical URL is the preferred version of a web page when multiple URLs show similar or identical content. The canonical tag (<link rel='canonical' href='...'>) tells search engines which URL to index and rank. This prevents duplicate content issues caused by URL parameters, HTTPS vs HTTP versions, or www vs non-www variations. Using canonicals correctly consolidates link equity to a single preferred URL.
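A typical canonical tag sits in the page's head section (the URL is a placeholder):

```html
<head>
  <link rel='canonical' href='https://example.com/preferred-page/'>
</head>
```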

Click-Through Rate (CTR)

Click-through rate (CTR) is the percentage of users who click on a search result after seeing it. It is calculated as (Clicks ÷ Impressions) × 100. A higher CTR means your title tag and meta description are compelling and relevant to the search query. Improving CTR can increase organic traffic without changing your ranking position. Factors that influence CTR include title tag, meta description, URL, rich snippets, and featured snippets.

Crawling

Crawling is the process by which search engine bots (also called spiders or crawlers) systematically browse the web to discover and index new and updated pages. Googlebot is Google's primary crawler. When a crawler visits a page, it follows links to discover more pages and collects data about content, structure, and metadata. Pages that are not crawlable cannot be indexed or ranked in search results.

D

Domain Authority

Domain Authority (DA) is a proprietary score developed by Moz that predicts how likely a domain is to rank in search engine result pages (SERPs). It ranges from 1 to 100 — higher scores indicate greater likelihood of ranking. DA is calculated based on factors like the number and quality of backlinks. It is a comparative metric, not a Google ranking factor. Similar metrics include Ahrefs' Domain Rating (DR) and Semrush's Authority Score.

Dofollow

A dofollow link is a standard hyperlink that passes PageRank (link equity or 'link juice') from the linking page to the linked page. By default, all links are dofollow unless specified otherwise. Dofollow links from authoritative pages are valuable for SEO because they contribute to the linked page's authority and ranking potential. The opposite of a dofollow link is a nofollow link.

Duplicate Content

Duplicate content refers to blocks of content that are identical or very similar across multiple URLs on the same site or across different sites. It can confuse search engines about which version to rank and dilute link equity. Common causes include URL parameters, HTTP/HTTPS variations, printer-friendly pages, and scraped content. Canonical tags, 301 redirects, and noindex tags are common solutions for managing duplicate content.

G

Google Search Console

Google Search Console (GSC) is a free web service provided by Google that helps website owners monitor, maintain, and troubleshoot their site's presence in Google Search results. It provides data on search queries, impressions, clicks, CTR, average position, index coverage, Core Web Vitals, mobile usability, and more. GSC is an essential tool for diagnosing SEO issues and understanding how Google sees your site.

H

H1 Tag

The H1 tag is the primary heading of a web page, marked up as <h1> in HTML. It is the most important heading on the page and signals to search engines and users what the page's main topic is. Best practice is to use a single H1 per page that includes the primary keyword naturally. It should be descriptive, concise, and match the intent of the page. H2 and H3 tags are used for subheadings.
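A simple heading hierarchy might look like this (the topic is illustrative):

```html
<h1>How to Bake Sourdough Bread</h1>   <!-- one H1: the page's main topic -->
<h2>Step 1: Prepare the Starter</h2>   <!-- H2s structure the subtopics -->
<h3>Choosing a Flour</h3>              <!-- H3s nest under H2s -->
```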

Hreflang

Hreflang is an HTML attribute (written as hreflang in lowercase) used to specify the language and regional targeting of a web page. It tells search engines which version of a page to show to users based on their language or region — for example, showing an English page to US users and a Spanish page to users in Spain. Correct hreflang implementation prevents duplicate content issues for multilingual sites and ensures users see the most relevant version of a page.
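A sketch of hreflang annotations in the head section for the US-English and Spain-Spanish example above (URLs are placeholders; x-default marks the fallback version):

```html
<link rel='alternate' hreflang='en-us' href='https://example.com/en-us/'>
<link rel='alternate' hreflang='es-es' href='https://example.com/es-es/'>
<link rel='alternate' hreflang='x-default' href='https://example.com/'>
```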

I

Impressions

In SEO, an impression occurs each time a URL appears in a search result that a user sees — even if they scroll past it or do not click. Impressions are reported in Google Search Console. High impressions with low clicks indicate a low click-through rate, which may be improved by optimizing title tags and meta descriptions. Impressions are a key metric for understanding your site's visibility in search results.

Indexing

Indexing is the process by which search engines store and organize web page content in their database (the index) after crawling. Only indexed pages can appear in search results. Pages may fail to be indexed due to noindex tags, crawl errors, thin content, or being blocked by robots.txt. You can check index status using Google Search Console's URL Inspection tool or the 'site:' search operator.

K

Keyword Density

Keyword density is the percentage of times a keyword appears in a piece of content relative to the total word count. Historically, some SEOs tried to 'stuff' pages with keywords to manipulate rankings, but modern search engines use sophisticated NLP to understand context. There is no ideal keyword density percentage. Focus on writing naturally for users while including your target keyword and related terms where they fit contextually.

Keyword Difficulty

Keyword difficulty (KD) is a metric that estimates how hard it would be to rank on the first page of search results for a given keyword. It typically ranges from 0 to 100 — higher scores indicate more competition. KD is calculated based on factors such as the authority of pages currently ranking, the number of backlinks they have, and the quality of their content. It helps SEOs prioritize which keywords to target first.

L

Long-Tail Keyword

A long-tail keyword is a specific, usually longer search query (typically 3+ words) that targets a niche audience. Long-tail keywords have lower search volume but also lower competition, making them easier to rank for. They often indicate higher purchase intent. For example, 'buy red running shoes for women size 8' is a long-tail version of the head keyword 'running shoes'. Targeting long-tail keywords is an effective strategy for newer sites.

M

Meta Description

A meta description is an HTML attribute that provides a brief summary (typically 150-160 characters) of a web page's content. While it is not a direct ranking factor, a well-written meta description can significantly improve click-through rate (CTR) by enticing users to click in the search results. If a meta description is not specified, Google may auto-generate one from the page content. Include your target keyword naturally in the meta description.

Meta Tag

Meta tags are HTML elements in the <head> section of a web page that provide structured metadata about the page. Key SEO-related meta tags include the title tag (<title>), meta description (<meta name='description'>), meta robots (<meta name='robots'>), and viewport meta tag. While many meta tags have minimal direct impact on rankings, they influence how search engines interpret and display your pages in results.
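The key SEO-related meta tags listed above might appear together like this (all values are placeholders):

```html
<head>
  <title>Page Title | Example Brand</title>
  <meta name='description' content='A brief summary of the page content.'>
  <meta name='robots' content='index, follow'>
  <meta name='viewport' content='width=device-width, initial-scale=1'>
</head>
```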

N

Nofollow

A nofollow link is a hyperlink with the rel='nofollow' attribute, which instructs search engines not to pass PageRank through the link. Nofollow links are commonly used for paid links, user-generated content (comments, forums), and links to untrusted sources. Google updated its nofollow guidance in 2019, introducing rel='sponsored' for paid links and rel='ugc' for user-generated content as additional link attributes.
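The three link attributes can be sketched as follows (URLs are placeholders):

```html
<a href='https://example.com/offer/' rel='sponsored'>Paid placement</a>
<a href='https://example.com/profile/' rel='ugc'>User comment link</a>
<a href='https://example.com/unvetted/' rel='nofollow'>Untrusted source</a>
```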

Noindex

The noindex directive tells search engines not to include a specific page in their search index. It can be implemented via the meta robots tag (<meta name='robots' content='noindex'>) or the X-Robots-Tag HTTP header. Pages with noindex applied will not appear in search results. It is commonly used for thank-you pages, internal search result pages, admin pages, and any pages you do not want ranked.
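The meta tag form looks like this:

```html
<!-- In the page's <head> -->
<meta name='robots' content='noindex'>
```

The equivalent HTTP header is `X-Robots-Tag: noindex`, which is useful for non-HTML files such as PDFs.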

O

Organic Traffic

Organic traffic refers to visitors who arrive at your website through unpaid search engine results, as opposed to paid traffic from PPC ads. It is generated when a page ranks in SERPs for a query a user searches. Increasing organic traffic is a primary goal of SEO. Key metrics for organic traffic include sessions, users, impressions, clicks, and average position, typically tracked via Google Analytics and Google Search Console.

P

PageRank

PageRank is Google's original algorithm, developed by Larry Page and Sergey Brin, that measures the importance of a web page based on the quantity and quality of links pointing to it. While Google no longer publishes public PageRank scores, it remains a core part of their ranking algorithm. Links from highly authoritative pages pass more PageRank than links from low-authority pages. Internal linking also distributes PageRank across your site.

R

Robots.txt

Robots.txt is a text file placed in the root directory of a website that gives instructions to search engine crawlers about which pages or sections of the site they can and cannot access. While it controls crawling, it does not prevent indexing — a page blocked by robots.txt can still appear in search results if other sites link to it. For true noindexing, use the noindex meta tag, and note that crawlers can only see a noindex tag on pages that robots.txt does not block. Incorrect robots.txt configurations can accidentally block entire sites from being crawled.
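A minimal robots.txt sketch (paths and URL are placeholders):

```text
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Allow: /blog/

Sitemap: https://example.com/sitemap.xml
```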

S

Schema Markup

Schema markup (also called structured data) is code added to a web page to help search engines understand the content's meaning and context. It uses the vocabulary from Schema.org and can be implemented in JSON-LD, Microdata, or RDFa format. Correct schema markup can enable rich results in SERPs, such as star ratings, FAQ dropdowns, breadcrumbs, product details, and event listings — improving CTR and visibility.
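A JSON-LD sketch for FAQ markup (the question and answer text are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is SEO?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Search engine optimization is the practice of improving a site's visibility in organic search results."
    }
  }]
}
</script>
```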

SERP

SERP stands for Search Engine Results Page — the page displayed by a search engine in response to a user's query. SERPs contain organic results, paid ads (Google Ads), and SERP features such as featured snippets, People Also Ask boxes, knowledge panels, image carousels, local packs, and video results. Understanding SERP features for your target keywords helps you optimize your content to capture more visibility.

Site Audit

A site audit is a comprehensive analysis of a website's technical SEO health, identifying issues that may prevent the site from ranking well in search engines. Common issues uncovered in a site audit include broken links, slow page speed, crawl errors, duplicate content, missing meta tags, thin content, redirect chains, and Core Web Vitals failures. Regular site audits are essential for maintaining SEO performance.

Sitemap

A sitemap is a file (typically XML format) that lists the URLs of a website, helping search engines discover and crawl all important pages efficiently. An XML sitemap can also include metadata like when a page was last modified and how frequently it changes. Sitemaps are especially useful for large websites, sites with complex navigation, or new sites with few external links. Submit your sitemap to Google Search Console to speed up indexing.
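A minimal XML sitemap sketch (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/post-title/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```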

SSL Certificate

An SSL (Secure Sockets Layer) certificate — today technically a TLS (Transport Layer Security) certificate — encrypts data transmitted between a user's browser and your web server, enabling the HTTPS protocol. Since 2014, Google has used HTTPS as a ranking signal. Sites without SSL show a 'Not Secure' warning in browsers, which can deter users and harm trust. Most modern hosting providers offer free SSL certificates via Let's Encrypt. Migrating from HTTP to HTTPS requires proper 301 redirects to preserve SEO value.
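One common way to implement the HTTP-to-HTTPS 301 redirect, sketched for an Apache server with mod_rewrite enabled (other servers use their own syntax):

```text
# .htaccess sketch: 301-redirect all HTTP requests to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```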

T

Title Tag

The title tag (<title>) is an HTML element that specifies the title of a web page. It appears as the clickable headline in search engine results and in browser tabs. The title tag is one of the most important on-page SEO elements. Best practices include keeping it under 60 characters, placing the primary keyword near the beginning, making it unique per page, and writing it to be compelling for users as well as descriptive for search engines.
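Following those practices, a title tag might look like this (keyword and brand are placeholders):

```html
<title>Title Tag Best Practices for SEO | Example Brand</title>
```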

U

URL Structure

URL structure refers to how web addresses are formatted and organized. A clean, descriptive URL structure benefits both users and search engines by making URLs predictable and readable. Best practices include using lowercase letters, hyphens (not underscores) to separate words, including the primary keyword, keeping URLs short and descriptive, and avoiding unnecessary parameters or session IDs. A logical URL structure (e.g., /blog/category/post-title) also helps establish site hierarchy.
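The contrast between a clean and a problematic URL can be sketched like this (both addresses are placeholders):

```text
Good: https://example.com/blog/seo/title-tag-best-practices
Poor: https://example.com/index.php?id=482&sessionid=7f3a
```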

Ready to put your SEO knowledge to work?

Start analyzing your website with Rank Crown — professional SEO tools for keyword research, backlink analysis, site audits, and rank tracking.