Move URL parameters to static URLs

Posted: Thu Jan 23, 2025 5:34 am
by kexej28769@nongnue
Block crawlers with Disallow
URL parameters used for sorting and filtering can create a practically unlimited number of URLs with non-unique content. You can block crawlers from accessing these sections of your website using the Disallow directive.

Blocking crawlers, such as Googlebot, from crawling parametric duplicate content means controlling what they can access on your website via your robots.txt file. Bots check robots.txt before crawling a website, so it is a great starting point for optimizing parametric URLs.

The following robots.txt file disallows access to all URLs containing a question mark:

User-agent: *
Disallow: /*?

This Disallow rule prevents all URL parameters from being crawled by search engines. Before choosing this option, make sure that no other part of your URL structure uses parameters; otherwise those URLs will also be blocked.

You may need to crawl your site to find all URLs containing a question mark (?).
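
As a rough illustration, here is a minimal sketch of such a crawl using only the Python standard library. The starting URL, the same-host restriction, and the page limit are assumptions for the example, not part of the original advice.

import urllib.request
import urllib.parse
from html.parser import HTMLParser
from collections import deque

START_URL = "https://www.example.com/"  # assumption: replace with your own site

class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_parametric_urls(start_url, limit=200):
    """Breadth-first crawl of one host, returning URLs that contain '?'."""
    host = urllib.parse.urlparse(start_url).netloc
    seen, queue, parametric = set(), deque([start_url]), set()
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="ignore")
        except Exception:
            continue
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urllib.parse.urljoin(url, href)
            if urllib.parse.urlparse(absolute).netloc != host:
                continue  # stay on the same host
            if "?" in absolute:
                parametric.add(absolute)
            queue.append(absolute)
    return sorted(parametric)

if __name__ == "__main__":
    for url in find_parametric_urls(START_URL):
        print(url)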

This falls into the larger discussion of dynamic vs. static URLs. Rewriting dynamic pages as static improves the URL structure of your website.

However, especially if parametric URLs are currently indexed, you should spend time not only rewriting the URLs, but also redirecting these pages to their new corresponding static locations.
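
As one hedged illustration of that redirect step, here is a minimal Flask sketch. The parameter names, the static path scheme, and the choice of Flask itself are assumptions for the example, not a definitive implementation.

from flask import Flask, request, redirect

app = Flask(__name__)

# Assumed mapping from old parametric URLs to their new static locations.
STATIC_PATHS = {
    ("category", "shoes"): "/products/shoes/",
    ("category", "hats"): "/products/hats/",
}

@app.route("/products")
def products():
    # Permanently (301) redirect known parametric URLs to their static equivalents.
    category = request.args.get("category")
    target = STATIC_PATHS.get(("category", category))
    if target:
        return redirect(target, code=301)
    return "Product listing", 200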

Google developers also suggest:

remove unnecessary parameters, but keep a dynamic-looking URL (see the sketch after this list)
create static content equivalent to the original dynamic content
limit dynamic/static rewrites to those that remove unnecessary parameters
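
As a rough sketch of the first suggestion, the snippet below strips parameters that are assumed to be unnecessary (session and tracking tags in this example) while keeping the URL dynamic-looking. The parameter names are assumptions, not a definitive list.

from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Assumed set of parameters that do not affect page content.
UNNECESSARY_PARAMS = {"sessionid", "utm_source", "utm_medium", "ref"}

def strip_unnecessary_params(url):
    """Remove assumed-unnecessary parameters, keeping the rest of the query string."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in UNNECESSARY_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_unnecessary_params("https://www.example.com/products?category=shoes&sessionid=abc123"))
# -> https://www.example.com/products?category=shoes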