Night Night Parameters


Google announced yesterday that they are sunsetting the URL Parameters tool. The tool was originally launched to deal with problematic CMSs and poorly coded sites generating thousands upon thousands of effectively duplicate pages for the purposes of filtering, sorting and configuring products on e-commerce websites. The same thing happens on dynamically generated websites such as gaming sites, live blogs and news sites.

I’ve recently been working with a large e-commerce retailer who had an issue that fell squarely into this category. Every filter applied to a page appended an additional parameter string to the end of the URL. Because there were so many filters on the site, the combinations scaled almost infinitely, into the tens of millions of URLs, which Google had identified in Search Console and which had been blocked out with the URL Parameters tool.
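To put rough numbers on that kind of combinatorial growth, here is a minimal sketch; the filter counts are hypothetical, not the retailer’s actual figures:

```python
# Purely illustrative: 10 independent filters, each with 5 possible values.
# Every filter is either absent or set to one value, so each filter multiplies
# the number of possible URLs by 6.
filters = 10
values_per_filter = 5

total_urls = (values_per_filter + 1) ** filters - 1  # exclude the unfiltered page
print(f"{total_urls:,} distinct parameterised URLs")  # 60,466,175 -- tens of millions
```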

With the sunsetting of the URL Parameters tool, we now need another way of dealing with this issue. Crawl budget is still at a premium, and as much as Google say they are great at sorting this out themselves, experience and evidence suggest that’s not necessarily so, especially if they still try to crawl many, many parameterised pages.

I’d encourage you to look at your websites and investigate whether there are instances where you are using ‘?’ parameters, ‘&’ parameters or any other parameters that really shouldn’t be crawled or indexed by Google. If so, investigate further and see whether you can disallow these with a robots.txt instruction; otherwise Google will waste a lot of your crawl budget on irrelevant pages. A great way to stress test this, if you’re more serious about it, is to look at your log files. These will give you a very good indication of where Google is actually crawling; if they are not really looking at parameterised pages, you can rest assured that you’re probably fine. I’d also encourage you to read Google’s best practice guidance on how to implement parameters. Rough sketches of both the robots.txt rule and the log-file check follow below.
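As an illustration of the robots.txt approach, something along these lines could work; the parameter names are purely hypothetical, and you should check carefully that you are not blocking parameters that drive pages you actually want crawled:

```
# Hypothetical robots.txt sketch -- swap in your own parameter names.
User-agent: *
Disallow: /*?*sort=
Disallow: /*?*colour=
Disallow: /*?*size=
```

And as a starting point for the log-file check, a minimal sketch like the one below (the log path, the combined log format and the simple Googlebot user-agent match are all assumptions about your setup) will tell you roughly how often Google is hitting parameterised URLs versus clean ones:

```python
# A minimal sketch, assuming an Apache/Nginx "combined" access log format.
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path -- point this at your own logs

# The quoted request field looks like: "GET /shoes?colour=red HTTP/1.1"
request_re = re.compile(r'"[A-Z]+ (\S+) HTTP/[\d.]+"')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:  # crude bot filter; verify by reverse DNS for rigour
            continue
        match = request_re.search(line)
        if not match:
            continue
        url = match.group(1)
        counts["parameterised" if "?" in url else "clean"] += 1

print(counts)  # e.g. Counter({'clean': 9120, 'parameterised': 404})
```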

Any questions, just let me know.