The e-mail explained that Google's spiders need full (or as near full as possible) access to a site for it to earn the ranking it deserves. Websites may be shooting themselves in the foot by continuing to block certain resources, as Google has demonstrated how differently its bots can view a page compared with users.
There will, however, always be skeptics in the crowd who need more assurance that they're not exposing their sites to anything negative. Google seems to have taken this concern seriously, and recently launched a new feature in Google Search Console.
This is Severe News
John Mueller first shared the news on Google+, just like anyone else would, to mixed reviews.
To keep the description short, we'll call this the new severity feature. Its function is simply to tell webmasters how important the resources they're blocking are to the accurate rendering of the website. The tool only gives webmasters a low, medium, or high warning; one word that gets Google's message across.
The operation of a fetch and render remains virtually the same.
- Enter the path of the page on your site that you want Google to fetch into the text box.
- Choose the Googlebot type you want to perform the crawl.
- Click either Fetch or Fetch and Render.
- Wait for the request to complete.
There are four different fetch errors that can occur, but Partial is the only one webmasters should concern themselves with in this case. A Partial error means that there were certain parts of the page Googlebot couldn't crawl because they were blocked. Simply open the request details page to see which resources are causing the error.
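As an illustration, blocks like these usually trace back to a site's robots.txt file. The paths below are hypothetical, but a rule of this shape is the kind of thing that produces a Partial result, because Googlebot is disallowed from fetching the stylesheets and scripts the page needs to render:

```
# Hypothetical robots.txt — blocking these directories hides
# rendering resources (CSS/JS) from Googlebot:
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/

# Unblocking them is as simple as removing the Disallow lines,
# or explicitly allowing the crawler through:
# Allow: /assets/
```

Googlebot honors Allow directives as well as Disallow, so you can open up just the rendering resources without exposing everything else in a directory.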
What Low Looks Like
Here's the example John Mueller used to demonstrate how Googlebot sees a website versus how a user would.
In this example, the blocked resource is an image of the German flag. It’s a negligible difference, which would understandably return a low severity warning.
How Do You Get High?
The goal for a webmaster is to remove the blocks on any items labelled as low severity. It's still unclear what parameters Google uses to classify something as medium or high severity; most of the examples of the new feature posted so far have all returned low. If your site managed to fetch and render a resource that registered as medium or high, we'd love to test it and see why it returned such a result.
In fact, this isn't the only question bothering SEO companies about the severity column. Quite a few parts remain unclear. What do webmasters do if a resource does return as medium or high? Do they keep it blocked, or is there a procedure they can follow to lower the severity level? What does the severity level even mean? Do high and medium resources pose any danger to the health of the website? What if the resources are blocked externally by third-party providers? Should advertisement scripts get the same treatment as everything else?
The classification of these warnings may raise more questions than it settles regarding unblocking resources. We'll keep our eyes open for any additional news on this issue, and hopefully shed some light on a few of these concerns.
If you have any information that would help us learn more about the severity feature, contact us as soon as you can. We want to get as many brains in on this as possible, and help other people having trouble with their Google Search Console accounts.