By: Todd Withrow | August 3rd, 2015
On the morning of July 28th, 2015, webmasters all over the world found their email inboxes full of fresh warnings from Google Webmasters. As a top SEO agency, we welcome notices like these from Google because they offer rare insight into Google’s proprietary algorithm for determining page rankings.
Entitled “Googlebot cannot access CSS and JS files,” the messages to SEO agencies led with this rather cryptic statement:
Our SEO team thought something might be going on with our servers, but once we started looking around, we realized we weren’t alone. Like other SEO agencies and webmasters, we scrambled to understand what the warning meant and why we were suddenly receiving it, so we could keep our clients in the best possible position. We got clarity fairly quickly, and below we’re passing along what we shared with our SEO clients.
So what happened, exactly?
Googlebot now renders pages much like a browser does, so it needs access to the CSS and JavaScript files a page uses. When a site’s robots.txt blocks those resources, Google can’t render the page fully, which can hurt how the page is indexed and ranked. Google began sending these warnings to alert site owners whose robots.txt files were blocking those resources.
How do we fix it on our site?
The first thing to do is jump into Google Search Console (formerly Google Webmaster Tools) and check the Google Index → Blocked Resources report. Alternately, you can run your pages through Crawl → Fetch as Google.
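If you prefer to spot-check from the command line rather than the Search Console UI, Python’s standard-library `robotparser` can tell you whether a given robots.txt would block Googlebot from a resource. This is a minimal sketch, assuming a hypothetical WordPress-style robots.txt that disallows `/wp-includes/`; swap in your own site’s rules and URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content — replace with your site's actual rules
# (in practice you would fetch https://yoursite.com/robots.txt)
rules = """
User-agent: *
Disallow: /wp-includes/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A JS file under the disallowed path is blocked for Googlebot
print(rp.can_fetch("Googlebot",
                   "https://example.com/wp-includes/js/jquery/jquery.js"))

# A stylesheet outside the disallowed path is reachable
print(rp.can_fetch("Googlebot", "https://example.com/style.css"))
```

This only checks robots.txt rules, not how Google actually renders the page, so treat it as a quick triage step before confirming in Fetch as Google.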
Legacy settings in some CMS platforms like WordPress, and in certain WordPress plugins, may block these resources by default, so look closely at the flagged resources along with your robots.txt file and clean up any identified problems. Re-run the test to ensure the problems have been cleared and you’re good to go.
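As an illustration of the kind of cleanup involved, here is a hypothetical before-and-after robots.txt for a WordPress site. Older WordPress setups often shipped rules like the “before” block; the “after” block explicitly allows CSS and JS using the wildcard (`*`) and end-anchor (`$`) patterns that Googlebot supports. Adjust the paths to match your own site:

```
# Before: legacy rules that also block rendering assets
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/

# After: keep the directories restricted, but let crawlers reach CSS and JS
User-agent: *
Allow: /*.css$
Allow: /*.js$
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
```

Note that `*` and `$` are pattern extensions honored by Googlebot but not guaranteed by every crawler, so verify the result in Fetch as Google after making changes.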
If you or your webmaster received this message, we and other top SEO agencies encourage you to take action and make sure the team in charge of your website resolves it as soon as possible.
As always, if you don’t feel comfortable tinkering with files on your server, let a professional SEO team like ours at NicheLabs take care of it for you.