How to test the “Don’t block CSS and JS files by robots.txt” rule from Google

For years Google has recommended not blocking CSS and JS files with robots.txt. Forecheck implements a check for this recommendation and lists every blocked CSS or JS file as an error in the section “Blocked by robots.txt”.

Blocked CSS and JS file errors

Only blocked CSS and JS files are flagged as errors, so unblocking these files resolves the issue.
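
As a minimal sketch, assume the stylesheets and scripts live under hypothetical /styles/ and /scripts/ directories that robots.txt currently disallows. Removing those Disallow lines makes the files crawlable again:

    # Before: stylesheets and scripts are blocked for every crawler
    User-agent: *
    Disallow: /styles/
    Disallow: /scripts/

    # After: the asset directories are no longer disallowed;
    # only genuinely private content stays blocked
    User-agent: *
    Disallow: /private/

Googlebot also understands wildcard patterns such as Allow: /*.css$ and Allow: /*.js$, which can re-allow assets inside a directory that must stay disallowed for other reasons; Google's robots.txt testing tool shows how such rules are resolved.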

In 2009, Matt Cutts from Google said in a video that he would personally not block CSS and JS files:

In 2012, Matt Cutts published another video on the topic, this time stressing the importance of letting Google crawl CSS and JS files:

Now an official announcement on the Google Webmaster Central blog states that blocking CSS and JS files may harm a site's rankings in Google's search results. Every webmaster should check that all CSS and JS files are crawlable. Pay particular attention to nested CSS: a CSS file can reference other CSS files (for example via @import), and those files can be blocked as well. Forecheck therefore also crawls CSS and JS files that are referenced from within other CSS and JS files.
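
As a quick do-it-yourself check (a minimal sketch, independent of Forecheck), Python's standard-library robots.txt parser can evaluate a live robots.txt against a list of asset URLs. The domain and file paths below are placeholders, and the standard-library parser does not implement every Google extension (such as wildcards), so treat the result as a hint rather than a verdict:

    # Minimal sketch: ask a site's robots.txt whether Googlebot may fetch
    # specific CSS/JS URLs. The URLs below are hypothetical placeholders.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()  # download and parse the live robots.txt

    assets = [
        "https://www.example.com/styles/main.css",
        "https://www.example.com/scripts/app.js",
    ]

    for url in assets:
        status = "ok" if rp.can_fetch("Googlebot", url) else "BLOCKED"
        print(status, url)

Any URL reported as BLOCKED is worth re-checking in Google's own robots.txt testing tool, which applies Google's full rule-matching behavior.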

About Thomas Kaiser

Thomas Kaiser, founder and CEO of Forecheck LLC and cyberpromote GmbH, launched his first company at 23. He developed the first MPEG-2 video coder for Windows at the Technical University of Munich. In 1997 he invented “RankIt!!”, the first SEO software program in Germany. He has also written several books and is a sought-after speaker at SEO conferences and events. He loves playing guitar, enjoys his 5 kids and has drunk SEO milk since birth. You can write him at thomas /at/ forecheck.com.