How to test the “Don’t block CSS and JS files by robots.txt” rule from Google
For years Google has recommended not blocking CSS and JS files with robots.txt. In Forecheck we implemented a check for this, and it lists every blocked CSS or JS file as an error in the section “Blocked by robots.txt”.
Only blocked CSS and JS files are reported as errors, so unblocking these files resolves the issue.
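As a minimal sketch of what unblocking can look like, here is a hypothetical robots.txt that keeps an assets directory disallowed while explicitly allowing the CSS and JS subdirectories (the paths /assets/, /assets/css/ and /assets/js/ are illustrative assumptions, not from any real site):

```
User-agent: Googlebot
# Hypothetical paths: allow the stylesheet and script folders...
Allow: /assets/css/
Allow: /assets/js/
# ...while the rest of the assets directory stays blocked.
Disallow: /assets/
```

Googlebot honors Allow directives and, when rules conflict, follows the most specific (longest) matching path, so the Allow lines above take precedence over the broader Disallow.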
In 2009 Matt Cutts from Google said in a video that he would personally not block CSS and JS files.
In 2012 Matt Cutts published another video on this topic, this time stressing the importance of letting Google crawl CSS and JS files.
Now an official announcement on the Google Webmaster Central blog states that blocking CSS and JS files may harm a site's rankings in Google's search results. Every webmaster should therefore check that all CSS and JS files are crawlable. Pay special attention to cascaded CSS files: a CSS file can reference further CSS files (for example via @import), and those must be crawlable as well. Forecheck crawls all CSS and JS files, including those referenced from within other CSS and JS files.
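Such a check can also be scripted. The sketch below uses Python's standard urllib.robotparser to test whether sample asset URLs are crawlable for Googlebot; the robots.txt content and the URLs are hypothetical stand-ins (in practice you would fetch the live https://yoursite/robots.txt and use the CSS and JS URLs your pages actually reference, including any cascaded imports):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; fetch your site's real file instead.
robots_txt = """\
User-agent: *
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical asset URLs to check, for illustration only.
urls = [
    "https://example.com/assets/style.css",
    "https://example.com/assets/app.js",
    "https://example.com/index.html",
]

for url in urls:
    # Googlebot falls back to the generic "*" group when no
    # dedicated "Googlebot" group exists in robots.txt.
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'BLOCKED'}")
```

With the sample rules above, both asset URLs under /assets/ would be reported as BLOCKED, which is exactly the condition Forecheck flags as an error. Note that Python's parser implements the classic robots.txt rules and may differ from Googlebot in edge cases such as wildcards.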