How to Fix “Issues With Blocked Internal Resources in Robots.txt” from Semrush Audit


When conducting a website audit using tools like Semrush, you might encounter the issue of “Blocked Internal Resources in Robots.txt.” This problem occurs when essential resources, such as CSS, JavaScript, or images, are blocked by the robots.txt file, preventing search engine bots from accessing them. 

Why It’s Important to Fix Blocked Internal Resources

Fixing blocked internal resources is crucial because these resources play a significant role in the visual and functional rendering of your website. When search engine bots cannot access these files, they might misinterpret the content or functionality of your site, leading to poor indexing and ranking. Ensuring these resources are accessible helps search engines better understand and rank your site, improving your visibility and performance in search results.

How to Fix “Issues With Blocked Internal Resources in Robots.txt” Detected by a Semrush Audit


Follow these steps to resolve the issue of blocked internal resources:

Step 1: Identify the Blocked Resources

  1. Run a Site Audit: Start by running Semrush’s Site Audit tool on your project.
  2. Locate the Issue: In the audit report, find the warning about blocked internal resources; it lists the specific resources that robots.txt is blocking.

Step 2: Analyze Your robots.txt File

  1. Access the robots.txt File: Navigate to yourwebsite.com/robots.txt to view the file.
  2. Review the Disallow Directives: Look for any Disallow rules that may be blocking important resources, such as your CSS, JavaScript, or image directories (a short script for listing these rules is sketched below the example). For example:

User-agent: *
Disallow: /css/
Disallow: /js/
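
If you prefer to review the rules from the command line, the short Python 3 sketch below (standard library only; the domain is a placeholder for your own site) downloads robots.txt and prints just its rule lines so blocked paths are easy to spot:

import urllib.request

# Placeholder domain - substitute your own site.
ROBOTS_URL = "https://yourwebsite.com/robots.txt"

with urllib.request.urlopen(ROBOTS_URL) as response:
    robots_txt = response.read().decode("utf-8", errors="replace")

# Print only the directive lines (User-agent, Disallow, Allow).
for line in robots_txt.splitlines():
    stripped = line.strip()
    if stripped.lower().startswith(("user-agent:", "disallow:", "allow:")):
        print(stripped)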

Step 3: Modify the robots.txt File

  • Allow Essential Resources: Edit the robots.txt file so that search engine bots can access the necessary resources, either by removing the overly broad Disallow rules or by adding more specific Allow rules (when an Allow and a Disallow rule both match a URL, Google applies the most specific rule, and Allow wins a tie). For instance:

User-agent: *
Allow: /css/
Allow: /js/

  • Ensure Proper Syntax: Make sure there are no typos or syntax errors in the robots.txt file, and remember that the paths in Allow and Disallow rules are case-sensitive. A quick way to sanity-check your draft rules before uploading them is sketched after this list.
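
Before you upload the edited file, you can run your draft rules through Python’s built-in urllib.robotparser and confirm that representative resource URLs come back as allowed. The rules and URLs below are placeholders based on the example above; note that this parser resolves conflicting Allow and Disallow rules by their order in the file rather than by Google’s longest-match rule, so treat it as a rough sanity check rather than a substitute for Google’s own tools.

from urllib.robotparser import RobotFileParser

# Draft rules to validate - paste the contents of your edited robots.txt here.
draft_rules = """\
User-agent: *
Allow: /css/
Allow: /js/
""".splitlines()

parser = RobotFileParser()
parser.parse(draft_rules)

# Placeholder URLs - swap in real CSS and JavaScript files from your site.
for url in ("https://yourwebsite.com/css/main.css",
            "https://yourwebsite.com/js/app.js"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(f"{url}: {verdict}")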

Step 4: Test the Changes

  • Use Google Search Console: Open the robots.txt report in Google Search Console (it replaced the older robots.txt Tester tool) and confirm that Google has fetched your updated robots.txt file and that it parses without errors.
  • Check Access to Resources: Use the URL Inspection tool in Google Search Console (the successor to “Fetch as Google”) and run a live test to verify that Googlebot can access your pages along with their CSS, JavaScript, and other resources; a command-line check of the live robots.txt file is sketched after this list.
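
Once the updated file is live, you can also point urllib.robotparser at the deployed robots.txt and re-check the URLs that Semrush flagged. As above, the domain and resource URLs are placeholders, and the script is a rough sanity check rather than a reproduction of Googlebot’s exact behavior.

from urllib.robotparser import RobotFileParser

# Placeholder domain - substitute your own site.
parser = RobotFileParser()
parser.set_url("https://yourwebsite.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt file

# Placeholder resource URLs - use the ones Semrush flagged as blocked.
flagged_resources = [
    "https://yourwebsite.com/css/main.css",
    "https://yourwebsite.com/js/app.js",
]

for url in flagged_resources:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "still blocked"
    print(f"{url}: {verdict}")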

Step 5: Re-run the Semrush Audit

  • Verify the Fix: After making the changes, run a new site audit in Semrush to confirm that the issue has been resolved.
  • Monitor Regularly: Keep an eye on future audits and updates to ensure that no new issues arise.


 

By ensuring that essential internal resources are not blocked by your robots.txt file, you help search engine bots accurately render and understand your website. This improves your site’s indexing and ranking, ultimately enhancing your online visibility and user experience. Regular audits and updates to your robots.txt file are key to maintaining optimal search engine performance. If the steps above don’t resolve the issue, feel free to contact our team for assistance.
