How to Add or Edit robots.txt in Shopify: A Comprehensive Guide
Understanding robots.txt: Your SEO Gatekeeper
The robots.txt file serves as a critical directive for search engine crawlers (like Googlebot), specifying which areas of your Shopify store they can access. Proper configuration:
- Prevents indexing of sensitive areas (admin, carts)
- Conserves crawl budget for important pages
- Blocks duplicate or low-value content
- Directs crawlers to your sitemap
Step-by-Step: Editing robots.txt in Shopify
- Access Shopify Admin: Log into your store dashboard
- Open the Code Editor: Go to Online Store → Themes → Edit code
- Locate robots.txt: In the Templates directory, find robots.txt.liquid; if it doesn't exist yet, click Add a new template and select robots (a sketch of the default template follows these steps)
- Edit Carefully: Modify directives following Shopify's syntax rules
- Save & Verify: Click Save, then check the live file at yourstore.com/robots.txt
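Opening robots.txt.liquid reveals Liquid code rather than plain directives: Shopify renders its default rules through template objects. The sketch below reflects the default template as documented by Shopify at the time of writing; verify against your own file, since Shopify may update it:
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {% endif %}
{% endfor %}
Each group represents one User-agent block, group.rules yields its Allow/Disallow lines, and group.sitemap emits the Sitemap reference. All customizations are made by adding Liquid to this template, as the scenarios later in this guide show.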
Default Shopify robots.txt Configuration
Rendered, the template produces output like the following (an abbreviated excerpt; check yourstore.com/robots.txt for the exact live defaults, which also cover additional bots and query parameters):
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /account
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
Disallow: /*/collections/*filter*
Disallow: /*/blogs/*filter*
Disallow: /collections/*filter*
Allow: /collections/*.json
Allow: /blogs/*.json
Sitemap: https://your-store.com/sitemap.xml
Practical Customization Examples
Scenario 1: Block a test collection
Disallow: /collections/test-products
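Because Shopify generates robots.txt from the Liquid template, this rule is added inside robots.txt.liquid rather than typed into the rendered file. A minimal sketch following Shopify's documented pattern for adding a rule to the default user-agent group (test-products is simply the example collection handle from above):
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {% comment %} Custom rule, emitted only for the catch-all (*) group {% endcomment %}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /collections/test-products' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {% endif %}
{% endfor %}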
Scenario 2: Allow Googlebot exclusive access to a special directory
User-agent: Googlebot
Allow: /special-offers/
User-agent: *
Disallow: /special-offers/
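In the template, the catch-all Disallow uses the same in-loop conditional as Scenario 1, while the Googlebot group can be appended as plain text after the loop, since anything outside Liquid tags is output verbatim. A sketch, assuming the hypothetical /special-offers/ path:
{% comment %} Inside the default for loop, after the rules loop: {% endcomment %}
{%- if group.user_agent.value == '*' %}
  {{ 'Disallow: /special-offers/' }}
{%- endif %}

{% comment %} After the loop's closing endfor, as plain text: {% endcomment %}
User-agent: Googlebot
Allow: /special-offers/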
Essential robots.txt FAQs
1. Can I remove Shopify's default rules?
You generally shouldn't. Core protections like Disallow: /admin exist for good reason, and Shopify recommends keeping all defaults intact. Technically, though, robots.txt.liquid gives you full control, so a default rule can be filtered out, as the sketch below shows.
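Each rule object exposes a directive and a value, so the inner rules loop can skip a specific default. A sketch based on Shopify's documented removal pattern (here dropping the default Disallow: /policies/ rule purely for illustration):
{% comment %} Replace the inner rules loop with a filtered version: {% endcomment %}
{%- for rule in group.rules -%}
  {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' -%}
    {{ rule }}
  {%- endunless -%}
{%- endfor -%}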
2. Why aren't my changes visible immediately?
Shopify serves robots.txt through global CDNs, so changes typically propagate within 5-15 minutes. Clear your cache or append a throwaway query string like ?v=1 to the URL to bypass cached copies.
3. How do I completely block a page from indexing?
Combine methods:
- robots.txt for crawl control
- <meta name="robots" content="noindex"> in the page HTML (see the theme.liquid sketch below)
- Password-protect sensitive pages
Keep in mind that crawlers can only see a noindex tag on pages they're allowed to fetch, so don't block a URL in robots.txt if you're relying on its meta tag to deindex it.
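In a Shopify theme, the noindex tag is usually added conditionally in the <head> of layout/theme.liquid. A minimal sketch (targeting the search template here is only an example; adjust the condition to the pages you want hidden):
{% comment %} In layout/theme.liquid, inside <head>: {% endcomment %}
{% if template contains 'search' %}
  <meta name="robots" content="noindex">
{% endif %}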
4. Can I add multiple sitemaps?
Yes! Add separate lines:
Sitemap: https://store.com/sitemap.xml
Sitemap: https://store.com/blog_sitemap.xml
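On Shopify these lines also go through the template: plain text placed after the default loop's closing {% endfor %} is emitted as-is, so extra sitemap references can be appended there. A sketch (blog_sitemap.xml is just the hypothetical URL from the example, and the file must actually exist at that address):
{% comment %} After the closing endfor of the default loop: {% endcomment %}
Sitemap: https://store.com/blog_sitemap.xml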
Proven Best Practices
- Test with Google Search Console: Use the robots.txt report (which replaced the standalone robots.txt Tester) to confirm Google fetches and parses your file correctly
- Version Control: Maintain change history in a text document
- Combine with SEO Tactics: Return 404/410 status codes or set up 301 redirects for removed pages rather than merely blocking them
- Periodic Audit: Review directives every 6-12 months
Advanced Tactics
Crawl-Delay Directive: Manage server load from aggressive crawlers (note that Googlebot ignores Crawl-delay, though some crawlers such as Bingbot honor it):
User-agent: *
Crawl-delay: 10
Wildcard Handling: Block all PDFs in a directory (the trailing $ anchors the match to the end of the URL):
Disallow: /downloads/*.pdf$
Final Recommendations
While Shopify's default robots.txt provides a solid foundation, strategic customizations can meaningfully improve crawl efficiency and SEO performance. Remember:
- Always test changes in development stores first
- Monitor crawl errors in Google Search Console
- Complement robots.txt with proper meta directives
- Document all modifications for future reference