How to Block Your Site and Individual Pages from Search Engine Crawlers: A Complete Guide
Web crawlers are automated bots that scan and index website content for search engines. While most sites want to be found in search results, you may want to hide your site or specific pages from search engines while keeping them accessible to direct visitors.
Block Your Entire Site from Search Results:
- Go to Settings > Crawlers
- Check "Block search engine crawlers"
This adds a rule to your robots.txt file that asks search engines not to crawl (and therefore not index) your site. Keep in mind that robots.txt is advisory: reputable crawlers honor it, but it doesn't enforce access control.
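The exact rule varies by platform, but a site-wide block in robots.txt generally looks like this (a sketch, not your site's literal file):

```text
# robots.txt — asks all crawlers to stay out of the entire site
User-agent: *
Disallow: /
```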
Hide Specific Pages from Search Results:
You have two options:
Using Page Settings (Recommended)
- Open Pages panel
- Click settings (gear icon) for the page
- Go to SEO tab
- Enable "Hide page from search results"
Using Code Injection
- Add this code to the page header:
<meta name="robots" content="noindex" />
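In context, the tag belongs inside the page's head; a minimal sketch of where header code injection places it:

```html
<head>
  <!-- Tells compliant crawlers not to index this page -->
  <meta name="robots" content="noindex" />
</head>
```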
Important Notes:
- Hiding a collection page also hides all items within it (products, blog posts, etc.)
- Individual collection items cannot be hidden separately
- Hiding an index page doesn't hide its sub-pages
- These methods only affect external search engines, not site search
- Pages hidden via Page Settings are removed from the sitemap
- Code injection method keeps pages in the sitemap
- Page Settings method isn't available for homepages
Remember that while these methods hide content from search engines, the pages remain accessible to anyone with a direct link. For complete privacy, consider setting up password protection or member areas.
These changes can take days or even weeks to be reflected in search results, because search engines need time to recrawl and reindex your site.
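To confirm either method took effect on a live page, you can fetch your robots.txt and the page's HTML and search for the directives. This is a sketch using standard curl and grep; example.com is a placeholder for your own domain:

```shell
# Check for a site-wide block in robots.txt (placeholder domain).
curl -s https://example.com/robots.txt | grep -i 'disallow: /'

# Check a specific page's HTML for the noindex meta tag.
curl -s https://example.com/hidden-page | grep -io 'name="robots"[^>]*noindex'
```

If the second command prints a match, the noindex directive is in place; an empty result means crawlers will still index the page.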