How to Allow Minifetch to Fetch Your Pages

This tutorial is for site owners. Minifetch checks every website's robots.txt before fetching its pages and respects whatever rules are set. If your site blocks all bots by default — which is common — users won't be able to fetch your pages through Minifetch. This guide shows you how to selectively allow Minifetch while keeping other bots blocked, and how to verify it's working.

Building an AI agent? This tutorial is also available as a skill your AI agent can load directly: minifetch.com/skills/unblock-minifetch/SKILL.md

  1. How Minifetch identifies itself

    Minifetch sends the following user-agent string with every request:

    minifetch/1.0 (+https://minifetch.com/site-owner-faq)

    Minifetch matches on the token minifetch, so any User-agent directive whose value contains minifetch (case-insensitive) applies to it.
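An illustrative sketch of the matching rule above (not Minifetch's actual implementation): a User-agent value applies to Minifetch whenever it contains the token minifetch, compared case-insensitively.

```javascript
// A User-agent group applies to Minifetch if its value contains the
// token "minifetch", case-insensitively.
function appliesToMinifetch(userAgentValue) {
  return userAgentValue.toLowerCase().includes("minifetch");
}

appliesToMinifetch("minifetch");     // true
appliesToMinifetch("Minifetch/1.0"); // true
appliesToMinifetch("googlebot");     // false
```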

    If your robots.txt is missing (404) or returns a server error, Minifetch defaults to allowing access. The exceptions are status codes 403, 418, and 429: any of these causes Minifetch to treat the entire site as blocked.
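The fallback behavior can be summarized as a small decision function. This is a sketch inferred from the rules in this guide, not Minifetch's source; the function name is an assumption.

```javascript
// Decide what to do based on the HTTP status returned when fetching robots.txt.
function robotsPolicy(statusCode) {
  if ([403, 418, 429].includes(statusCode)) return "site-blocked"; // treated as a refusal
  if (statusCode >= 200 && statusCode < 300) return "parse-rules"; // read robots.txt normally
  return "default-allow"; // missing (404) or other errors: fetching is allowed
}
```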

  2. Allow Minifetch while blocking all other bots

    Add the following to your robots.txt. The order of the groups doesn't matter; Minifetch's parser applies the most specific matching user-agent group:

    User-agent: minifetch
    Allow: /
    
    User-agent: *
    Disallow: /

    This explicitly grants Minifetch access to all pages while blocking every other crawler.

  3. Allow Minifetch on specific paths only

    To restrict Minifetch to certain sections of your site, use path-level rules:

    User-agent: minifetch
    Allow: /blog/
    Allow: /products/
    Disallow: /
    
    User-agent: *
    Disallow: /

  4. Set a crawl delay

    If you want to allow Minifetch but limit how frequently it can fetch pages, add a Crawl-delay directive (value in seconds):

    User-agent: minifetch
    Allow: /
    Crawl-delay: 10

    Minifetch strictly observes crawl delays. Without one set, it defaults to 1 second between requests to your domain.
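To make the crawl-delay semantics concrete, here is an illustrative sketch of how a client could honor the directive: given the time of the last request to a domain, compute how long to wait before the next one. The function name and shape are assumptions for this example, not Minifetch internals.

```javascript
// Milliseconds to wait before the next request to the same domain,
// given the time of the last request and the Crawl-delay value in seconds.
function msUntilNextAllowed(lastRequestMs, nowMs, crawlDelaySeconds) {
  return Math.max(0, lastRequestMs + crawlDelaySeconds * 1000 - nowMs);
}

// With Crawl-delay: 10, a request made 4 s after the previous one
// must still wait 6 more seconds.
msUntilNextAllowed(0, 4000, 10); // 6000
```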

  5. Block Minifetch entirely

    To block Minifetch along with all other bots:

    User-agent: *
    Disallow: /

    Or to block Minifetch specifically while keeping other bots allowed:

    User-agent: minifetch
    Disallow: /
    
    User-agent: *
    Allow: /

  6. Verify your robots.txt is working

    After updating your robots.txt, use the free preflight endpoint to confirm Minifetch can fetch your pages correctly:

    From the minifetch.com homepage, enter your URL in the search box and click "Check URL".

    Or with the Minifetch API client:

    await client.preflightCheck("https://yoursite.com/your-page");

    Or from the command line:

    curl "https://minifetch.com/api/v1/free/preflight/url-check?url=https://yoursite.com/your-page"

    A successful allow response looks like this:

    {
      "success": true,
      "results": [
        {
          "data": {
            "url": "https://yoursite.com/your-page",
            "allowed": true,
            "crawlDelay": 1
          }
        }
      ]
    }

    If allowed is still false after updating, confirm your robots.txt is accessible at https://yoursite.com/robots.txt and has been re-deployed. Minifetch caches robots.txt for 24 hours, so changes may take up to a day to propagate.