Robots.txt Generator API: The Complete Guide

Need to control how search engines crawl your site? This guide covers everything you need to know about generating robots.txt files via API, including directives, user-agent rules, and implementation examples.

What is Robots.txt?

Robots.txt is a plain-text file placed in a website's root directory that tells web crawlers which pages or sections of the site they should not crawl. It follows the Robots Exclusion Protocol and is respected by most major search engines. Note that it controls crawling, not indexing: a page blocked by robots.txt can still appear in search results if other sites link to it.

A robots.txt file is always located at: https://example.com/robots.txt

Robots.txt Directives

The main directives used in robots.txt are:

| Directive | Description | Example |
| --- | --- | --- |
| User-agent | Specifies which crawler the rules apply to | User-agent: Googlebot |
| Disallow | Blocks access to specified paths | Disallow: /admin/ |
| Allow | Explicitly allows access (overrides Disallow) | Allow: /public/ |
| Sitemap | Points to your XML sitemap (use an absolute URL) | Sitemap: https://example.com/sitemap.xml |
| Crawl-delay | Sets a delay between requests (honored by some bots, ignored by Google) | Crawl-delay: 10 |

Pro Tip: Use User-agent: * to apply rules to all crawlers, or specify individual bots for more granular control.
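
Putting these directives together, a complete robots.txt file might look like this (the paths and sitemap URL are illustrative):

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10

User-agent: Googlebot
Allow: /public/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

Each User-agent block applies independently: a crawler uses the most specific block that matches it, so Googlebot follows its own rules here rather than the wildcard block.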

Using the Robots.txt API

TinyFn provides an endpoint to generate robots.txt files:

API Request
POST https://api.tinyfn.io/v1/generate/robots
Headers: X-API-Key: your-api-key
Content-Type: application/json

{
  "rules": [
    {"user_agent": "*", "disallow": ["/admin/", "/private/"]},
    {"user_agent": "Googlebot", "allow": ["/"], "disallow": ["/tmp/"]}
  ],
  "sitemap": "https://example.com/sitemap.xml"
}
Response
{
  "robots_txt": "User-agent: *\nDisallow: /admin/\nDisallow: /private/\n\nUser-agent: Googlebot\nAllow: /\nDisallow: /tmp/\n\nSitemap: https://example.com/sitemap.xml"
}

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| rules | array | Array of rule objects (required) |
| rules[].user_agent | string | User agent to apply rules to (required) |
| rules[].allow | array | Paths to allow crawling |
| rules[].disallow | array | Paths to block from crawling |
| sitemap | string | URL of your sitemap |
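
If you want to preview the output shape without calling the API, the mapping from these parameters to a robots.txt body is straightforward. Here is a minimal local sketch (not the service's actual implementation; the field names simply follow the table above):

```python
def build_robots_txt(rules, sitemap=None):
    """Render a robots.txt body from rule dicts shaped like the API parameters."""
    blocks = []
    for rule in rules:
        lines = [f"User-agent: {rule['user_agent']}"]
        lines += [f"Allow: {path}" for path in rule.get("allow", [])]
        lines += [f"Disallow: {path}" for path in rule.get("disallow", [])]
        blocks.append("\n".join(lines))
    if sitemap:
        blocks.append(f"Sitemap: {sitemap}")
    # Blank line between blocks, trailing newline at end of file
    return "\n\n".join(blocks) + "\n"

print(build_robots_txt(
    [{"user_agent": "*", "disallow": ["/admin/", "/private/"]}],
    sitemap="https://example.com/sitemap.xml",
))
```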

Code Examples

JavaScript / Node.js

import { writeFile } from 'node:fs/promises';

const response = await fetch('https://api.tinyfn.io/v1/generate/robots', {
  method: 'POST',
  headers: {
    'X-API-Key': 'your-api-key',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    rules: [
      { user_agent: '*', disallow: ['/api/', '/admin/', '/_next/'] }
    ],
    sitemap: 'https://mysite.com/sitemap.xml'
  })
});
const { robots_txt } = await response.json();
await writeFile('robots.txt', robots_txt); // deploy to your site's root

Python

import requests

response = requests.post(
    'https://api.tinyfn.io/v1/generate/robots',
    json={
        'rules': [
            {'user_agent': '*', 'disallow': ['/private/']},
            {'user_agent': 'Googlebot', 'allow': ['/public/']}
        ],
        'sitemap': 'https://mysite.com/sitemap.xml'
    },
    headers={'X-API-Key': 'your-api-key'}
)
response.raise_for_status()
data = response.json()
with open('robots.txt', 'w') as f:
    f.write(data['robots_txt'])

Common Patterns

  • Block all bots: User-agent: * with Disallow: /
  • Allow all: User-agent: * with empty Disallow:
  • Block admin areas: Disallow /admin/, /dashboard/, /wp-admin/
  • Block search results: Disallow /search, /?s=, /?q=
  • Block staging/dev: Consider noindex meta tags instead
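
The first two patterns above differ by a single character, so it's worth seeing them side by side (comments in robots.txt start with #):

```
# Block all bots from the entire site
User-agent: *
Disallow: /

# Allow all bots everywhere
User-agent: *
Disallow:
```

An empty Disallow: value blocks nothing, which is why it means "allow all."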

Best Practices

  1. Test your robots.txt: Use the robots.txt report in Google Search Console (the standalone tester tool has been retired)
  2. Don't hide sensitive data: Robots.txt is public; use authentication instead
  3. Include sitemap URL: Helps search engines find your sitemap
  4. Keep it simple: Too many rules can cause confusion
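
You can also sanity-check generated rules locally before deploying. Python's standard library ships a Robots Exclusion Protocol parser; a quick check like this complements, but doesn't replace, testing in Search Console:

```python
from urllib.robotparser import RobotFileParser

robots_txt = """User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A disallowed path: compliant crawlers must not fetch it
print(parser.can_fetch("*", "https://example.com/admin/login"))
# An unlisted path: allowed by default
print(parser.can_fetch("*", "https://example.com/blog/post"))
```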

Use via MCP

Your AI agent can call this tool directly via Model Context Protocol — no HTTP code needed. Add TinyFn to Claude Desktop, Cursor, or any MCP client:

{
  "mcpServers": {
    "tinyfn-generate": {
      "url": "https://api.tinyfn.io/mcp/generate/",
      "headers": {
        "X-API-Key": "your-api-key"
      }
    }
  }
}

See all generator tools available via MCP in our Generator MCP Tools for AI Agents guide.

Try the Robots.txt Generator API

Get your free API key and start creating robots.txt files in seconds.

Get Free API Key
