Paste a domain. We crawl the Terms of Service, parse it with AI, and output a valid machine-readable permissions file.
1. Save the file as openterms.json
2. Serve it with Content-Type: application/json
3. Optionally reference it in your robots.txt
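The steps above produce a file along these lines. This is an illustrative sketch only; the field names are assumptions, not the official v0.3.0 schema:

```json
{
  "version": "0.3.0",
  "permissions": {
    "scrape_data": "allow",
    "automated_purchase": "deny"
  },
  "rate_limits": { "requests_per_minute": 60 },
  "jurisdiction": "US-CA"
}
```

The robots.txt reference in step 3 could be a single comment or directive line pointing at the file's URL, depending on what the spec standardizes.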
Billions of AI agents are being deployed to browse, buy, post, and transact. None of them can read a terms-of-service agreement. They don't know what's allowed.
Robots.txt told crawlers where they could go. OpenTerms tells agents what they can do.
A standardized protocol that translates terms of service, privacy policies, and usage restrictions into structured data any AI agent can parse.
Probes common paths and scans the homepage for ToS and Privacy Policy links.
Fetches both pages and strips HTML, preserving the legal content for analysis.
Maps each clause to the v0.3.0 schema — permissions, data handling, rate limits, jurisdiction.
Output is validated against the JSON Schema. Copy, download, and deploy.
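Before deploying, you can sanity-check the file yourself. A minimal sketch in pure Python, assuming `version` and `permissions` are required fields (the real generator validates against the published v0.3.0 JSON Schema, e.g. with a schema-validation library):

```python
import json

# Assumed required top-level fields; the official schema is authoritative.
REQUIRED_KEYS = {"version", "permissions"}

def quick_validate(raw: str) -> list[str]:
    """Return a list of problems; an empty list means the file looks sane."""
    try:
        doc = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"not valid JSON: {e}"]
    problems = []
    missing = REQUIRED_KEYS - doc.keys()
    if missing:
        problems.append(f"missing keys: {sorted(missing)}")
    if not isinstance(doc.get("permissions", {}), dict):
        problems.append("permissions must be an object")
    return problems

print(quick_validate('{"version": "0.3.0", "permissions": {"scrape_data": "allow"}}'))
# -> []
```

This catches the two most common deployment mistakes (invalid JSON, missing top-level keys) without any dependencies.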
A few lines of Python. Your agent knows what it's allowed to do before it acts.
import openterms

# Check what's allowed before acting
result = openterms.check("acme.com", "scrape_data")

if result:
    # Permission granted, proceed
    data = fetch_data("https://acme.com/api/items")
else:
    # Blocked: result.decision is "deny" or "not_specified"
    print(f"Cannot scrape acme.com: {result.decision}")

# Generate a compliance receipt (local, no server call)
receipt = openterms.receipt("acme.com", "scrape_data", result.decision)
Works with LangChain, CrewAI, AutoGen, or any Python agent. pip install openterms-py
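One way to wire this into any framework is to gate each tool behind a permission check. A minimal sketch of that pattern: `check_permission` below is a stand-in stub for `openterms.check`, and `fetch_items` is a hypothetical tool; swap in the real call once the package is installed.

```python
from functools import wraps

def check_permission(domain: str, action: str) -> bool:
    # Stub: a real implementation would fetch and parse the domain's
    # openterms.json via the openterms-py package.
    allowed = {("acme.com", "scrape_data")}
    return (domain, action) in allowed

def requires_permission(domain: str, action: str):
    """Decorator that blocks a tool call unless the action is permitted."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if not check_permission(domain, action):
                raise PermissionError(f"{action} not permitted on {domain}")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@requires_permission("acme.com", "scrape_data")
def fetch_items():
    return ["item-1", "item-2"]  # placeholder for a real API call

print(fetch_items())
```

Because the check happens in the decorator, the agent framework never sees a tool it isn't allowed to run to completion.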
OpenTerms is an open protocol. The spec is public. The generator is free. Adoption is the only thing that matters.
Get early access to the compliance API and SDK.