---
title: Making Craft CMS AI-Ready — What GEO Actually Means in Practice
date: 2026-04-10T11:02:00+02:00
author: admin
canonical_url: "https://craft-kit.dev/blog/making-craft-cms-ai-ready-what-geo-actually-means-in-practice"
section: Blog
---
Everyone's talking about Generative Engine Optimization. Most articles explain the same three concepts and call it a day. So instead of writing another overview, I want to share what I actually did to make Craft Kit AI-agent ready — and be honest about what matters and what doesn't.

## The one thing that actually works: serving Markdown

LLMs understand text. Not HTML — text. A typical HTML page is full of navigation, scripts, SVGs, cookie banners, and a hundred other things an AI agent has to filter through just to get to your content. Markdown cuts all of that out.

The good news: there's already a standard for this. AI agents like Claude Code send an HTTP `Accept` header that includes `text/markdown`. If your server responds accordingly, the agent gets a clean Markdown version of your page — same URL, no separate route, no magic.
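To make the negotiation concrete, here's a minimal sketch of the decision an endpoint has to make: parse the `Accept` header's quality values and check whether the client ranks `text/markdown` at least as high as HTML. This is a hypothetical helper for illustration, not the LLM Ready plugin's actual code, and the q-value parsing is deliberately simplified.

```python
def prefers_markdown(accept_header: str) -> bool:
    """Return True when the Accept header ranks text/markdown at least
    as high as text/html (simplified q-value parsing, illustration only)."""
    weights = {}
    for part in accept_header.split(","):
        pieces = part.strip().split(";")
        media_type = pieces[0].strip()
        q = 1.0  # per HTTP semantics, a missing q parameter defaults to 1
        for param in pieces[1:]:
            key, _, value = param.strip().partition("=")
            if key.strip() == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        weights[media_type] = q
    md = weights.get("text/markdown", 0.0)
    html = max(weights.get("text/html", 0.0), weights.get("*/*", 0.0))
    return md > 0.0 and md >= html
```

A browser sending `text/html,application/xhtml+xml,*/*;q=0.8` gets HTML as usual; an agent sending `text/markdown` gets the Markdown version at the same URL.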

Craft Kit now supports this out of the box via the LLM Ready plugin. Every page automatically has a Markdown version, and the Accept header is handled for you. Token usage for agents drops significantly. Content is easier to process. Everyone wins.

## llms.txt — useful, but not what you think

There's a lot of hype around llms.txt. The idea is simple: a file at your site root that acts as an AI-friendly sitemap, linking to Markdown versions of your pages with short descriptions.
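For illustration, following the format described by the llms.txt proposal (an H1 title, a blockquote summary, then sections of annotated links), an auto-generated file might look like this. The URLs and descriptions here are placeholders, not the actual craft-kit.dev index:

```markdown
# Craft Kit

> Starter kit for building AI-agent-ready Craft CMS sites.

## Blog

- [Making Craft CMS AI-Ready](https://craft-kit.dev/blog/making-craft-cms-ai-ready.md): What GEO actually means in practice
```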

Here's the honest reality: no major AI provider officially uses it. Google's John Mueller said it plainly in mid-2025 — no AI system currently reads llms.txt. An audit of 1,000 domains confirmed that LLM-specific bots like ClaudeBot or GPTBot don't fetch it at all.

That said, I kept it in Craft Kit — and here's why. The cost is near zero when it's auto-generated. For developer tooling and documentation sites like craft-kit.dev, it's the exact use case where it could eventually matter. And if adoption does pick up, you're already set.

What I'd avoid: manually maintaining an llms.txt, treating it as an SEO lever, or including pages like Imprint and Privacy Policy in the index. None of that helps anyone.

## Recycling SEO fields for GEO

One small practical win: the LLM Ready plugin lets you define which field to use as the description in your llms.txt entries. Instead of adding a new field, I pointed it at the existing SEO meta description. It's already written, already concise, and already describes the page accurately.

Less duplication, more signal.

## What I skipped

A few things I looked at and decided against:

**`<script type="text/llms.txt">`** — a Vercel proposal that embeds LLM-facing content directly in the HTML via a custom script type (browsers ignore script types they don't recognize, so nothing executes). Clever idea, but browsers still load the full HTML and no AI provider actively reads it. Not worth the effort.
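For reference, the shape of the proposal is roughly this; the snippet below is an illustrative sketch based on the public proposal, with placeholder content:

```html
<script type="text/llms.txt">
# Page title

Key page content, written in Markdown, intended for LLMs rather than browsers.
</script>
```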

**A separate /docs section in Markdown** — some sites maintain full parallel documentation in Markdown. For a demo project like Craft Kit this would be overkill. The per-page Markdown generation covers it.

## The short version

If you want your Craft CMS site to be AI-agent ready:

1. Serve Markdown on `Accept: text/markdown` — this is the only thing with real, confirmed adoption right now
2. Add an llms.txt as an auto-generated link index — low effort, no downside
3. Use your existing SEO descriptions as LLM descriptions
4. Keep focusing on good SEO and accessibility — if a screen reader struggles with your content, so will an AI

**Craft Kit has all of this set up out of the box. One less thing to figure out.**
