Why I’m retooling my websites for AI, not search engines.
For thirty years, we have been writing for Search Spiders.
We optimized our headers, counted our keywords, and begged for backlinks—all to rank on the first page of Google. We were optimizing for Search.
The era of Search is transforming. An era of Synthesis has begun.
When the Internet took off, people saw a great new way to find answers to questions. No longer did it require cracking open a book; by using a personal computer, they could search for answers from home.
Google launched in 1998 and provided the best list of links to places where you were likely to find your answers. Other search engines followed suit and modeled their solutions after the market leader.
As you know, AI has given people a better way to find answers through conversations with Claude, ChatGPT, and Gemini. They don’t return lists of places you might find answers; they provide direct answers.
So what do people do? They do whatever is easiest and gets them answers faster.
Google knows this and has Gemini standing right out on the front porch, answering questions directly.
But Large Language Models (LLMs) work differently from search. They don’t look up answers; they calculate the probability of the best answers to deliver based on their training data.
Becoming visible to AI bots means shifting from SEO (Search Engine Optimization) to AEO (Answer Engine Optimization). SEO is not going away; it is growing up, leaving adolescence and entering adulthood.
So, with the introduction of AEO, the goal is no longer to get a click to a website where the answer can be found; it is to be the trusted source that the AI cites when answering questions.
My first foray into this transformation was to optimize AgileSymbiosis.com. It’s no longer just for humans; it exposes structured, machine-readable content at every entry point.
7 steps to make my site “AI-Visible.”
1. The “Cheat Sheet” (llm.txt)
In the old days (like yesterday), we built search-engine-optimized files like robots.txt and sitemap.xml. Today, we also need to speak the language of AI.
When an AI crawls your website, it has to wade through HTML, CSS, JavaScript, and marketing fluff to find the point. This introduces noise and friction.
Instead, give the bots a clean signal. You can see an example of one of these simple files at agilesymbiosis.com/llm.txt.
This file contains no code. It is a plain-text summary of my entire book, my bio, and the core thesis of the D.I.S.T. Framework, my work redesign methodology. It’s the “Executive Summary” written specifically for a machine’s context window.
So now, when someone asks ChatGPT about my book, the bot doesn’t have to guess; it can read the cheat sheet.
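For illustration, a minimal file of this kind might look like the sketch below. The headings and wording are mine, written for this article; they are not the contents of the actual file at agilesymbiosis.com/llm.txt:

```text
# Agile Symbiosis: AI Context

## Author
Michael Janzen

## Book
Agile Symbiosis, a book about the impact of AI on careers and the future of work.

## Core Thesis
Work should be redesigned around human-AI collaboration. The D.I.S.T. Framework
is the book's methodology for that redesign.

## Key Terms
- The Augmentation Tide
- The Automation Headwind
- The Augmentation Wager
- The Drudgery Tax
```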
2. The “Machine Door” (Hosted JSON)
One of the formats I’ve published Agile Symbiosis in is a Prompt-Native Application (PNA), a digital book format I created: a single JSON file containing the full manuscript and executable tools.
Instead of hiding this file behind a download wall, I hosted it openly at agilesymbiosis.com/agile-symbiosis.json.
It’s not easily read by humans, but it contains the full manuscript in a structure an AI can parse directly. An AI can read the entire book in seconds.
This gives Answer Engines direct, API-like access to the full source material. I am not forcing the AI to scrape a webpage; I am handing it the database in an AI-native format. This also reduces hallucinations by grounding the model in the book’s actual text.
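To make the idea concrete, here is a hypothetical skeleton of what a PNA file could contain. The field names are my invention for illustration; the real agile-symbiosis.json may be structured differently:

```json
{
  "format": "prompt-native-application",
  "title": "Agile Symbiosis",
  "author": "Michael Janzen",
  "manuscript": {
    "chapters": [
      { "number": 1, "title": "…", "body": "…" }
    ]
  },
  "tools": [
    { "name": "…", "description": "…", "prompt": "…" }
  ]
}
```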
3. The “Invisible Handshake” (HTML Header)
Just because the files exist doesn’t mean the bot knows where to look. I added a simple line of code to the <head> of my home page:
```html
<link rel="alternate" type="text/markdown" href="https://agilesymbiosis.com/llm.txt" title="AI Context" />
```
This acts as an invisible signpost. When a crawler hits my visual homepage, this tag whispers, “If you are a machine, the full-text version is right here.”
4. The “Identity Card” (Schema Markup)
AI models think in “Entities”—People, Books, Concepts—not keywords. If you want them to know who you are, you have to tell them.
I injected JSON-LD Schema markup into the site. This code explicitly defines:
- Person: Michael Janzen
- Book: Agile Symbiosis
- Relation: Author
Now, the AI doesn’t have to infer that I wrote the book based on text placement; it knows it as a structured fact.
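In schema.org vocabulary, that Person-Book-Author triple takes only a few lines. The snippet below is a minimal sketch; the markup on the live site may declare additional properties:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Book",
  "name": "Agile Symbiosis",
  "url": "https://agilesymbiosis.com",
  "author": {
    "@type": "Person",
    "name": "Michael Janzen"
  }
}
</script>
```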
5. The “Answer Unit” Strategy
I skipped creating the traditional book blog for Agile Symbiosis. AIs don’t care about my “thoughts on the industry.” They care about answering human questions as accurately as possible.
I replaced the blog with a Navigator’s Field Guide. Each article is structured as a specific Answer Unit targeting a high-probability query:
- Query: “Will AI replace software engineers?”
- Article: “The Short Answer is No. The Long Answer is…”
By structuring content as Question -> Direct Answer -> Nuanced Context, I increase the probability that an AI will pull my specific paragraph as the definitive answer for its user.
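One standard way to expose that Question -> Direct Answer structure to machines is schema.org’s FAQPage markup. The snippet below is an illustration built from the example query above, not necessarily what the Field Guide pages actually ship:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Will AI replace software engineers?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "The short answer is no. The long answer is more nuanced."
    }
  }]
}
</script>
```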
I will expand this library over time, just as I would for a blog, but it will focus entirely on questions people ask about the impact of AI on careers and the future of work. This builds context for AI crawlers and increases the accuracy of their responses.
6. Owning the Vocabulary
I coined many terms in Agile Symbiosis, not by preference, but because these forces impacting our jobs had yet to be named.
If you don’t define your terms, the AI will be forced to invent plausible nonsense as it attempts to define concepts on the fly.
In my llm.txt and Field Guide, I explicitly define this vocabulary:
- The Augmentation Tide
- The Automation Headwind
- The Augmentation Wager
- The Drudgery Tax
- The D.I.S.T. Framework
Now, when a user asks, “What is the Augmentation Tide?”, the AI doesn’t need to invent something; it can quote my definition.
7. The “Bot-First” Sitemap
Search and AI crawlers have a “crawl budget,” so they index only a limited number of pages at a time. I updated my sitemap.xml to prioritize the AI files (llm.txt, agile-symbiosis.json) above my legal pages and contact forms.
I am literally telling the crawler: “Read the book first.”
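In the standard sitemap protocol, that prioritization looks something like the sketch below. The URLs and priority values here are illustrative, and crawlers treat the priority element as a hint rather than a command:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://agilesymbiosis.com/llm.txt</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://agilesymbiosis.com/agile-symbiosis.json</loc>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>https://agilesymbiosis.com/contact</loc>
    <priority>0.2</priority>
  </url>
</urlset>
```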
We are also in a transition phase: website owners cannot yet submit llm.txt files directly to the Answer Engines (Gemini, Claude, ChatGPT), but those engines are already looking for these files and formats. For now, make the content visible as described above, open your robots.txt file to AI crawlers, and surface the key content in your <meta> tags and sitemap.xml file.
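Opening robots.txt to AI crawlers can be as simple as the sketch below. The user-agent strings shown (GPTBot, ClaudeBot, Google-Extended) are names these vendors have published, but verify them against each vendor’s current documentation before relying on them:

```text
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://agilesymbiosis.com/sitemap.xml
```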
The Verdict
This is the Augmentation Wager applied to marketing.
If you continue to build for legacy search spiders, you are practicing a soon-to-be-lost art. If you build for AI Answer Engines, you are building for how information will be accessed moving forward.
Stop building for the blue links on page one. Start building to be the Answer at the top of the page.
Update: April 14, 2026
I’ve gone a few steps further now and coded a plugin for the WordPress platform that you can install to automate your site’s Answer Engine Optimization. It’s called AEO Pugmill.com.