1-Minute Summary

The Past: SEO focused on keywords.

The Present: Google’s Knowledge Graph and MUM understand a website’s Entities (topics) through multimedia and user actions both on and off that website. Optimizing your website’s structure, content, and conversion flow for these Entities is crucial.

The Future: This Entity-based SEO produces enduring SEO performance, even as search engines evolve, because Entities represent the durable real world. Keyworded media subordinated to a website’s main Entities can be produced by LLMs at 50x human speed, slashing SEO costs and thus rocketing ROI. Technical SEO will still require human pros for at least 3 years, probably 5, maybe 10.

Bonus: “Generative Engine Optimization” (GEO) is mostly rebranding of traditional SEO. Gaming LLMs with Q&A citing your website is likely futile. 

Overview: Search Engines Humanized

Designing your website’s overall “SEO Architecture & Writing” (SAW) will make the site rank well in perpetuity. Here’s why.

AI makes search engines work like collective human brains that think about things and not merely strings of text. Google’s Entity model of search understands topics as composites of all kinds of media and human actions. Imagine that all the people in your target markets are one mind, remembering everything related to your brand: as in human brains, keywords are now a small part of what drives that big AI brain. 

Therefore, a crucial and mostly one-time SEO task today is structuring your website content and conversion flow to reflect the Entities (which are like topics) pertinent to your market. This work precedes and, for smaller budgets, can replace most of SEO’s laborious keywording. In other words, starting in 2022, the advice of a great adman from any decade would produce a compelling flow of intent through your site and to your bottom line.

Although Google has completed only ~30% of its Knowledge Vault’s “KGMIDs” (Entity coding), the hierarchical connections within that 30% define most if not all of the main Entities addressed by most websites. Where Google has not yet established more specific sub-sub-Entities, traditional SEO keyword writing will still help pages rank and will connect to future, more granular Entities.
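
For readers who want to see which KGMIDs already cover their main topics, Google exposes them through its public Knowledge Graph Search API. Below is a minimal sketch of such a lookup in Python; the API key and the example queries are placeholders you would replace with your own.

```python
# Minimal sketch: look up Google Knowledge Graph IDs (KGMIDs / MIDs) for a
# site's main topics via the public Knowledge Graph Search API.
# "YOUR_API_KEY" and the example topics are placeholders, not real values.
import requests

API_KEY = "YOUR_API_KEY"  # obtained from the Google Cloud Console
ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"

def lookup_entities(query: str, limit: int = 3) -> list[dict]:
    """Return candidate Knowledge Graph entities (with their MIDs) for a query."""
    params = {"query": query, "key": API_KEY, "limit": limit, "languages": "en"}
    response = requests.get(ENDPOINT, params=params, timeout=10)
    response.raise_for_status()
    results = []
    for item in response.json().get("itemListElement", []):
        entity = item.get("result", {})
        results.append({
            "mid": entity.get("@id"),       # the KGMID, e.g. a "kg:/m/..." identifier
            "name": entity.get("name"),
            "types": entity.get("@type", []),
            "score": item.get("resultScore"),
        })
    return results

if __name__ == "__main__":
    for topic in ["door locks", "smart home security"]:  # example site topics
        print(topic, "->", lookup_entities(topic))
```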

History: Google’s 2021+ MUM Begins This SEO Revolution

Now in “SEO Architecture & Writing” (SAW), the Writing includes any website content, not merely text. This brief history of the underlying tech explains:

  • The Multitask (and multimodal) Unified Model (MUM), introduced in 2021, leverages advancements in large language models (LLMs) and AI to understand the relationships between different types of information, not just text. MUM builds upon Google’s earlier work with the Knowledge Graph (Introducing the Knowledge Graph: things, not strings, 2012), which aimed to relate real-world Entities with one another, as humans do. MUM and later LLMs required the increased parallel processing power of graphics processing units (GPUs) and their tensor-optimized cores, helping make Nvidia a trillion-dollar titan.

By roughly mid-2022, SEO firms should have pivoted from keyword-based to Entity-based strategies. However, years of huge investments in keyword tools and procedures continue to motivate most SEO firms and tool suites to sell obsolete solutions.  

Now: Uniting Entity and LLM Models to Replace Experts

Entity-based knowledge graphs and LLMs are fundamentally different. Google and other big tech firms are now racing to integrate the two. Knowledge graphs are highly structured, showing explicit relationships among key nodes. Picture the public transit map of a big city. LLMs, on the other hand, are more free-form, like a brain’s sprawling, tangled network of nerve connections, wherein topics are related in fine gradations of similarity (via numbers in a vector database).
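
To make that contrast concrete, here is a toy sketch: a few hand-made graph triples next to a few invented embedding vectors compared by cosine similarity. The brand, the numbers, and the topics are illustrative only.

```python
# Toy contrast between the two models of relatedness described above.
# The embedding vectors are invented for illustration; real ones come from an
# embedding model and have hundreds or thousands of dimensions.
import math

# 1) Knowledge-graph style: explicit, labeled edges (like a transit map).
knowledge_graph = {
    ("door lock", "is_a", "security hardware"),
    ("door lock", "sold_by", "Acme Locks"),        # hypothetical brand
    ("security hardware", "related_to", "home security"),
}

# 2) LLM / vector style: graded similarity between embeddings (like tangled nerves).
embeddings = {
    "door lock":     [0.91, 0.10, 0.33],
    "home security": [0.80, 0.25, 0.41],
    "lavender":      [0.05, 0.95, 0.12],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

print(cosine(embeddings["door lock"], embeddings["home security"]))  # high: closely related
print(cosine(embeddings["door lock"], embeddings["lavender"]))       # low: barely related
```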

Uniting the two databases entails Retrieval-Augmented Generation (RAG), a technique essential for creating GPTs tailored to specific fields of expertise or to a firm’s cloud. RAG marries knowledge graphs, or any separate body of information, with LLM processes to achieve the Holy Grail of enterprise computing: replacing costly executives with AI. For example, a specialized GPT can integrate all state laws with ChatGPT to create a legal GPT assistant that replaces paralegals and probably plenty of junior attorneys as well.
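
A minimal sketch of the RAG loop described above, assuming placeholder embed() and generate() functions that stand in for whatever embedding model and LLM an implementation actually uses:

```python
# Minimal RAG sketch: retrieve the most relevant stored documents, augment the
# prompt with them, then generate. embed() and generate() are placeholders.
from typing import Callable

def rag_answer(
    question: str,
    documents: list[str],            # e.g. statutes pulled from a knowledge graph or firm cloud
    embed: Callable[[str], list[float]],
    generate: Callable[[str], str],
    top_k: int = 3,
) -> str:
    # 1) Retrieve: rank stored documents by similarity to the question.
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    q_vec = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(embed(d), q_vec), reverse=True)

    # 2) Augment: paste the best documents into the prompt as grounding context.
    context = "\n\n".join(ranked[:top_k])
    prompt = (
        "Answer using only the context below. Cite the passages you rely on.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

    # 3) Generate: the LLM answers from retrieved facts, not just its pre-training.
    return generate(prompt)
```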

Replacement of experts and managers requires that LLMs (1) move far beyond current mediocre math and sequential reasoning, and (2) be allowed to use “Agents” to take actions within workflows. Such expert reasoning combined with freedom to act remains 3 to 7 years away. Until then, technical SEO requires human pros. Professional tasks that depend on precise or near-perfect output in some steps, as in technical SEO, require training and supervision proportionate to the number of such steps. Of course the same is true for middle managers and new top executives. SAW, law, and even large parts of coding and CPA work have plenty of tasks for which a range of outputs works well.

Important Caveat: AI/LLM replacement of experts and managers requires that most if not all expert information and company cloud content, respectively, be available to RAG. For example, because top medical journals aren’t free, retail-level medical GPTs buy limited, incomplete access, meaning that a truly good doctor can do better. On the other hand, many professions, like SEO writing, law, and graphic design, follow established public rules or guidelines, so dedicated GPTs can replace a big portion of billable time. Professional firms with integrity will tell you this, and price services accordingly.

The Future: The Durability of Entity Classifications

Chats with LLMs, like those linked below, point out that Entity databases such as knowledge graphs may change substantially. However, though LLMs’ algorithms and pre-trained databases will change, Entities refer to enduring things in the real world (shoes, Tolstoy, lavender, anything) as well as to abstract yet enduring concepts like democracy, expensive, brilliant, zero, and evolution. So, except in new and highly specialized fields for which Entity classification is not yet established, a website’s Entity structure can be designed now to endure for many years.

In July 2024, top LLMs agreed with the SEO durability thesis of this blog post. They point out the importance of separate technical SEO; good SEO firms have long known that a sound technical SEO foundation must precede all subsequent content optimization. The LLMs also emphasize that keywording remains important; however, SEO firms should now use Entity research tools that first ascertain a business’s knowledge graph and then organize keyword clusters under each main Entity (offering and market segment), as sketched below. SEO GPTs can then produce market-focused, keyworded content at 50x human speed. For details, see this ChatGPT thread (start about halfway down, though the first half elucidates core tech) and Gemini’s (Claude 3 reiterated those two).
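
Here is the kind of Entity-first clustering step that workflow implies, as a minimal sketch: each researched keyword is assigned to the nearest main Entity by embedding similarity. The embed() helper is a placeholder for a real embedding model, and the example entities are invented.

```python
# Sketch of Entity-first keyword clustering. embed() is a placeholder that
# returns a vector for any phrase; a real embedding model would supply it.
from collections import defaultdict

def cluster_keywords(
    main_entities: list[str],   # e.g. ["smart door locks", "commercial access control"]
    keywords: list[str],        # raw keyword-research output
    embed,                      # phrase -> list[float]
) -> dict[str, list[str]]:
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    entity_vecs = {e: embed(e) for e in main_entities}
    clusters: dict[str, list[str]] = defaultdict(list)
    for kw in keywords:
        kw_vec = embed(kw)
        best = max(main_entities, key=lambda e: cosine(entity_vecs[e], kw_vec))
        clusters[best].append(kw)   # keyword subordinated to its main Entity
    return dict(clusters)

# Each cluster then becomes the brief for LLM-drafted, keyworded pages under that Entity.
```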

Details for Data Scientists: Some firms are now pioneering the use of LLMs to build entirely new structured knowledge graphs, but any new graph will still refer to the same real world that a website does. For more about this, see Neo4j Finds the Vector for Graph-LLM Integration and Unifying LLMs & Knowledge Graphs for GenAI: Use Cases & Best Practices.
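
As a rough illustration of that idea, the sketch below uses a placeholder llm_extract_triples() function (standing in for a prompt such as “return the subject, relation, object triples in this text”) and assembles the results into a simple graph; a production pipeline would add steps such as entity deduplication and schema constraints before loading into a graph database.

```python
# Sketch of using an LLM to build a structured knowledge graph from site content.
# llm_extract_triples() is a placeholder for an LLM call that returns
# (subject, relation, object) triples found in a page of text.
from typing import Callable

Triple = tuple[str, str, str]

def build_graph(
    pages: list[str],
    llm_extract_triples: Callable[[str], list[Triple]],
) -> dict[str, list[tuple[str, str]]]:
    graph: dict[str, list[tuple[str, str]]] = {}
    for text in pages:
        for subject, relation, obj in llm_extract_triples(text):
            graph.setdefault(subject, []).append((relation, obj))
    return graph

# The resulting adjacency structure can then be loaded into a graph database
# (e.g. Neo4j) and queried alongside vector search, per the articles above.
```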

Is “GEO” Mostly Marketing Spin?

Now some SEO firms use the term “Generative Engine Optimization” (GEO) to attract clients. However, GEO entails what good SEO firms have done for many years. Indeed, the neural-net connections implicit in both LLMs and search engines’ knowledge graphs mirror PageRank-like signals such as interlinking, reviews, mentions, conversion rates, subsequent off-site actions, and other measures of helpfulness and legitimacy. So the SEO-CRO approach explained in this post encompasses “GEO.”
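
For readers unfamiliar with how link-weighted authority works, here is a toy PageRank power iteration over an invented interlinking graph; real engines blend many more signals, as listed above.

```python
# Toy PageRank power iteration over an invented interlinking graph, to make the
# "PageRank-like" idea above concrete. Pages and links are made up.
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: spread its weight evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

links = {"home": ["locks", "blog"], "locks": ["home"], "blog": ["locks"]}  # invented site
print(pagerank(links))  # pages linked to more heavily earn more authority
```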

One potential new SEO tactic emerges from the possibility that a user’s interactions with an LLM could be incorporated into future updates of the core pre-trained database. (They aren’t incorporated between updates, a problem RAG aims to reduce.) For instance, if an LLM user frequently mentions a website that lets shoppers filter door locks by feature, the LLM might later recommend that website to anyone asking what kind of lock to buy. Such strenuous manipulation would likely prove futile, because a single user or group of users is a spit in the ocean of LLMs’ vast data updates. Still, this tactic is worth future investigation.

Sources and Credits

DISC’s 3-part series on AI/LLMs (17-minute read time), or equivalent knowledge, will help you follow this post.

Writing came from DISC’s decades of web marketing R&D and practice. LLMs were used only to verify some statements and to help with basic edits for clarity.