How AI is Shaping the Future of SEO

The SEO team at Socium Media is proactively engaging with the AI trends that are reshaping the digital marketing industry. Here’s how we’re helping our clients stay ahead of the AI curve.

What Have AI and LLMs Shown Us So Far?

  • AI citations are coming from deep internal links.
  • We’ve found that blog posts, product pages, and FAQs tend to be cited most often in Google’s AI Overviews and in searches within LLMs.
    • Originally, it was mostly blogs and FAQs when the early LLMs were more informational.
    • Around April 2025, products began appearing in AI results. Some LLMs even began including buy now buttons in their output.
    • In ChatGPT, we also began seeing local listings with a map feature. Local businesses and restaurants appear to be ranked based on the relevance of the homepage and the Google Business Profile.
      • For example, restaurants that had “brunch” mentioned on the homepage or in their Google Business Profile were ranking for “best Mother’s Day brunch in (location)” queries.
  • AI results tend to favor pages that directly answer a question or fulfill a specific intent.

What AI Considerations Is Our SEO Team Prioritizing?

  • Human-written content that answers common questions clearly and directly.
  • FAQ sections with concise and useful answers after each question.
  • Longer-form blog content targeting long-tail, informational keywords.
  • Expanding structured data to help LLMs and search engines understand the content layout.
    • Our team has been implementing more specific schemas to help AI parse the website’s data.
    • We still use the “standard” Product, Organization, BreadcrumbList, and BlogPosting/Article schemas, but we’re finding that additional schema markup is helping with AI results (a minimal example follows this list).
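
To make this concrete, here is a minimal sketch of the kind of schema markup we mean, using the standard Schema.org FAQPage type. The question, answer, and wording are placeholders, not client content.

```html
<!-- Minimal, illustrative FAQPage markup (JSON-LD). The question and answer
     text are placeholders; swap in the actual Q&A content from your page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What should I pack for a weekend hiking trip?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Pack layered clothing, two to three liters of water per day, trail snacks, a first-aid kit, and a headlamp."
      }
    }
  ]
}
</script>
```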

What Are We Testing?

  • We started building out Travel Guide landing pages for an outdoor brand client of ours. These pages contain travel tips and recommendations for places to eat, drink, stay, and hike, climb, or bike. The idea is to meet informational intent while subtly promoting products and reaching potential buyers in the research phase.
    • It’s still too early to report results, but the content structure is optimized for AI visibility.
    • We’ve found directories like these also work well in traditional SEO.
  • We will soon begin testing Model Context Protocol (MCP).
    • This may help increase visibility in AI search results and provide LLMs with more guidance on the content within a website.

What Are We Currently Recommending For Better AI Results?

  • Build useful informational pages that answer specific questions. 
  • Use clear question-based subheadings and answer the question in the first sentence.
  • Add FAQ sections to blog posts and landing pages to target additional questions, and link each answer to a full blog post that covers it more thoroughly (see the markup sketch after this list).
  • Include tips, quotes, or proprietary data to make content more unique.
    • Proprietary data or quotes may be a ranking factor in LLMs, according to data we’ve seen from Neil Patel. That would make sense, since it’s original, useful content.
  • Keep pages updated with current info, awards, and seasonal content.
    • Recency of content (i.e., publication date) is also considered an LLM ranking factor.
  • Use schema markup across the board for everything that is applicable to your content.
    • This is not only recommended in Google’s developer documentation, but it also appears to be what LLMs are currently using to understand content more efficiently.
    • Don’t be afraid to try experimental schema types that aren’t yet “official” in the Schema.org documentation.
    • We expect MCP (Model Context Protocol) to become the new standard; it’s essentially a next-generation form of structured data built specifically for LLMs (see below).
  • Make sure pages are crawlable, lightning fast, and easy to parse.
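
Below is a rough sketch of the page structure these recommendations describe: a question-based subheading answered in the first sentence, plus a short FAQ that links out to a fuller post. All headings, copy, and URLs are placeholders, not a real client page.

```html
<!-- Illustrative structure only: a question-based subheading with the answer
     in the first sentence, followed by an FAQ entry that links to a more
     thorough post. Headings, copy, and the /blog/... URL are placeholders. -->
<article>
  <h2>How do I choose a weekend hiking route?</h2>
  <p>Choose a route that matches your fitness level, then check trail
     conditions, elevation gain, and water sources before committing.</p>

  <section>
    <h2>FAQs</h2>
    <h3>What gear do I need for a day hike?</h3>
    <p>Sturdy shoes, layers, water, snacks, and a map. See our
       <a href="/blog/day-hike-packing-list">full day-hike packing list</a>
       for a complete breakdown.</p>
  </section>
</article>
```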

What’s Next For AI and SEO?

LLMs.txt is a proposed standard: a plain-text file at your site’s root that gives AI models guidance on which of your content to use (or skip), similar in spirit to how robots.txt guides search engine crawlers.

  • At this point, most major AI platforms like Google and OpenAI don’t recognize or follow LLMs.txt rules.
  • From the data we’ve seen so far, there is no real impact on visibility, rankings, or how content is used in AI-generated results.
  • It also requires root-level file access. For security reasons, many hosting platforms don’t support this type of access.
  • Recommendation: We don’t suggest spending time implementing LLMs.txt at this point, since it doesn’t provide a measurable benefit for improving your AI presence.

Model Context Protocol, or MCP, helps AI tools better understand the structure of your website, what types of content you have, and how different pages are connected with one another.

  • It allows you to define key areas of your website, like product pages, blog posts, service offerings, and calls to action.
  • This can help AI models display your content more accurately in search experiences that rely on summarization and recommendations.
  • A static MCP file can be added to your site even if your CMS doesn’t support dynamic endpoints.
  • Recommendation: It’s still early, but we are exploring this further. Claude and other LLMs seem to be recognizing MCP, and we expect it to become more popular.
  • You can learn more about MCPs on the Anthropic or Model Context Protocol website.

Ricky Weiss