# AI & LLMs

Integrate AI functionality into Fumadocs.
## Docs for LLM
You can make your docs site more AI-friendly with dedicated docs content for large language models.
To begin, make a `getLLMText` function that converts pages into static MDX content.
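As a sketch of the idea before wiring in the Fumadocs APIs, such a function flattens a page's title, URL, and body into one self-describing Markdown string (the `Page` shape and `toLLMText` name here are hypothetical, for illustration only):

```ts
// Hypothetical page shape, for illustration only.
interface Page {
  url: string;
  title: string;
  content: string; // processed Markdown body
}

// Flatten a page into a single Markdown document: the heading keeps the
// title and canonical URL so an LLM can identify and cite the page.
function toLLMText(page: Page): string {
  return `# ${page.title} (${page.url})\n\n${page.content}`;
}

const text = toLLMText({
  url: '/docs/ui',
  title: 'Quick Start',
  content: 'Get started with Fumadocs UI.',
});
// text === '# Quick Start (/docs/ui)\n\nGet started with Fumadocs UI.'
```

The real implementation below does the same thing, except the body comes from Fumadocs MDX's processed output.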
In Fumadocs MDX, you can do:

```ts
import { source } from '@/lib/source';
import type { InferPageType } from 'fumadocs-core/source';

export async function getLLMText(page: InferPageType<typeof source>) {
  const processed = await page.data.getText('processed');

  return `# ${page.data.title} (${page.url})
${processed}`;
}
```

It requires `includeProcessedMarkdown` to be enabled:
```ts
import { defineDocs } from 'fumadocs-mdx/config';

export const docs = defineDocs({
  docs: {
    postprocess: {
      includeProcessedMarkdown: true,
    },
  },
});
```

### llms-full.txt

A version of the docs for AIs to read.
```ts
import { source } from '@/lib/source';
import { getLLMText } from '@/lib/get-llm-text';

// cached forever
export const revalidate = false;

export async function GET() {
  const scan = source.getPages().map(getLLMText);
  const scanned = await Promise.all(scan);

  return new Response(scanned.join('\n\n'));
}
```

### *.mdx

Allow AI agents to get the content of a page as Markdown/MDX by appending `.mdx` to the end of the path.
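The URL scheme is simply the public page path plus a `.mdx` suffix, mapped onto an internal route. A minimal sketch of that mapping (`toLLMRoute` is a hypothetical helper, not a Fumadocs API):

```ts
// Map a public "/docs/...*.mdx" request to a hypothetical internal
// "/llms.mdx/..." route that serves the raw Markdown; return null for
// paths that should be handled as normal pages.
function toLLMRoute(pathname: string): string | null {
  const match = pathname.match(/^\/docs\/(.+)\.mdx$/);
  return match ? `/llms.mdx/${match[1]}` : null;
}

toLLMRoute('/docs/ui/search.mdx'); // → '/llms.mdx/ui/search'
toLLMRoute('/docs/ui/search'); // → null (normal page, not rewritten)
```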
Make a route handler to return page content, and a rewrite rule in your Next.js config to point to it:
```ts
import { getLLMText } from '@/lib/get-llm-text';
import { source } from '@/lib/source';
import { notFound } from 'next/navigation';

export const revalidate = false;

export async function GET(
  _req: Request,
  { params }: RouteContext<'/llms.mdx/[[...slug]]'>,
) {
  const { slug } = await params;
  const page = source.getPage(slug);
  if (!page) notFound();

  return new Response(await getLLMText(page), {
    headers: {
      'Content-Type': 'text/markdown',
    },
  });
}

export function generateStaticParams() {
  return source.generateParams();
}
```

```ts
import type { NextConfig } from 'next';

const config: NextConfig = {
  async rewrites() {
    return [
      {
        source: '/docs/:path*.mdx',
        destination: '/llms.mdx/:path*',
      },
    ];
  },
};

export default config;
```

### Accept
To serve the Markdown content to AI agents instead, you can leverage the `Accept` header.
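Content negotiation here means checking whether the client ranks `text/markdown` above `text/html` in its `Accept` header. A simplified, illustrative check (it only compares positions and ignores q-values; `prefersMarkdown` is my own name, not the actual implementation of `isMarkdownPreferred`):

```ts
// Decide whether a request prefers Markdown over HTML based on the order
// of media types in its Accept header. Real Accept parsing also honours
// q-values; this sketch only compares the positions of the two types.
function prefersMarkdown(accept: string): boolean {
  const types = accept.split(',').map((t) => t.split(';')[0].trim());
  const md = types.indexOf('text/markdown');
  const html = types.indexOf('text/html');
  return md !== -1 && (html === -1 || md < html);
}

prefersMarkdown('text/markdown, text/html'); // → true
prefersMarkdown('text/html,application/xhtml+xml'); // → false
```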
```ts
import { NextRequest, NextResponse } from 'next/server';
import { isMarkdownPreferred, rewritePath } from 'fumadocs-core/negotiation';

const { rewrite: rewriteLLM } = rewritePath('/docs/*path', '/llms.mdx/*path');

export default function proxy(request: NextRequest) {
  if (isMarkdownPreferred(request)) {
    const result = rewriteLLM(request.nextUrl.pathname);

    if (result) {
      return NextResponse.rewrite(new URL(result, request.nextUrl));
    }
  }

  return NextResponse.next();
}
```

### Page Actions
Common page actions for AI. They require the `*.mdx` endpoint above to be implemented first.

```bash
npx @fumadocs/cli add ai/page-actions
```

Use it in your docs page like:
```tsx
<div className="flex flex-row gap-2 items-center border-b pt-2 pb-6">
  <LLMCopyButton markdownUrl={`${page.url}.mdx`} />
  <ViewOptions
    markdownUrl={`${page.url}.mdx`}
    githubUrl={`https://github.com/${owner}/${repo}/blob/dev/apps/docs/content/docs/${page.path}`}
  />
</div>
```

## Ask AI

You can install the AI search dialog using Fumadocs CLI:
```bash
npx @fumadocs/cli add ai/search
```

You can add the trigger component to your root layout.
### AI Model
By default, it's configured for Inkeep AI using the Vercel AI SDK. Update the configuration in `useChat` and the `/api/chat` route to connect to your own AI model instead.
Note that Fumadocs doesn't provide the AI model; which one to use is up to you.

Your AI model can use the `llms-full.txt` file generated above, or more diversified sources of information when combined with third-party solutions.
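For instance, because each page emitted by `getLLMText` starts with a `# Title (url)` heading, a retrieval layer can split `llms-full.txt` back into per-page chunks to embed or rank individually. A hedged sketch (`splitCorpus` is a hypothetical helper, not a Fumadocs API; it naively assumes body lines never match the heading pattern):

```ts
// Split an llms-full.txt corpus (pages joined with blank lines, each
// starting with "# Title (/url)") back into per-page chunks.
function splitCorpus(
  corpus: string,
): { title: string; url: string; body: string }[] {
  const pages: { title: string; url: string; body: string }[] = [];
  // Page headings produced by getLLMText look like: "# Title (/docs/url)"
  const heading = /^# (.+) \((\/[^)]*)\)$/;

  let current: { title: string; url: string; body: string } | null = null;
  for (const line of corpus.split('\n')) {
    const m = line.match(heading);
    if (m) {
      current = { title: m[1], url: m[2], body: '' };
      pages.push(current);
    } else if (current) {
      current.body += (current.body ? '\n' : '') + line;
    }
  }
  return pages.map((p) => ({ ...p, body: p.body.trim() }));
}

const chunks = splitCorpus(
  '# Quick Start (/docs/ui)\n\nGet started.\n\n# CLI (/docs/cli)\n\nUse the CLI.',
);
// chunks[0] → { title: 'Quick Start', url: '/docs/ui', body: 'Get started.' }
```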