I built a directory website for AI tools. Not because the world needed another one, but because I wanted to understand how directory sites work from the inside. I had been paying to list my own product on directory sites for months. Some of those listings brought traffic. Most did not. I wanted to know why.
ToolIndex.net is the result of that curiosity. It is a searchable, categorized directory of AI tools built with SvelteKit, PostgreSQL, and Meilisearch, deployed on a K3s cluster. The process of building it taught me more about SEO than any blog post or course ever could.
This is the full story of how it was built, the technical decisions behind it, and the SEO lessons that came from watching real traffic data after launch.
Why Another Directory Site
The AI tool directory space is crowded. There Is An AI For That, Futurepedia, AI Scout, Toolify, and dozens of smaller sites already exist. Every week a new one launches. Most of them die within six months.
But the ones that survive do extremely well. Toolify has a Domain Rating above 70. There Is An AI For That charges $49 per featured listing and has thousands of tools listed. These sites print money because they sit at the intersection of two powerful forces: people actively searching for solutions, and companies willing to pay to be found.
The economics are simple. Build a site that ranks for "best AI tool for X" queries. Charge companies to be listed or featured. The operating costs are minimal because the content is largely submitted by the tool creators themselves. It is a flywheel: more tools listed means more pages indexed means more search traffic means more tools wanting to be listed.
I wanted to understand this flywheel from the inside and build a small one for myself.
Choosing the Tech Stack

I chose SvelteKit for the frontend and server-side rendering because SEO was the entire point of the project. Server-side rendering is not optional for a directory site. Search engines need to crawl fully rendered HTML pages. Client-side rendered React applications can be indexed by Google, but the process is slower, less reliable, and introduces unnecessary risk.
SvelteKit generates HTML on the server by default. Every tool page, category page, and search result page is a fully rendered HTML document before it reaches the browser. This means Googlebot sees the same content a human does, without waiting for JavaScript to execute.
PostgreSQL handles all the structured data. Each tool has a name, description, categories, pricing model, URL, and various metadata fields. The relational model makes it easy to query tools by category, filter by pricing type, and generate sitemap URLs.
Meilisearch is the search engine. I tried Postgres full-text search first and it worked, but the relevance ranking was mediocre. Searching for "image generation" would return results where "image" appeared in one field and "generation" appeared in another, ranked by frequency rather than relevance.
Meilisearch handles typo tolerance, synonym matching, and relevance ranking out of the box. The search experience went from "functional" to "actually good" in about an hour of integration work.
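To make the comparison concrete, here is a minimal sketch of the kind of index settings that produce that behavior. The index name ("tools"), the field names, and the synonym pairs are illustrative assumptions, not the production configuration; the ranking rules shown are Meilisearch's documented defaults.

```typescript
// Sketch of Meilisearch index settings for the tools index.
// Everything here is illustrative; the real config may differ.
interface ToolIndexSettings {
  searchableAttributes: string[];
  rankingRules: string[];
  synonyms: Record<string, string[]>;
}

function toolIndexSettings(): ToolIndexSettings {
  return {
    // Fields are matched in this order, so a hit in `name`
    // outranks the same hit in `longDescription`.
    searchableAttributes: ["name", "description", "longDescription", "categories"],
    // Meilisearch's default ranking rules: "words" before "typo"
    // means full multi-word matches beat typo-corrected ones,
    // which fixes the "image ... generation" split-field problem.
    rankingRules: ["words", "typo", "proximity", "attribute", "sort", "exactness"],
    // Example synonym groups so "image generation" also matches
    // listings that only say "text-to-image".
    synonyms: {
      "image generation": ["text-to-image"],
      "llm": ["large language model"],
    },
  };
}

// With the official JS client this would be applied roughly as:
//   await client.index("tools").updateSettings(toolIndexSettings());
```

The key difference from Postgres full-text search is that relevance ordering is declarative: you state the attribute priority once and every query respects it.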
The entire stack runs on the same K3s cluster that hosts my other projects. SvelteKit runs as a Node.js container, PostgreSQL runs in its own pod, and Meilisearch runs as a single-node deployment. Total resource usage is about 512 MB of RAM and minimal CPU.
Building the Directory
The core data model is straightforward. A Tool has a name, slug, description, long description, website URL, logo, pricing model (free, freemium, paid, open-source), and a many-to-many relationship with Categories. Categories have names, slugs, and descriptions.
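As a sketch, the model maps to TypeScript types like the ones below. Field names are assumptions mirroring the description above; the canonical schema lives in PostgreSQL, with the many-to-many relationship expressed as a join table.

```typescript
// TypeScript mirror of the data model described above.
// Field names are illustrative.
type PricingModel = "free" | "freemium" | "paid" | "open-source";

interface Category {
  name: string;
  slug: string;
  description: string;
}

interface Tool {
  name: string;
  slug: string;
  description: string;
  longDescription: string;
  websiteUrl: string;
  logoUrl: string;
  pricing: PricingModel;
  categories: Category[]; // many-to-many via a join table in Postgres
}

// Slugs are derived from names: lowercase, alphanumerics kept,
// everything else collapsed into single hyphens.
function slugify(name: string): string {
  return name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}
```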
Each tool gets its own page at /tool/[slug]. Each category gets a page at /category/[slug]. The homepage shows featured tools and popular categories. There is a search page that queries Meilisearch and displays results with infinite scroll.
I initially planned to launch with a submission form where tool creators could add their own listings. Instead, I seeded the database manually with about 500 tools.
The submission form came later. Here is why: an empty directory has a chicken-and-egg problem. No one wants to submit their tool to a site with 10 listings. But you cannot get listings without traffic, and you cannot get traffic without listings. Seeding the database with a critical mass of tools solves the cold start problem.
The seeding process was semi-automated. I wrote a script that pulled tool data from public sources, cleaned it up, categorized it, and inserted it into the database. Each listing was reviewed manually before going live. This took about two full days but gave the site enough content to be useful from day one.
SEO Architecture: What Actually Matters
Here is where the real lessons start. Building a technically correct website is only half the SEO battle. The other half is understanding what Google actually wants to see and structuring your site to provide it.
URL Structure
Every URL on ToolIndex follows a flat, descriptive structure. Tool pages are at /tool/tool-name. Category pages are at /category/category-name. There are no unnecessary path segments, no numeric IDs in URLs, and no query parameters for content pages.
This matters more than most people realize. Google uses URL structure as a signal for content hierarchy. A URL like /tool/midjourney tells Google this is a page about Midjourney. A URL like /tools/detail?id=4827 tells Google nothing.
Title Tags and Meta Descriptions
Every page has a unique, descriptive title tag and meta description. Tool pages follow the pattern: "[Tool Name] — [Short Description] | ToolIndex". Category pages use: "Best [Category Name] AI Tools in 2026 | ToolIndex".
I generate these programmatically from the database. The title tag includes the tool name and a brief description. The meta description expands on this with pricing information and key features. These are not afterthoughts. They are the first thing a searcher sees in Google results. A compelling meta description directly affects click-through rate, which affects rankings.
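The generation logic amounts to string templates over the database rows. A minimal sketch, with the patterns from above and input field names as assumptions:

```typescript
// Sketch of programmatic title/meta generation.
// Input shape is illustrative.
interface ToolMetaInput {
  name: string;
  shortDescription: string;
  pricing: string;
}

function toolMeta(tool: ToolMetaInput): { title: string; description: string } {
  return {
    // Pattern: "[Tool Name] — [Short Description] | ToolIndex"
    title: `${tool.name} — ${tool.shortDescription} | ToolIndex`,
    // Meta description expands on the title with pricing info.
    description: `${tool.name}: ${tool.shortDescription}. Pricing: ${tool.pricing}. Features and alternatives on ToolIndex.`,
  };
}

function categoryTitle(categoryName: string, year: number): string {
  // Pattern: "Best [Category Name] AI Tools in 2026 | ToolIndex"
  return `Best ${categoryName} AI Tools in ${year} | ToolIndex`;
}
```

Because both functions are pure, every page is guaranteed a unique title as long as names and slugs are unique in the database.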
Structured Data (Schema.org)
This is the single most impactful SEO optimization I implemented. Every tool page includes JSON-LD structured data using the SoftwareApplication schema type. This tells Google exactly what the page is about in a machine-readable format.
The structured data includes the application name, description, operating system (web), pricing model, aggregate rating (when available), and the official URL. When Google understands your content through structured data, it can display rich results: star ratings, pricing information, and other enhanced snippets that dramatically increase click-through rates.
Category pages use the CollectionPage schema with ItemList markup, telling Google these pages contain a curated list of related items.
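The markup for both page types can be sketched as plain object builders whose output is serialized into a `<script type="application/ld+json">` tag. The schema types come from schema.org; the input shapes and field names below are assumptions.

```typescript
// Sketch of the JSON-LD builders. Input shapes are illustrative.
interface JsonLdTool {
  name: string;
  description: string;
  url: string;
  price: string; // "0" for free tools
}

function softwareApplicationJsonLd(tool: JsonLdTool): object {
  return {
    "@context": "https://schema.org",
    "@type": "SoftwareApplication",
    name: tool.name,
    description: tool.description,
    operatingSystem: "Web",
    url: tool.url,
    // Google's rich results for software generally expect an offer.
    offers: { "@type": "Offer", price: tool.price, priceCurrency: "USD" },
  };
}

// ItemList markup for category pages: an ordered list of tool URLs.
function itemListJsonLd(urls: string[]): object {
  return {
    "@context": "https://schema.org",
    "@type": "ItemList",
    itemListElement: urls.map((url, i) => ({
      "@type": "ListItem",
      position: i + 1,
      url,
    })),
  };
}
```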
Implementing structured data took about three hours. The traffic impact was noticeable within two weeks. Pages with structured data consistently outperformed identical pages without it in search results.
Sitemap and Indexing
SvelteKit generates a dynamic XML sitemap at /sitemap.xml. It includes every tool page, every category page, and the core static pages. Each URL includes a lastmod date that updates when the tool listing is modified.
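The generation itself is simple string assembly. A minimal sketch of the core function, which in SvelteKit would be called from a `src/routes/sitemap.xml/+server.ts` endpoint that returns the string with a `Content-Type: application/xml` header (that routing detail is an assumption about the setup):

```typescript
// Sketch of dynamic sitemap generation from database rows.
interface SitemapEntry {
  loc: string;     // absolute URL of the page
  lastmod: string; // ISO date, e.g. from the row's updated_at column
}

function buildSitemap(entries: SitemapEntry[]): string {
  const urls = entries
    .map((e) => `  <url><loc>${e.loc}</loc><lastmod>${e.lastmod}</lastmod></url>`)
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    urls +
    `\n</urlset>`
  );
}
```

Because the entries are queried at request time, a new tool listing appears in the sitemap the moment it is inserted, with no rebuild step.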
I submitted the sitemap to Google Search Console immediately after launch. Then I used the URL Inspection tool to manually request indexing for the most important pages: the homepage, top category pages, and high-value tool pages. Google indexed the core pages within 48 hours and the full site within about two weeks.
One thing that surprised me: Google indexed category pages faster than individual tool pages. My theory is that category pages have more internal links pointing to them (every tool in the category links back to the category page) and Google interprets this as a signal of importance.
Internal Linking
Internal linking is the most underrated SEO technique for directory sites. Every tool page links to its categories. Every category page links to every tool in that category. The homepage links to popular categories and featured tools. Related tools are displayed on each tool page.
This creates a dense internal link graph where every page is reachable within two or three clicks from the homepage. Google can crawl the entire site efficiently, and link equity flows naturally from high-authority pages (homepage, popular categories) to individual tool pages.
I also added breadcrumb navigation on every page. Tool pages show: Home > Category > Tool Name. This serves both UX and SEO purposes. Google displays breadcrumbs in search results, and they help users understand where they are in the site hierarchy.
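The breadcrumbs Google displays come from BreadcrumbList structured data alongside the visible navigation. A sketch of the builder, with crumb names and URLs as illustrative examples:

```typescript
// Sketch of BreadcrumbList JSON-LD for the Home > Category > Tool trail.
interface Crumb {
  name: string;
  url: string;
}

function breadcrumbJsonLd(crumbs: Crumb[]): object {
  return {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement: crumbs.map((c, i) => ({
      "@type": "ListItem",
      position: i + 1, // schema.org positions are 1-based
      name: c.name,
      item: c.url,
    })),
  };
}
```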
Page Speed
ToolIndex loads in under one second on a 3G connection. This is not because I spent weeks optimizing performance. It is because the site does not have much to load. SvelteKit ships minimal JavaScript. Images are lazy-loaded and served through Cloudflare with automatic WebP conversion. There are no third-party scripts, no analytics heavy enough to block rendering, and no unnecessary client-side frameworks.
Google has been explicit about page speed being a ranking factor. For a directory site where pages are mostly text and images, there is no excuse for slow load times. If your directory takes more than two seconds to load, you are leaving rankings on the table.

What I Learned About SEO (The Hard Way)
Content Depth Beats Content Volume
My initial strategy was to list as many tools as possible. More pages means more chances to rank, right? Partially. But I noticed that tool pages with detailed descriptions (300+ words, feature breakdowns, pricing details) consistently outranked tool pages with minimal descriptions (50 words, basic overview).
Google can tell the difference between a useful page and a thin page. A directory listing that just says "Midjourney is an AI image generation tool" and links to the website is not providing value. A listing that explains what makes it different, who it is best for, how the pricing works, and what alternatives exist is actually helpful.
I went back and expanded the top 100 tool listings with detailed descriptions. Within a month, those pages saw a measurable increase in impressions and clicks.
Category Pages Are Your Money Pages
Individual tool pages rank for branded searches (people searching for a specific tool by name). These are useful but limited. The real traffic comes from category pages ranking for non-branded queries: "best AI image generators," "free AI writing tools," "AI video editing software."
Category pages aggregate the authority of every tool page that links to them. They target high-volume keywords. They are the pages people actually want when they are comparing options. I spent more time optimizing category page titles, descriptions, and layouts than any other page type, and it paid off proportionally.
Backlinks Still Matter More Than Anything
I can optimize every technical SEO element perfectly and still lose to a competitor with more backlinks. This is the uncomfortable truth about SEO in 2026. Technical SEO is table stakes. Content quality is expected. Backlinks are the differentiator.
For a directory site, there are a few natural backlink sources. Tool creators sometimes link to their listing on your directory from their own website. "As featured on" badges can generate links. Guest posts about AI tools on other blogs can include directory links. But acquiring backlinks is slow, manual work.
The directory sites with the highest Domain Ratings did not get there through technical SEO alone. They got there through consistent link building over years. This is the part of directory SEO that cannot be shortcut.
Google Discover Is Unpredictable But Powerful
About six weeks after launch, one of my category pages appeared in Google Discover. Traffic spiked by 10x for about three days, then returned to normal. I have no idea why that specific page was picked up. I could not replicate it.
Google Discover traffic is volatile and unreliable, but when it hits, it brings thousands of new visitors who would never have found your site through search. The lesson is not to optimize for Discover specifically, but to make sure your pages look good with proper Open Graph images and compelling titles, because Discover pulls from those.
Monetization Lessons
I launched ToolIndex with three monetization paths: featured listings (paid placement at the top of category pages), sponsored tool reviews (detailed write-ups for a fee), and affiliate links (commission on signups through the directory).
Featured listings generate the most predictable revenue. Tool creators pay a flat monthly fee for prominent placement. This works because the value proposition is clear: your tool appears first when someone browses the category.
Affiliate links generate passive income but require tools with affiliate programs. Not every AI tool offers one, which limits the approach.
Sponsored reviews generate the most revenue per unit but are the most time-intensive. Each review requires actually testing the tool and writing an honest assessment. Scaling this is difficult as a solo operator.
The honest truth is that directory monetization takes time. You need traffic before anyone will pay for placement. You need traffic before affiliate clicks add up to anything meaningful. The first three months were essentially unpaid work, building content and waiting for Google to index and rank pages.
Technical Gotchas
Meilisearch indexing needs to be triggered after database changes. I set up a post-save hook in the application that updates the Meilisearch index whenever a tool is created or modified. Without this, search results go stale.
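The hook boils down to flattening the relational row into the document shape the search index expects and pushing it. A sketch, with the field names as assumptions; with the official Meilisearch JS client the hook would end with something like `await client.index("tools").addDocuments([toSearchDocument(tool)])`.

```typescript
// Sketch of the post-save sync transform: Postgres row -> search document.
// Field names are illustrative.
interface ToolRow {
  id: number;
  name: string;
  slug: string;
  description: string;
  categories: { name: string }[];
}

interface SearchDocument {
  id: number; // Meilisearch uses this as the primary key, so re-saving overwrites
  name: string;
  slug: string;
  description: string;
  categories: string[]; // flattened from the join for searching/filtering
}

function toSearchDocument(tool: ToolRow): SearchDocument {
  return {
    id: tool.id,
    name: tool.name,
    slug: tool.slug,
    description: tool.description,
    categories: tool.categories.map((c) => c.name),
  };
}
```

Because documents are keyed by `id`, updates are idempotent: running the hook twice for the same tool just overwrites the same document.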
SvelteKit's SSR caching needs careful configuration. Category pages that change rarely can be cached aggressively. Tool pages that show dynamic data (like user ratings) need shorter cache durations. I use a 1-hour cache for category pages and a 15-minute cache for tool pages.
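Those two durations map directly onto Cache-Control headers, which SvelteKit lets you set via `setHeaders()` in each page's load function. A sketch; the `s-maxage` split (CDN caches the page, browsers revalidate) is an assumption about the Cloudflare-fronted setup:

```typescript
// Sketch of the per-page-type cache policy described above.
type PageType = "category" | "tool";

function cacheControl(page: PageType): string {
  // Category pages change rarely: cache 1 hour at the CDN.
  // Tool pages show more dynamic data: cache 15 minutes.
  const seconds = page === "category" ? 3600 : 900;
  // max-age=0 keeps browsers revalidating; s-maxage lets the CDN serve cached copies.
  return `public, max-age=0, s-maxage=${seconds}`;
}
```

In a SvelteKit load function this would be applied as `setHeaders({ "cache-control": cacheControl("category") })`.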
Image handling for tool logos is more work than expected. Tool creators submit logos in every format and size imaginable. I built a processing pipeline that resizes, converts to WebP, and stores in Cloudflare R2. Without this, the homepage would take 30 seconds to load.
What I Would Do Differently
I would start with more content depth from day one. Instead of listing 500 tools with short descriptions, I would list 200 tools with comprehensive descriptions. Depth beats breadth for SEO, especially in the first few months when Google is evaluating your site's quality.
I would build the submission form earlier and actively reach out to tool creators for listings. Every tool creator who submits their listing becomes a potential promoter of your directory.
I would invest in backlink acquisition from week one instead of waiting until the site was "ready." The site is never ready. Start building links while you are building pages.
And I would set realistic expectations. A directory site is not a quick win. It is a slow-burn project that compounds over time. The first three months feel like shouting into a void. Then one day you check Search Console and see real traffic, real clicks, and real revenue. The compounding effect of hundreds of indexed pages, each bringing a trickle of traffic, is powerful. But it requires patience.
ToolIndex is still small. It is not competing with the big directories yet. But it taught me more about SEO in three months than I learned in the previous three years. And the infrastructure cost is about $5/month because it shares a K3s cluster with my other projects.
If you are thinking about building a directory site, the barrier to entry is low. SvelteKit, a database, and a search engine are all you need technically. The real barrier is the patience to wait for SEO compounding to kick in. That part cannot be optimized.