
Meilisearch: AI-powered hybrid search that feels instant

Inside the Rust engine bringing 50 ms answers, vectors, and relevance you can tune


The open-source meilisearch/meilisearch repository houses a lightning-fast search engine with a developer-friendly REST API, built in Rust and designed for "search-as-you-type" experiences that blend semantic vectors with classic full-text relevance. If you need delightful discovery in apps, e-commerce, docs, or internal tools, without a search team, this project aims to be your ready-to-run engine.

Why search still hurts for app teams

Great search is deceptively hard. You must parse many languages; handle typos, faceting, synonyms, filters, and fresh data; and keep response times under a human-perception threshold (~50 ms). In 2025, AI has raised the bar: users expect results that understand meaning, not just keywords. Many teams still juggle heavy infrastructure, complex query DSLs, and costly vector add-ons, or they postpone search improvements because of time and skill constraints.

How Meilisearch turns it into a product problem

Meilisearch packages fast indexing and instant retrieval with sensible defaults and a minimal API, so you can ship a credible search experience early and iterate. It embraces hybrid search, combining vector semantics with full-text features like typo tolerance, so you can capture intent without losing precision. The README.md highlights a sub-50 ms target, language support, geosearch, multi-tenancy, secure API keys, and more, while demos such as Movies, E-commerce, and SaaS illustrate real UX patterns.

What you can build today

The repository's documentation and code surface a compact but capable feature set you can wire into web or mobile front-ends in hours. Here are the capabilities most teams reach for first:

  • Hybrid search: Blend semantic vectors with full-text for relevance that "feels" human
  • Search-as-you-type: Instant feedback under ~50 ms drives higher engagement
  • Typo tolerance & synonyms: Users don't need perfect spelling to find the right items
  • Filtering, sorting, and faceting: Build commerce-grade refinement UIs with minimal backend work (README.md)
  • Geosearch: Filter and sort by location for maps, local inventory, and listings 
  • Security & multi-tenancy: API keys and tenant tokens gate access and personalize results 
  • AI-ready integrations: Works with LangChain and the Model Context Protocol
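To give a taste of the request surface, a single search call can combine several of the refinements above. This is a sketch: the index name, attributes, and key are placeholders, and filter/sort fields must first be declared as filterable and sortable in the index settings.

```shell
# Filtered, sorted, faceted search in one request
# (index name, attribute names, and <MASTER_KEY> are placeholders)
curl -X POST "http://localhost:7700/indexes/products/search" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer <MASTER_KEY>" \
     --data '{
       "q": "sneakers",
       "filter": "price < 100 AND brand = \"acme\"",
       "sort": ["price:asc"],
       "facets": ["brand", "color"],
       "limit": 20
     }'
```

The response includes matching hits plus facet counts, so a refinement sidebar needs no extra round trips.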

Under the hood: a Rust engine with pragmatic choices

The project is nearly pure Rust (GitHub "Languages: Rust 99.9%") and splits responsibilities into focused crates. The HTTP layer uses actix-web and actix-http for high-performance async I/O (crates/meilisearch/Cargo.toml). Storage rides on heed, Rust bindings to LMDB chosen for stability and predictable latency (crates/milli/Cargo.toml).

The core ranking and indexing engine lives in milli, which also integrates charabia for multilingual tokenization; specialized features (e.g., Chinese, Japanese, Hebrew) are toggled via feature flags visible in Cargo manifests (crates/milli/Cargo.toml and crates/meilisearch/Cargo.toml).

For AI and embeddings, milli includes dependencies like candle-core and tokenizers, enabling vectorization and hybrid scoring strategies without pulling an entire ML stack into your app (crates/milli/Cargo.toml). The server binary exposes OpenAPI via utoipa (with an optional Swagger UI through the swagger feature), and ships a small in-app dashboard behind a mini-dashboard feature (crates/meilisearch/Cargo.toml). Tracing hooks (tracing, tracing-subscriber) make profiling and production diagnostics first class.
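These extras are toggled at build time through Cargo features. The command below is a sketch of that pattern: the swagger and mini-dashboard feature names come from the paragraph above, but verify them against crates/meilisearch/Cargo.toml before relying on them.

```shell
# Build the server binary with optional extras enabled
# ("swagger" and "mini-dashboard" are the feature names cited above;
#  check crates/meilisearch/Cargo.toml for the authoritative list)
cargo build --release -p meilisearch --features "swagger mini-dashboard"
```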

The repository documents a careful approach to change management: a clear versioning policy (documentation/versioning-policy.md) and an "experimental features" process (documentation/experimental-features.md). The latter lets the team ship capabilities behind explicit flags, like --experimental-..., so users can test and give feedback without destabilizing the stable API.
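As one concrete illustration of that flag pattern (the specific flag name here is an example and may vary by version; the current set is listed in the experimental-features documentation):

```shell
# Opt into an experimental capability at launch; this flag is one example
# of the --experimental-* naming pattern — consult the docs for current flags
./meilisearch --experimental-enable-metrics
```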

A quick taste: from zero to relevant results

The basic workflow remains intentionally simple: start the server, create an index, add documents, and search. Here's a minimal HTTP sequence to show the feel of the API.

# Start Meilisearch (see README for platform-specific binaries)
# Then, from a terminal:

# 1) Create an index
curl -X POST "http://localhost:7700/indexes" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer <MASTER_KEY>" \
     --data '{"uid": "movies", "primaryKey": "id"}'

# 2) Add documents
curl -X POST "http://localhost:7700/indexes/movies/documents" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer <MASTER_KEY>" \
     --data '[{"id": 1, "title": "Interstellar", "genres": ["Sci-Fi"], "year": 2014},{"id": 2, "title": "Arrival", "genres": ["Sci-Fi"], "year": 2016}]'

# 3) Search-as-you-type
curl -X POST "http://localhost:7700/indexes/movies/search" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer <MASTER_KEY>" \
     --data '{"q": "arr", "limit": 3}'
  

Out of the box, you'll get typo tolerance, prefix matching, and fast defaults. From there, dial in synonyms, ranking rules, filters, and, when relevant, hybrid scoring with embeddings. The README.md links to production-ready demos you can adapt.
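Two of those refinements can be sketched as follow-up calls. The endpoints below follow Meilisearch's settings and search APIs; the embedder name "default" is a placeholder, and hybrid search requires an embedder to be configured first.

```shell
# 1) Register synonyms so near-miss vocabulary still matches
curl -X PUT "http://localhost:7700/indexes/movies/settings/synonyms" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer <MASTER_KEY>" \
     --data '{"sci-fi": ["science fiction"], "science fiction": ["sci-fi"]}'

# 2) Hybrid search: blend semantic and keyword scores
#    ("default" is a placeholder for a configured embedder name)
curl -X POST "http://localhost:7700/indexes/movies/search" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer <MASTER_KEY>" \
     --data '{"q": "space exploration drama", "hybrid": {"embedder": "default", "semanticRatio": 0.7}}'
```

The semanticRatio weights vector relevance against full-text relevance, so you can slide between pure keyword and pure semantic behavior per query.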

Community, contribution, and governance

The project maintains an active community (50k+ GitHub stars) and transparent processes. To contribute, start with CONTRIBUTING.md: open an issue, discuss scope, then submit a PR against main. Developers are encouraged to use the cargo workflow (with --release for performance), run tests (cargo test), and use snapshot testing via insta (CONTRIBUTING.md). Behavioral expectations follow CODE_OF_CONDUCT.md (Contributor Covenant). Feature proposals happen in the dedicated product repository's discussions, and real-time help lives in Discord (README.md). Release and prototype processes are documented under documentation/, making roadmap evolution visible and testable.
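A typical contributor loop, sketched from that workflow (see CONTRIBUTING.md for the authoritative steps):

```shell
# Clone, test, and run the engine locally
git clone https://github.com/meilisearch/meilisearch.git
cd meilisearch
cargo test                 # run the test suite
cargo insta review         # review snapshot-test changes (needs cargo-insta installed)
cargo run --release        # run the server with optimizations
```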

Usage rights and license

Meilisearch is MIT-licensed (LICENSE). In plain terms: you can use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, as long as you include the copyright and license notice. The software is provided "as is," without warranty. For commercial deployments that prefer managed ops, the team offers Meilisearch Cloud; for self-hosting, Docker and platform binaries are available (README.md).

Impact and what's next

Meilisearch's impact is twofold: it lowers the barrier to adding excellent search, and it brings AI-era relevance into pragmatic developer workflows. Recent releases continue the pace: v1.16 introduced multimodal embeddings and a new API to transfer data between instances (Cazanove, 2025) https://www.meilisearch.com/blog/meilisearch-1-16. The combination of Rust performance, a tight API surface, and clear docs positions Meilisearch in a sweet spot between heavyweight enterprise stacks and DIY search scripts. Expect further progress on hybrid search ergonomics, model flexibility, and operational tooling.

Wrap up: try it, tweak it, and ship

If you've been postponing search because it felt like a platform rewrite, this repository proves it doesn't have to be. Start with defaults, reach production in days, then refine with metrics and user feedback. Read the README.md, scan documentation/experimental-features.md and documentation/versioning-policy.md, and check CONTRIBUTING.md if you plan to extend the engine. When you're ready, explore demos, wire up the SDK for your favorite framework, and bring your search to life.


Joshua Berkowitz August 8, 2025