Open Lovable: Chat your way to a working React app

Open Lovable is a small but punchy example app from the Firecrawl team that lets you chat with an AI to build a modern React app in minutes. It pairs the Firecrawl scraping API with an isolated code-execution sandbox so the assistant can read real websites, plan a component structure, and generate a working Next.js UI with Tailwind CSS.
The problem and the solution
Most of us copy UI patterns from the web and hand-wire them into React. That is slow, error-prone, and forgettable the next time we need the same pattern. Open Lovable shows a better loop: scrape a target site with the Firecrawl API, analyze the structure, and let an AI assistant assemble a React app you can immediately run and refine. The app is built on Next.js 15 and React 19, with a clear setup in README.md and permissive terms in LICENSE. For the scraping side, see Firecrawl (Firecrawl, 2025). For secure code execution and tooling, it uses E2B sandboxes (E2B, 2025).
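To make the first step of that loop concrete, here is a minimal sketch of the scraping side in TypeScript. It assumes Firecrawl's hosted v1 REST scrape endpoint and a FIRECRAWL_API_KEY in the environment; the function name and response typing are illustrative, not code lifted from this repo, which routes scraping through its own API handlers.

```ts
// scrape.ts - fetch a page as LLM-ready markdown via Firecrawl (illustrative sketch).
// Assumes the public v1 REST scrape endpoint; the repo wires this up through its own routes.

interface ScrapeResult {
  success: boolean;
  data?: { markdown?: string; metadata?: Record<string, unknown> };
}

export async function scrapeAsMarkdown(url: string): Promise<string> {
  const res = await fetch("https://api.firecrawl.dev/v1/scrape", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`,
    },
    body: JSON.stringify({ url, formats: ["markdown"] }),
  });

  if (!res.ok) throw new Error(`Firecrawl scrape failed: ${res.status}`);
  const json = (await res.json()) as ScrapeResult;
  return json.data?.markdown ?? "";
}
```

Markdown-style output like this is what gives the assistant real page structure to reason over before it generates any components.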
Key features
- AI-guided React app generation: The main chat flow orchestrated in app/page.tsx turns site input and user prompts into components and pages. Tailwind is configured via tailwind.config.ts and app/globals.css.
- Scrape to structure via Firecrawl: The app expects `FIRECRAWL_API_KEY` to fetch and parse target websites into LLM-ready content, speeding up UI reconstruction (Firecrawl, 2025).
- Safe code execution with E2B: With `@e2b/code-interpreter`, generations can be executed and validated in an isolated sandbox, improving reliability without compromising security (E2B, 2025).
- Multiple AI providers through the AI SDK: The repo includes adapters for Anthropic, OpenAI, Google Gemini, and Groq through the `ai` and `@ai-sdk/*` packages; pick one by setting the corresponding API key (see package.json dependencies). A minimal selection sketch follows this list.
- Typed context and file manifests: Types like types/conversation.ts and types/file-manifest.ts keep the assistant state and generated files coherent across steps.
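As a rough illustration of how "pick one provider" can work with the AI SDK, the sketch below chooses a model based on whichever API key is present in the environment. The helper name and model IDs are hypothetical choices for this example, not the repo's defaults.

```ts
// pick-model.ts - choose an AI SDK model from whichever provider key is configured (illustrative sketch).
import { generateText, type LanguageModel } from "ai";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { groq } from "@ai-sdk/groq";

// Hypothetical helper: the model IDs here are examples, not what the repo ships with.
export function pickModel(): LanguageModel {
  if (process.env.ANTHROPIC_API_KEY) return anthropic("claude-3-5-sonnet-latest");
  if (process.env.OPENAI_API_KEY) return openai("gpt-4o-mini");
  if (process.env.GEMINI_API_KEY) {
    const google = createGoogleGenerativeAI({ apiKey: process.env.GEMINI_API_KEY });
    return google("gemini-1.5-flash");
  }
  if (process.env.GROQ_API_KEY) return groq("llama-3.1-8b-instant");
  throw new Error("Set at least one provider API key in .env.local");
}

export async function draftComponent(prompt: string): Promise<string> {
  const { text } = await generateText({
    model: pickModel(),
    system: "You generate React components styled with Tailwind CSS.",
    prompt,
  });
  return text;
}
```

Because every provider sits behind the same `LanguageModel` interface, swapping vendors is a one-line change rather than a rewrite of the generation flow.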
Why I like it
Three things stand out. First, the repository is practical: it focuses on the thin end-to-end slice that matters and avoids sprawling boilerplate.
Second, the team documented tricky parts like streaming and tool-calling with concise notes in docs/STREAMING_FIXES_SUMMARY.md and docs/TOOL_CALL_FIX_SUMMARY.md.
Third, the UI has helpful diagnostics such as components/HMRErrorDetector.tsx and progress affordances in components/CodeApplicationProgress.tsx, which make the assistant feel responsive rather than opaque.
Local quickstart

```bash
# 1) Clone and install
git clone https://github.com/firecrawl/open-lovable.git
cd open-lovable
npm install
```

```bash
# 2) Create .env.local with at least Firecrawl + one model provider
E2B_API_KEY=your_e2b_api_key
FIRECRAWL_API_KEY=your_firecrawl_api_key
# choose one provider
OPENAI_API_KEY=your_openai_api_key
# or
ANTHROPIC_API_KEY=your_anthropic_api_key
# or
GEMINI_API_KEY=your_gemini_api_key
# or
GROQ_API_KEY=your_groq_api_key
```

```bash
# 3) Run the app
npm run dev
# then open http://localhost:3000
```
Under the hood
This is a Next.js 15 app directory project with React 19. The core UI route is app/page.tsx, wired by app/layout.tsx and global styles in app/globals.css. UI primitives and helpers live in components/, including a sandbox preview (components/SandboxPreview.tsx).
The assistant's planning, search, and file operations are organized in lib/, for example lib/edit-intent-analyzer.ts, lib/file-parser.ts, and lib/file-search-executor.ts.
Styling is Tailwind-first via tailwind.config.ts and postcss.config.mjs (Vercel, 2025). The codebase is TypeScript, linted with eslint.config.mjs.
The project intentionally keeps server complexity low and delegates heavy lifting to best-in-class services: Firecrawl for scraping, E2B for sandboxed execution, and model providers through the AI SDK's connectors. The docs/ folder includes useful notes like PACKAGE_DETECTION_GUIDE.md and a UI flow demo in UI_FEEDBACK_DEMO.md.
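To show what delegating execution can look like, here is a hedged sketch of running generated code inside an E2B sandbox with `@e2b/code-interpreter`. Method names follow E2B's published SDK; the function itself is an illustration, not the repo's actual sandbox wiring.

```ts
// run-in-sandbox.ts - execute generated code in an isolated E2B sandbox (illustrative sketch).
// Assumes E2B_API_KEY is set in the environment.
import { Sandbox } from "@e2b/code-interpreter";

export async function runGeneratedCode(code: string): Promise<string> {
  // Each sandbox is an isolated cloud VM, so buggy or untrusted generations
  // never touch the machine running the Next.js app.
  const sandbox = await Sandbox.create();
  try {
    const execution = await sandbox.runCode(code);
    if (execution.error) {
      // Surface errors back to the assistant so it can repair its own generation.
      return `Error: ${execution.error.name}: ${execution.error.value}`;
    }
    return execution.logs.stdout.join("");
  } finally {
    await sandbox.kill();
  }
}
```

Feeding the sandbox's stdout and errors back into the chat loop is what lets the assistant validate, and if necessary retry, its own output before showing it to the user.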
Use cases
Rapid prototyping is the sweet spot: point the assistant at a marketing site or documentation page and get a React rendition you can tweak and deploy. Teams can also use it to bootstrap design systems from inspiration sites, migrate legacy HTML to a typed component library, or augment internal tooling with quickly assembled dashboards. Because Firecrawl returns clean, structured content, you can blend scraped context into chat-driven builders or doc summarizers beyond UI cloning.
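As a quick composition of the earlier sketches (both hypothetical helpers, not repo code), scraped markdown can be dropped straight into a prompt for a non-UI task like summarization:

```ts
// summarize-page.ts - reuse scraped context for a doc summarizer (illustrative sketch
// composing the hypothetical scrapeAsMarkdown and pickModel helpers shown above).
import { generateText } from "ai";
import { scrapeAsMarkdown } from "./scrape";
import { pickModel } from "./pick-model";

export async function summarizePage(url: string): Promise<string> {
  const markdown = await scrapeAsMarkdown(url);
  const { text } = await generateText({
    model: pickModel(),
    prompt: `Summarize the following page in five bullet points:\n\n${markdown}`,
  });
  return text;
}
```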
Community and contribution
The repository is MIT-licensed and active: it has thousands of stars and an engaged issues list with real-world requests such as Docker support, OpenRouter API key support, local model providers like Ollama, and lockfile hygiene. The mix of bug reports and feature ideas suggests healthy usage and a straightforward path for contributions via pull requests.
Usage and license terms
The project uses the MIT License. In short: you can use, copy, modify, merge, publish, distribute, sublicense, and sell copies of the software, provided you include the copyright and license notice. The software is provided as-is, without warranty. See LICENSE for details.
About the company
Firecrawl builds an open-source web scraping and crawling stack that turns websites into LLM-ready data via APIs and SDKs. The platform emphasizes speed, resilience on JS-heavy pages, and integrations across popular AI tooling. It also ships an MCP server and examples for editors and agents.
The Open Lovable app demonstrates how Firecrawl's data layer unlocks AI-native builders. For a hosted, end-to-end app-building experience, see Lovable.dev, which offers cloud features like persistence, collaboration, and deployment.
Impact and what's next
By compressing the distance from website inspiration to working React code, Open Lovable lowers the activation energy for shipping. Expect the community to harden deployment paths (for example, Docker and docker-compose), expand provider support (OpenRouter and local models), and refine reliability across streaming and tool-calling flows. Because the core is a plain Next.js app, it slots into existing CI and hosting choices, while Firecrawl and E2B handle the specialized parts.
Conclusion
If you are curious about AI-assisted UI generation that you can actually run locally, this repo is a great sandbox. Start with README.md, set your .env.local, and try cloning a page you like. Then peel back app/page.tsx and the lib/ helpers to see how the pipeline fits together. When you outgrow local, explore Lovable.dev for the hosted path.