Bolt.new - AI Code Assistants Tool
Overview
Bolt.new is a browser-first “vibe coding” assistant and prompt-to-app platform that generates full-stack web projects from natural-language prompts. The editor provides a live preview, file tree, terminal, and an AI chat box that can run commands, install packages, and deploy apps, all inside the browser using WebContainers technology. Bolt targets rapid prototypes, MVPs, and hackathon builds: describe an idea (for example, a to-do app or a blog) and Bolt scaffolds the frontend, backend, authentication, and database wiring so you can preview and iterate immediately. ([techradar.com](https://www.techradar.com/pro/best-vibe-coding-tools?utm_source=openai))
Bolt emphasizes practical integrations and workflow controls rather than pure code generation: it supports Figma imports, database/auth providers (Supabase/Firebase), deploy targets (Netlify/Vercel), payments (Stripe), GitHub, and mobile packaging via Expo. The product separates Discussion/Plan modes (low-token brainstorming and planning) from Build mode (which makes code changes), includes token-management tooling and cleanup utilities, and offers project export (zip/Git) for local use or further development. Bolt has also partnered with Anthropic to surface Claude-based models inside the editor, and recent releases add a model selector and agent features. ([support.bolt.new](https://support.bolt.new/docs/discussion-mode?utm_source=openai))
Model Statistics
- Likes: 71
Model Details
Runtime & architecture: Bolt runs a Node.js development environment directly in the browser via WebContainers (WebAssembly plus service-worker-based containerization), giving a real filesystem, terminal, package manager, and live server preview without local installs. This lets Bolt run build tools (npm/yarn), start servers, and let the AI open and modify files and run commands in a reproducible sandbox. ([techradar.com](https://www.techradar.com/pro/best-vibe-coding-tools?utm_source=openai))
AI & workflows: Bolt exposes an AI agent inside the IDE with two main interaction styles: Discussion/Plan (context-aware, search-grounded planning that avoids immediate code changes) and Build (authoring and code edits). Discussion Mode is optimized for token efficiency and can use search grounding to pull up-to-date documentation; Build Mode applies generated edits directly to the project. Bolt’s release notes and blog also document a partnership with Anthropic (the Claude family) and a built-in “Claude Agent” option; users can toggle among Claude variants where available. The platform includes token-management features (daily/monthly limits, cleanup tools) and version history to roll back changes without consuming tokens. ([support.bolt.new](https://support.bolt.new/docs/discussion-mode?utm_source=openai))
Supported stacks & integrations: Bolt is JS-first: it commonly scaffolds React/Next/Vite frontends with Tailwind and a Node/Express or serverless-style backend, and integrates with Supabase/Firebase for auth/storage, Stripe for payments, Netlify/Vercel for hosting, GitHub for version control, and Expo for mobile packaging. Complex non-JS backends or enterprise governance features may still require conventional stacks. ([upgradewithsom.com](https://upgradewithsom.com/replit-alternatives/?utm_source=openai))
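Because Bolt exports are standard JavaScript projects, you can inspect an export's package.json to see which of these common stacks it scaffolded. A minimal sketch; the detect_stack helper and the framework list are illustrative, not part of Bolt:

```python
import json
from pathlib import Path

# Frameworks a Bolt export commonly scaffolds (JS-first stacks)
FRAMEWORKS = ['next', 'react', 'vite', 'express', 'tailwindcss']

def detect_stack(project_dir: Path) -> list:
    """Read a project's package.json and report which common
    frameworks appear among its (dev)dependencies."""
    pkg = json.loads((project_dir / 'package.json').read_text())
    deps = {**pkg.get('dependencies', {}), **pkg.get('devDependencies', {})}
    return [f for f in FRAMEWORKS if f in deps]
```

Running this against a typical Bolt export of a React/Vite/Tailwind app would report those three entries.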
Key Features
- Prompt-to-app builder: scaffold full frontend/backend projects from a single natural-language prompt.
- Browser runtime: Node.js, terminal and live server run in-browser via WebContainers.
- Discussion/Plan modes: plan and troubleshoot with far fewer tokens before applying code changes.
- Integrations: Figma import, Supabase/Firebase, Stripe, Netlify/Vercel, GitHub, and Expo support.
- Export & deploy: download project zip, push to GitHub, or deploy to Netlify/Vercel from the UI.
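The export path can also be driven locally once a project zip is downloaded: a hedged sketch of pushing an export to GitHub with plain git commands, assuming git is installed. push_export_to_github is a hypothetical helper and the remote URL is a placeholder, not a Bolt API:

```python
import subprocess
from pathlib import Path

def push_export_to_github(project_dir: Path, remote_url: str, dry_run: bool = False):
    """Initialize a git repo in a Bolt export and push it to a remote.

    With dry_run=True the command list is returned instead of executed,
    which is useful for reviewing the steps first.
    """
    commands = [
        ['git', 'init'],
        ['git', 'add', '-A'],
        ['git', 'commit', '-m', 'Import Bolt export'],
        ['git', 'remote', 'add', 'origin', remote_url],
        ['git', 'push', '-u', 'origin', 'main'],
    ]
    if not dry_run:
        for cmd in commands:
            subprocess.run(cmd, cwd=project_dir, check=True)
    return commands
```

Usage: push_export_to_github(Path('bolt_project'), 'git@github.com:you/your-app.git') after unzipping an export; the built-in GitHub integration does the equivalent from the Bolt UI.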
Example Usage
Example (python):
import subprocess
import zipfile
from pathlib import Path

# Example: run a Bolt-generated project downloaded as project.zip
zip_path = Path('project.zip')
extract_dir = Path('bolt_project')

# Unzip (assumes you already downloaded the Bolt export)
with zipfile.ZipFile(zip_path, 'r') as z:
    z.extractall(extract_dir)

# Locate the project root (the export may nest files in a top-level folder)
project_dir = next(p.parent for p in extract_dir.rglob('package.json'))

# Most Bolt projects are standard Node.js web apps: run the usual commands.
subprocess.run(['npm', 'install'], cwd=project_dir, check=True)

# Start the dev server ('npm run dev' or 'npm start' depending on the template)
subprocess.run(['npm', 'run', 'dev'], cwd=project_dir, check=True)

# To build and serve for production instead:
# subprocess.run(['npm', 'run', 'build'], cwd=project_dir, check=True)
# subprocess.run(['npm', 'run', 'start'], cwd=project_dir, check=True)
# References: Bolt suggests using npm install / npm run dev for local preview after export or zip download. ([reddit.com](https://www.reddit.com/r/boltnewbuilders/comments/1jh3mb5?utm_source=openai))
Benchmarks
- Hugging Face Space likes (demo by friuns): 71 (Source: [huggingface.co](https://huggingface.co/spaces/friuns/bolt.new?utm_source=openai))
- Free plan token allowance: 1,000,000 tokens monthly, with a 300,000/day limit (Source: [support.bolt.new](https://support.bolt.new/building/quickstart?utm_source=openai))
- Discussion Mode token reduction: Discussion/Plan modes use ~90% fewer tokens than Build Mode (Source: [support.bolt.new](https://support.bolt.new/docs/discussion-mode?utm_source=openai))
- Default LLM/agent (as of recent releases): Claude Agent (Anthropic), set as the default LLM option in release notes (Source: [support.bolt.new](https://support.bolt.new/release-notes?utm_source=openai))
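The free-plan figures above imply a rough prompt budget. A back-of-the-envelope sketch; the 20,000-token average Build prompt is a hypothetical assumption for illustration, not a Bolt figure:

```python
import math

MONTHLY_TOKENS = 1_000_000   # free-plan monthly allowance (support.bolt.new)
DAILY_CAP = 300_000          # free-plan daily limit
DISCUSSION_SAVINGS = 0.90    # Discussion Mode uses ~90% fewer tokens than Build

def effective_prompts(per_build_prompt: int = 20_000):
    """How many Build vs. Discussion prompts the monthly allowance covers.
    per_build_prompt is a hypothetical average cost of one Build prompt."""
    build = MONTHLY_TOKENS // per_build_prompt
    discussion = MONTHLY_TOKENS // round(per_build_prompt * (1 - DISCUSSION_SAVINGS))
    return build, discussion

def min_days_to_spend():
    """Fewest days needed to exhaust the monthly allowance under the daily cap."""
    return math.ceil(MONTHLY_TOKENS / DAILY_CAP)
```

Under these assumptions the monthly allowance covers about 50 Build prompts or roughly 500 Discussion prompts, and the 300k/day cap means the monthly pool cannot be spent in fewer than 4 days.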
Key Information
- Category: Code Assistants
- Type: AI Code Assistants Tool