Dive - AI Developer Tool

Overview

Dive is an open-source MCP Host desktop application that integrates with multiple large language models (LLMs) that support function calling. It provides universal LLM support (for example, ChatGPT, Anthropic, and Ollama) and runs on Windows, macOS, and Linux. The app uses the Model Context Protocol (MCP) to connect models to external tools, and it includes custom instructions, API key management, and an auto-update mechanism. The project is published as a GitHub repository for inspection and contribution.
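
As an MCP host, the application sits between the model and the tool servers: tools reported by MCP servers are surfaced to the LLM as function-call definitions, and the model's tool calls are routed back to the originating server. The TypeScript sketch below illustrates only that translation step; every type and function name in it is hypothetical and does not reflect Dive's actual internals.

    // Illustrative sketch of the MCP-host pattern: MCP tool listings become
    // function-call definitions for the LLM, and the model's tool calls are
    // dispatched back to the MCP server. All names here are hypothetical.

    // Minimal shape of a tool as reported by an MCP server.
    interface McpTool {
      name: string;
      description?: string;
      inputSchema: Record<string, unknown>; // JSON Schema for the tool's arguments
    }

    // Minimal shape of a tool definition as sent to a function-calling LLM API.
    interface LlmToolDefinition {
      name: string;
      description: string;
      parameters: Record<string, unknown>;
    }

    // Translate MCP tool listings into the definitions attached to each chat request.
    function toLlmTools(mcpTools: McpTool[]): LlmToolDefinition[] {
      return mcpTools.map((tool) => ({
        name: tool.name,
        description: tool.description ?? "",
        parameters: tool.inputSchema,
      }));
    }

    // When the model emits a tool call, the host executes it on the MCP server
    // and returns the result so it can be appended to the conversation.
    async function dispatchToolCall(
      call: { name: string; arguments: Record<string, unknown> },
      callMcpTool: (name: string, args: Record<string, unknown>) => Promise<string>,
    ): Promise<string> {
      return callMcpTool(call.name, call.arguments);
    }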

Key Features

  • Universal LLM support (ChatGPT, Anthropic, Ollama, and more)
  • Cross-platform desktop: Windows, macOS, Linux
  • Function-calling integration via Model Context Protocol (MCP)
  • Custom instructions management
  • API management for connected models (a combined configuration sketch follows this list)
  • Auto-update mechanism for the desktop application
  • Open-source repository for inspection and contributions
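
The custom-instructions, API-management, and MCP entries above typically live together in one local configuration. The TypeScript sketch below shows one plausible shape for such a configuration; the field names, provider list, and example values are assumptions for illustration, not Dive's actual settings schema.

    // Hypothetical local configuration combining model/API management, custom
    // instructions, and registered MCP servers. Field names and values are
    // illustrative assumptions, not Dive's actual settings schema.

    interface ModelProviderConfig {
      provider: "openai" | "anthropic" | "ollama";
      model: string;     // e.g. "gpt-4o" or a local Ollama model name
      apiKey?: string;   // omitted for local providers such as Ollama
      baseUrl?: string;  // for OpenAI-compatible or self-hosted endpoints
    }

    interface McpServerConfig {
      command: string;   // executable that starts the MCP server
      args: string[];    // arguments passed to the server process
      env?: Record<string, string>;
    }

    interface HostConfig {
      activeProvider: string;
      providers: Record<string, ModelProviderConfig>;
      customInstructions: string;  // prepended to every conversation
      mcpServers: Record<string, McpServerConfig>;
    }

    const exampleConfig: HostConfig = {
      activeProvider: "openai",
      providers: {
        openai: { provider: "openai", model: "gpt-4o", apiKey: "sk-..." },
        local: { provider: "ollama", model: "llama3", baseUrl: "http://localhost:11434" },
      },
      customInstructions: "Answer concisely and prefer citing exact file paths.",
      mcpServers: {
        filesystem: {
          command: "npx",
          args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
        },
      },
    };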

Ideal Use Cases

  • Experiment with multiple LLMs from a single desktop client
  • Build and test function-calling workflows
  • Manage model APIs and credentials locally
  • Prototype agent-driven desktop applications
  • Cross-platform validation of LLM integrations

Getting Started

  • Clone the project's GitHub repository.
  • Install dependencies for your operating system.
  • Configure LLM providers and API credentials in the app settings.
  • Add or connect MCP servers so the model can call external tools.
  • Launch the desktop application and verify connectivity (a standalone check is sketched after this list).
  • Enable auto-updates if desired.
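
Outside the app, a short script can confirm that an MCP server starts and reports its tools, which is a useful first connectivity check. The sketch below assumes the MCP TypeScript SDK (@modelcontextprotocol/sdk) and the reference filesystem server as the target; import paths and signatures can differ between SDK releases, so treat it as a starting point rather than Dive's own check.

    // Standalone connectivity check: start an MCP server over stdio and list its tools.
    // Assumes the MCP TypeScript SDK (@modelcontextprotocol/sdk); import paths and
    // signatures may vary between SDK releases.
    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    async function main(): Promise<void> {
      // Launch the reference filesystem server as a child process (placeholder choice).
      const transport = new StdioClientTransport({
        command: "npx",
        args: ["-y", "@modelcontextprotocol/server-filesystem", process.cwd()],
      });

      const client = new Client(
        { name: "dive-connectivity-check", version: "0.1.0" },
        { capabilities: {} },
      );
      await client.connect(transport);

      // A successful listTools() round trip confirms the server is reachable.
      const { tools } = await client.listTools();
      for (const tool of tools) {
        console.log(`${tool.name}: ${tool.description ?? "(no description)"}`);
      }

      await client.close();
    }

    main().catch((err) => {
      console.error("MCP connectivity check failed:", err);
      process.exit(1);
    });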

Pricing

No pricing information is disclosed in the project repository; the project is published as open-source.

Limitations

  • Full functionality requires an LLM that supports function calling and MCP-compatible tool servers.

Key Information

  • Category: Developer Tools
  • Type: AI Developer Tool