Edge AI Sizing Tool - AI Developer Tools

Overview

Edge AI Sizing Tool is an open-source utility for planning and sizing edge AI deployments on Intel-based edge hardware. It provides a zero-code web UI for selecting inputs, accelerators, performance modes, and models, and exposes real-time system telemetry (CPU, GPU, memory, and inference speed) so teams can evaluate throughput and resource utilization before committing to hardware. The tool supports predefined model catalogs, direct downloads from Hugging Face or ModelScope, and custom OpenVINO IR uploads for on-premise inference experimentation. ([github.com](https://github.com/open-edge-platform/edge-ai-sizing-tool))

Installation and operation are script-driven: the repository includes a root install script that configures Ubuntu (adds Intel repositories, installs Intel XPU Manager and DLStreamer, builds Intel PCM, and installs Node.js), plus start/stop scripts that launch the services and web UI on localhost:8080. Validated hardware and software targets (e.g., Intel Core Ultra processors, Intel Arc GPUs, 64 GB RAM, Ubuntu 24.04/22.04, Python 3.10+, Node.js 22+) are documented in the README. The project is distributed under Apache-2.0 and explicitly carries a “not for production” disclaimer for untrusted environments. ([raw.githubusercontent.com](https://raw.githubusercontent.com/open-edge-platform/edge-ai-sizing-tool/main/install.sh))
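The custom OpenVINO IR upload expects a model's `.xml` topology and `.bin` weights packaged together as a ZIP. As a rough sketch (file names here are placeholders, and the exact ZIP layout the tool expects should be verified against its documentation), the archive can be assembled like this:

```shell
# Placeholder IR files; substitute your own exported OpenVINO model.
mkdir -p my-model
printf '<net/>' > my-model/model.xml   # topology (placeholder content)
: > my-model/model.bin                 # weights (empty placeholder)

# Zip the pair for upload via the web UI.
# Python's zipfile CLI is used so the `zip` package is not required.
(cd my-model && python3 -m zipfile -c ../my-model.zip model.xml model.bin)
```

The resulting `my-model.zip` can then be selected in the web UI's custom-model upload flow.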

GitHub Statistics

  • Stars: 7
  • Forks: 3
  • Contributors: 3
  • License: Apache-2.0
  • Primary Language: TypeScript
  • Last Updated: 2026-01-20T05:26:17Z

The repository is hosted under the Open Edge Platform organization and shows a modest community: 7 stars, 3 forks, roughly 105 commits, and multiple active pull requests, indicating active maintenance by a small contributor base. The codebase includes install/start/stop scripts, a full frontend (package.json), documentation, and CONTRIBUTING/DEPLOYMENT guides, which lowers onboarding friction for engineers. Recent activity and open pull requests show that development continues, but with few active authors, community support is limited compared with larger OSS projects. ([github.com](https://github.com/open-edge-platform/edge-ai-sizing-tool))

Installation

Installation is script-driven; run the provided scripts from the repository root:

sudo ./install.sh  # installs OS packages, Intel XPU Manager, DLStreamer, Node.js, builds Intel PCM (Ubuntu 24.04/22.04).
./start.sh         # launches services and frontend (web UI available at http://localhost:8080).
./stop.sh          # stops services and background workers.
export HF_TOKEN='your_huggingface_token'  # (Linux) set Hugging Face token if using private models, or set frontend/.env accordingly.
cd frontend && npm run dev  # optional: run frontend in development mode (see frontend/package.json).
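After `./start.sh`, a quick smoke test can confirm the web UI answers on the documented port (8080 per the README; `curl` is assumed to be available):

```shell
# Probe the frontend and print the HTTP status code.
# A status of 200 means the UI is up; 000 means it is not reachable.
STATUS=$(curl -s -o /dev/null -w '%{http_code}' http://localhost:8080) || STATUS=000
echo "frontend HTTP status: $STATUS"
```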

Key Features

  • Zero-code web UI for composing inputs, accelerators, performance modes, and models.
  • Real-time telemetry: CPU, GPU, memory, and inference latency metrics for each workload.
  • Supports predefined models, Hugging Face/ModelScope IDs, and custom OpenVINO IR ZIP uploads.
  • Validated on Intel hardware (Core Ultra, Intel Arc GPUs) with Intel XPU Manager and DLStreamer.
  • Scripted installer configures Node.js, Intel repos, and build tools for Ubuntu 24.04/22.04.
  • Worker processes and PM2-based service orchestration with documented ports (frontend 8080, workers 5000–6000).
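Because services are orchestrated with PM2 and the ports are documented (frontend 8080, workers 5000–6000), a running deployment can be inspected roughly as follows (a sketch assuming `pm2` and `ss` are on PATH after install; both checks degrade gracefully if not):

```shell
# List PM2-managed processes (frontend and workers), if PM2 is installed.
if command -v pm2 >/dev/null 2>&1; then
  pm2 list
else
  echo "pm2 not found on PATH (run install.sh first)"
fi

# Count listening sockets on the documented ports (8080, 5000-6000).
PORTS=$(ss -ltn 2>/dev/null | grep -cE ':(8080|5[0-9]{3}|6000)\b' || true)
echo "documented ports listening: ${PORTS:-0}"
```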

Community

Small but active project within Intel’s Open Edge Platform organization. Repository shows 7 stars, 3 forks, ~105 commits and active pull requests; contributor count is limited, so community support and third-party integrations are modest. Contributors and users are encouraged to open issues or PRs via the repository’s CONTRIBUTING.md. ([github.com](https://github.com/open-edge-platform/edge-ai-sizing-tool))

Last Refreshed: 2026-03-03

Key Information

  • Category: Developer Tools
  • Type: AI Developer Tools