Edge AI Sizing Tool - AI Developer Tool

Overview

Edge AI Sizing Tool is an open-source utility for sizing, testing, and planning AI deployments on Intel-based edge platforms. It provides a web UI and worker services that let engineers compose workloads with zero-code configuration: pick inputs (video, images, text), choose accelerators and performance modes, and select from a library of pretrained models or supply custom OpenVINO models. The tool runs locally (single-machine, trusted environment) and is intended for evaluation and capacity planning rather than production deployments. ([github.com](https://github.com/open-edge-platform/edge-ai-sizing-tool))

The application provides real-time system telemetry (CPU/GPU utilization, memory, inference throughput and latency) so teams can compare run-time characteristics across models and hardware profiles. It supports pulling models from Hugging Face (via HF_TOKEN), uploading OpenVINO IR model ZIPs, and running predefined models such as YOLOv8, Stable Diffusion, Whisper, and TinyLlama variants. The project includes scripts to install dependencies and start/stop services, and documentation references Docker Compose-based deployment options for isolated evaluation. Intel’s Open Edge Platform / Industry Solution Builders list the tool as part of their evaluation kit for edge AI sizing. ([github.com](https://github.com/open-edge-platform/edge-ai-sizing-tool))

GitHub Statistics

  • Stars: 7
  • Forks: 3
  • Contributors: 3
  • License: Apache-2.0
  • Primary Language: TypeScript
  • Last Updated: 2026-01-07T07:01:42Z

Repository activity and community are currently modest. The GitHub repository is Apache-2.0 licensed and shows a small but recent commit history (100+ commits), 7 stars, and 3 forks on the main project page. There are few open issues and pull requests and a small set of contributors, indicating the project is maintained but has a limited external contributor base. Users should expect to rely on the provided documentation and raise issues for feature requests or bugs. ([github.com](https://github.com/open-edge-platform/edge-ai-sizing-tool))

Installation

Install from source using the provided scripts (Linux):

git clone https://github.com/open-edge-platform/edge-ai-sizing-tool.git
cd edge-ai-sizing-tool
sudo ./install.sh    # installs dependencies, Python and Node.js environments (Linux)
export HF_TOKEN='your_huggingface_token'    # (optional) set before starting for private Hugging Face models
./start.sh           # start the services and open http://localhost:8080
./stop.sh            # stop the application gracefully
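After `./start.sh`, the web UI is served at http://localhost:8080. If you script against the tool, a generic readiness check like the sketch below can confirm the UI is up before proceeding. This helper is not part of the tool itself; the URL and the default port are taken from the start script's output, and the timeouts are assumptions.

```python
import time
import urllib.error
import urllib.request


def wait_for_url(url: str, timeout_s: float = 60.0, interval_s: float = 2.0) -> bool:
    """Poll `url` until the server answers or `timeout_s` elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True  # got a 2xx/3xx response: the UI is serving
        except urllib.error.HTTPError:
            return True  # got an HTTP error status back, but the server is listening
        except urllib.error.URLError:
            time.sleep(interval_s)  # not listening yet; retry after a short pause
    return False
```

Usage: `wait_for_url("http://localhost:8080", timeout_s=120)` after launching `./start.sh`, adjusting the port if you changed the default.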

Key Features

  • Zero-code workload builder: select inputs, accelerator, performance mode, and model.
  • Real-time telemetry: CPU/GPU usage, memory, inference latency, and throughput.
  • Predefined model library: YOLOv8/YOLOv11, Stable Diffusion, Whisper, TinyLlama examples.
  • Hugging Face integration: pull models by repo ID using HF_TOKEN.
  • Custom OpenVINO support: upload model ZIPs or use ./custom_models directory.
  • Local, single-machine eval: designed for trusted environments and capacity planning.
  • Start/stop scripts and Docker Compose deployment guidance for isolated testing.
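For the custom OpenVINO path, the tool accepts model ZIP uploads. The sketch below is a minimal packaging helper under the assumption that the ZIP simply bundles the IR pair (`model.xml` topology plus `model.bin` weights) at the archive root; check the project documentation for the exact layout the uploader expects.

```python
import zipfile
from pathlib import Path


def package_openvino_ir(xml_path: str, bin_path: str, zip_path: str) -> str:
    """Bundle an OpenVINO IR pair (.xml topology + .bin weights) into a flat ZIP."""
    xml_file, bin_file = Path(xml_path), Path(bin_path)
    for part in (xml_file, bin_file):
        if not part.is_file():
            raise FileNotFoundError(f"missing IR component: {part}")
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        # Store both files at the archive root so the uploader finds them directly.
        zf.write(xml_file, arcname=xml_file.name)
        zf.write(bin_file, arcname=bin_file.name)
    return zip_path
```

The same files can instead be dropped into the repository's `./custom_models` directory, per the feature list above.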

Community

Small, focused community with limited open activity: the repository shows 7 stars, 3 forks, ~103 commits, and a few contributors. Official documentation and Intel Open Edge Platform pages provide setup guidance; issues and feature requests should be filed on the repository for the fastest response from maintainers. Activity appears sufficient to support evaluation and experimentation, but expect few third-party plugins or extensions at present. ([github.com](https://github.com/open-edge-platform/edge-ai-sizing-tool))

Last Refreshed: 2026-01-09

Key Information

  • Category: Developer Tools
  • Type: AI Developer Tool