Hugging Face Spaces - AI Model Hubs Tool

Overview

Hugging Face Spaces is a hosted platform for publishing interactive machine-learning demo apps and web UIs (Gradio, Streamlit, and static apps) directly alongside models and datasets on the Hugging Face Hub. Spaces provides a git-based developer workflow: each Space is a repository you push code to (app files, requirements, Dockerfile), which is automatically built and served.

Because Spaces is integrated with the Hub, apps can load models and datasets directly with minimal configuration and be discovered by the community through search, tags, and model pages. Spaces supports custom Python environments (requirements.txt), custom Docker containers (Dockerfile), and hardware acceleration on supported instances, so you can run GPU-backed demos.

Spaces also provides features needed for production-style demos: repository-level privacy settings (public or private Spaces), environment variables and secrets, logs/console output, and the ability to expose REST endpoints for programmatic inference. The platform is designed for rapid prototyping and sharing of model demos, benchmarks, microservices, and interactive visualizations, with an emphasis on a simple git or huggingface_hub API workflow. For details and usage patterns, refer to the official docs (see sources).

Key Features

  • Host Gradio, Streamlit, and static web apps directly on the Hugging Face Hub
  • Custom Python environments via requirements.txt for reproducible runtime dependencies
  • Support for Dockerfiles to run fully custom containers in Spaces
  • GPU-accelerated instances available for compute-heavy demos and model inference
  • Git-based workflow and huggingface_hub API for CI/CD and programmatic deployment
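A typical Space repo carries two small configuration files alongside the app code. The sketch below shows the general shape; the specific values (title, pinned dependency) are illustrative assumptions, and the exact set of supported front-matter keys is documented in the Spaces configuration reference:

```
# README.md — the YAML front matter tells Spaces how to build and serve the app
---
title: My Demo Space
sdk: gradio
app_file: app.py
---

# requirements.txt — extra Python dependencies installed at build time
gradio
```

When the Space is created with `space_sdk="gradio"` (as in the example below), the front matter is generated for you and only needs editing if you change the SDK or entry point.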

Example Usage

Example (python):

from huggingface_hub import login, create_repo, upload_folder
import os

# 1) Authenticate with a Hugging Face access token that has write access
login(token=os.environ.get("HF_TOKEN"))

# 2) Create a new Space repository (repo_type='space')
repo_id = "your-username/my-demo-space"
create_repo(repo_id=repo_id, repo_type="space", exist_ok=True, space_sdk="gradio")

# 3) Prepare files locally: app.py (Gradio app) and requirements.txt
# Example: app.py
# import gradio as gr
# def greet(name):
#     return f"Hello {name}!"
# demo = gr.Interface(fn=greet, inputs="text", outputs="text")
# demo.launch()  # on Spaces, the Gradio runtime supplies host/port settings

# 4) Upload the folder contents to the Space repo
local_folder = "./my_demo_space_files"  # contains app.py, requirements.txt, README.md
upload_folder(folder_path=local_folder, path_in_repo=".", repo_id=repo_id, repo_type="space")

print(f"Space created and files uploaded: https://huggingface.co/spaces/{repo_id}")
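Once built, a Space is also reachable at a direct `*.hf.space` URL, which is what you embed in pages or hit programmatically. A minimal sketch of deriving that URL, assuming the common `owner-name.hf.space` pattern (the platform may normalize unusual characters differently):

```python
def space_direct_url(repo_id: str) -> str:
    """Derive the direct *.hf.space URL for a Space.

    Assumes 'owner/name' maps to https://owner-name.hf.space,
    lowercased, with underscores and dots folded to hyphens.
    This is a heuristic sketch, not an official API.
    """
    owner, name = repo_id.split("/", 1)
    subdomain = f"{owner}-{name}".lower().replace("_", "-").replace(".", "-")
    return f"https://{subdomain}.hf.space"

print(space_direct_url("your-username/my-demo-space"))
```

For production use, prefer resolving the URL through the Hub API or the Space's settings page rather than relying on this naming convention.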

Last Refreshed: 2026-01-09

Key Information

  • Category: Model Hubs
  • Type: AI Model Hubs Tool