Projects

Explore our curated collection of open source AI projects that you can self-host.

Browse our collection of high-quality open source AI projects across various categories. Each project includes detailed installation instructions, system requirements, and practical exercises to help you get started.

Use the category filters below to find projects that match your interests or search for specific capabilities.

All
GPU Marketplace
Web Development
Agentic Frameworks
Search
Speech Recognition
Face Recognition
Image Generation
Local LLM Framework
AI Orchestration
Observability
Vector Databases
Low Code Automation
AI
Computer Vision
Deep Learning
Language Models
LLM Inference
AI Infrastructure
Self-Hosting
Development
Automation
AI Development

GPU Marketplace

☁️

RunPod

A cloud computing platform designed specifically for AI workloads, offering GPU instances, serverless GPUs, and AI endpoints.
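
If you use RunPod's serverless GPUs, work is handled by a small Python worker. Below is a minimal sketch, assuming the `runpod` SDK is installed; the handler name and the "prompt" input field are illustrative, not part of any fixed schema.

```python
# Minimal sketch of a RunPod serverless worker (assumes `pip install runpod`).
import runpod


def handler(job):
    # `job["input"]` carries the JSON payload sent to the endpoint;
    # the "prompt" field here is a hypothetical example.
    prompt = job["input"].get("prompt", "")
    return {"echo": prompt}


# Start the worker loop that receives jobs from the RunPod endpoint.
runpod.serverless.start({"handler": handler})
```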

Web Development

🎛️

Gradio

A Python library for quickly creating customizable web interfaces for machine learning models and data workflows.
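
To get a feel for it, here is a minimal Gradio app that wraps a plain Python function in a web UI; the function and widget choices are just placeholders.

```python
import gradio as gr


def greet(name: str) -> str:
    # Any Python function can back the interface, e.g. a model's predict().
    return f"Hello, {name}!"


# Map the function's input and output to simple text widgets and serve locally.
demo = gr.Interface(fn=greet, inputs="text", outputs="text")
demo.launch()  # defaults to http://127.0.0.1:7860
```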

Streamlit

An open-source Python library that transforms data scripts into shareable web applications in minutes, with no front-end experience required.
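
A comparable Streamlit script is only a few lines; save it as `app.py` (an illustrative name) and run it with `streamlit run app.py`.

```python
import streamlit as st

# Streamlit re-runs this script on every interaction and renders the calls below.
st.title("Hello")
name = st.text_input("Your name")
if name:
    st.write(f"Hello, {name}!")
```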

Agentic Frameworks

Pydantic AI

Pydantic AI is a Python framework for building type-safe, production-grade AI agents with seamless integration of generative models and robust data validation.
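
A minimal sketch of a typed agent is shown below; the model string and prompt are illustrative, and the exact keyword and result attribute names vary slightly between pydantic-ai releases.

```python
from pydantic import BaseModel
from pydantic_ai import Agent


class CityInfo(BaseModel):
    city: str
    country: str


# The agent validates the model's reply against CityInfo.
# (Older releases use result_type= / result.data instead of output_type= / result.output.)
agent = Agent("openai:gpt-4o", output_type=CityInfo)
result = agent.run_sync("Where were the 2012 Summer Olympics held?")
print(result.output)
```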

🤖

CrewAI

CrewAI is an open-source framework for orchestrating collaborative AI agents with defined roles, tools, and workflows to tackle complex tasks efficiently.
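
A minimal single-agent crew looks roughly like the sketch below; the role, goal, and task text are illustrative, and an LLM backend (an API key or a local model) is assumed to be configured separately.

```python
from crewai import Agent, Crew, Task

researcher = Agent(
    role="Researcher",
    goal="Find three open source AI projects worth self-hosting",
    backstory="You track the open source AI ecosystem closely.",
)

summary_task = Task(
    description="List three projects and one sentence on why each matters.",
    expected_output="A short bulleted list.",
    agent=researcher,
)

# kickoff() runs the tasks in order and returns the final result.
crew = Crew(agents=[researcher], tasks=[summary_task])
print(crew.kickoff())
```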

Search

SearXNG

A privacy-respecting, hackable metasearch engine that aggregates results from various search services without tracking or profiling its users.
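
Once an instance is running you can query it programmatically; the sketch below assumes a local instance on port 8080 with the JSON output format enabled in its settings.

```python
import requests

SEARXNG_URL = "http://localhost:8080/search"  # hypothetical local instance

resp = requests.get(
    SEARXNG_URL,
    params={"q": "self-hosted vector databases", "format": "json"},
    timeout=10,
)
resp.raise_for_status()

# Print the top results returned by the aggregated engines.
for hit in resp.json().get("results", [])[:5]:
    print(hit["title"], "-", hit["url"])
```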

Speech Recognition

Face Recognition

InsightFace

Self-host InsightFace, an open-source toolkit for 2D and 3D face analysis, including detection, recognition, and alignment.
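
A minimal detection-and-embedding sketch with the `insightface` package is shown below; the image path is illustrative and the default model pack downloads on first use.

```python
import cv2
from insightface.app import FaceAnalysis

app = FaceAnalysis()   # loads the default detection + recognition models
app.prepare(ctx_id=0)  # ctx_id=0 uses the first GPU; use -1 for CPU

img = cv2.imread("people.jpg")  # hypothetical input image (BGR array)
for face in app.get(img):
    # Each detected face carries a bounding box and a normalised embedding.
    print(face.bbox, face.normed_embedding.shape)
```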

Image Generation

InvokeAI

Self-host InvokeAI, a creative engine and web UI for generating and editing images with Stable Diffusion models.

Stable Diffusion

Self-host Stable Diffusion, a latent text-to-image diffusion model that generates images from text prompts.
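
Beyond the dedicated UIs listed here, one common programmatic route is Hugging Face's `diffusers` library; the checkpoint name and prompt below are illustrative.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint name
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # a GPU is strongly recommended

image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
image.save("lighthouse.png")
```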

Stable Diffusion WebUI

Self-host Stable Diffusion WebUI, a feature-rich browser interface for running Stable Diffusion image generation locally.

Local LLM Framework

📱

Jan

Self-host Jan, an open-source desktop application for chatting with large language models offline on your own hardware.

🧠

Ollama

Self-host Ollama, a lightweight tool for downloading and running large language models locally through a simple CLI and REST API.
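
Once Ollama is running, its REST API listens on port 11434 by default; the sketch below assumes you have already pulled the model (for example with `ollama pull llama3`).

```python
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Why self-host LLMs?", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the full generated completion
```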

privateGPT

Self-host privateGPT to ask questions about your own documents with local LLMs, keeping all data inside your environment.

AI Orchestration

LangChain

Self-host LangChain, a framework for building LLM-powered applications from composable chains, tools, and integrations.
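
A minimal chain backed by a local Ollama model might look like this; it assumes the `langchain-ollama` integration package is installed and the model has already been pulled.

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama

prompt = ChatPromptTemplate.from_template("Give one reason to self-host {tool}.")
llm = ChatOllama(model="llama3")  # illustrative local model

# LangChain Expression Language: pipe the prompt into the model.
chain = prompt | llm
print(chain.invoke({"tool": "a vector database"}).content)
```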

Langflow

Self-host Langflow, a visual drag-and-drop builder for prototyping LLM pipelines and agents.

Open Interpreter

A natural language interface that lets LLMs run code on your computer.

Observability

Logfire

Self-host Logfire, the observability platform from the Pydantic team for tracing and monitoring Python applications, including LLM calls.
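
Instrumenting a script takes a few lines with the `logfire` SDK; where the traces end up (the hosted service or a self-hosted backend) depends on how `configure()` is set up for your deployment.

```python
import logfire

logfire.configure()  # reads credentials/endpoint from the environment by default

with logfire.span("embed documents"):
    # Structured log with a template field, recorded alongside the span.
    logfire.info("embedded {count} chunks", count=42)
```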

Vector Databases

🗄️

Milvus

Self-host Milvus, an open-source vector database built for storing embeddings and running similarity search at scale.
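
A minimal round trip with `pymilvus`'s MilvusClient against a local instance; the collection name, dimension, and vectors are illustrative.

```python
from pymilvus import MilvusClient

client = MilvusClient(uri="http://localhost:19530")

# Quick-setup collection: an id field plus a 4-dimensional vector field.
client.create_collection(collection_name="docs", dimension=4)

client.insert(
    collection_name="docs",
    data=[{"id": 1, "vector": [0.1, 0.2, 0.3, 0.4]}],
)

hits = client.search(collection_name="docs", data=[[0.1, 0.2, 0.3, 0.4]], limit=1)
print(hits)
```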

🗄️

Supabase

Self-host Supabase, an open-source Firebase alternative built on PostgreSQL, with vector search available through pgvector.
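
With the `supabase` Python client the same code works against a self-hosted instance; the URL, key, and table name below are illustrative.

```python
from supabase import create_client

supabase = create_client(
    "http://localhost:8000",          # hypothetical self-hosted API URL
    "your-anon-or-service-role-key",  # hypothetical API key
)

# Fetch a few rows from an example table.
rows = supabase.table("documents").select("*").limit(5).execute()
print(rows.data)
```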

Low Code Automation

AI

Open WebUI

Self-host Open WebUI, an extensible chat interface for local LLMs that works with Ollama and OpenAI-compatible APIs.

🗄️

pgvector

Self-host pgvector, an open-source PostgreSQL extension for storing embeddings and running vector similarity queries.
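
A minimal similarity-search round trip with `psycopg2` is shown below; the connection string and table are illustrative, and the extension must be available in the database.

```python
import psycopg2

conn = psycopg2.connect("dbname=ai user=postgres host=localhost")  # hypothetical DSN
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute(
    "CREATE TABLE IF NOT EXISTS items (id bigserial PRIMARY KEY, embedding vector(3));"
)
cur.execute("INSERT INTO items (embedding) VALUES ('[1,2,3]'), ('[2,2,2]');")

# <-> is pgvector's L2 distance operator; nearest neighbours come first.
cur.execute("SELECT id FROM items ORDER BY embedding <-> '[1,1,1]' LIMIT 1;")
print(cur.fetchone())

conn.commit()
cur.close()
conn.close()
```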

🎵

Piper

Self-host Piper, a fast, local neural text-to-speech system.

🗄️

Qdrant

Self-host Qdrant, an open-source vector database and similarity search engine written in Rust.
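
A minimal round trip with `qdrant-client` against a local instance; the collection name, vector size, and payload are illustrative.

```python
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams

client = QdrantClient(url="http://localhost:6333")
client.create_collection(
    collection_name="docs",
    vectors_config=VectorParams(size=4, distance=Distance.COSINE),
)

client.upsert(
    collection_name="docs",
    points=[PointStruct(id=1, vector=[0.1, 0.2, 0.3, 0.4], payload={"text": "hello"})],
)

# Nearest-neighbour search against the stored vectors.
hits = client.search(collection_name="docs", query_vector=[0.1, 0.2, 0.3, 0.4], limit=1)
print(hits)
```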

Supervision

Self-host Supervision, a Python toolkit of reusable computer vision utilities for working with detections, annotations, and video.

Text Generation WebUI

Self-host Text Generation WebUI, a Gradio-based web interface for running and chatting with local large language models.

Computer Vision

👁️

YOLOv8

Self-host YOLOv8, Ultralytics' real-time model family for object detection, instance segmentation, and image classification.
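
Running a pretrained model takes a few lines with the `ultralytics` package; the weights download automatically and the image path is illustrative.

```python
from ultralytics import YOLO

model = YOLO("yolov8n.pt")     # nano variant: smallest and fastest
results = model("street.jpg")  # hypothetical input image

for box in results[0].boxes:
    # Print the class name and confidence for each detection.
    print(model.names[int(box.cls)], float(box.conf))
```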

Deep Learning

PyTorch

An open source machine learning framework that accelerates the path from research prototyping to production deployment.
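
The core idea, tensors plus automatic differentiation, fits in a tiny example: fit y = 2x with one linear layer and a few gradient-descent steps.

```python
import torch

x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x

model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()   # autograd computes gradients
    optimizer.step()  # gradient-descent update

print(model.weight.item())  # converges to roughly 2.0
```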

Language Models

🦙

Llama

Meta's family of open-weight large language models that can be run locally on consumer hardware.
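
One common way to run a Llama model on consumer hardware is `llama-cpp-python` with a quantised GGUF checkpoint; the file path below is hypothetical.

```python
from llama_cpp import Llama

# Path to a locally downloaded, quantised checkpoint (hypothetical filename).
llm = Llama(model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf")

out = llm("Q: Name one reason to run LLMs locally. A:", max_tokens=64)
print(out["choices"][0]["text"])
```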

LLM Inference / AI Infrastructure

vLLM

A high-throughput, memory-efficient inference and serving engine for large language models.
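
A minimal offline-inference sketch with vLLM's Python API; the model name is illustrative and needs to fit in GPU memory.

```python
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # small illustrative model
params = SamplingParams(temperature=0.8, max_tokens=64)

outputs = llm.generate(["Self-hosting AI is useful because"], params)
print(outputs[0].outputs[0].text)
```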

Self-Hosting / Development / Automation

OpenFaaS

OpenFaaS is a platform for serverless functions that makes it simple to deploy both functions and existing code to Kubernetes with a unified experience.
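
Functions are plain handlers; the sketch below is roughly what the python3 template scaffolds when you create a new function with `faas-cli`.

```python
def handle(req):
    """req is the raw request body; the return value becomes the HTTP response."""
    return f"Hello from OpenFaaS, you sent: {req}"
```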

AI Development

Transformers

State-of-the-art machine learning for PyTorch, TensorFlow, and JAX. A powerful library for working with pre-trained language, vision, and audio models.
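
The quickest way in is the `pipeline` API; the default model for the chosen task downloads on first use.

```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Self-hosting this model was surprisingly easy."))
```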

Desktop application providing Kubernetes and container management, with a user-friendly interface for running AI workloads locally.

👁️

Kind

Run local Kubernetes clusters using Docker containers, perfect for testing AI applications in a Kubernetes environment before production.