TinyPod
LocalAI
OpenAI-compatible API for running LLMs locally without a GPU.
AI
Minimum Requirements
CPU: 1 core
Memory: 4.0 GB
Storage: 20 GB
Pricing
Starting from $9.00/mo ($0.0123/hour)
Tags: llm, ai, openai, local
Docker Image: quay.io/go-skynet/local-ai:latest
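As a rough sketch of deploying this image manually, assuming LocalAI's default API port of 8080 and a host with Docker installed (the platform's own deploy flow may differ):

```shell
# Pull and start LocalAI in the background; it runs CPU-only by default,
# so no GPU is required. Port 8080 is the image's default API port.
docker run -d --name local-ai -p 8080:8080 quay.io/go-skynet/local-ai:latest

# Because the API is OpenAI-compatible, standard endpoints work once the
# container is up — for example, listing the installed models:
curl http://localhost:8080/v1/models
```

Existing OpenAI client libraries can be pointed at `http://localhost:8080/v1` instead of the OpenAI endpoint, which is the main appeal of the compatible API.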