LocalAI

OpenAI-compatible API for running LLMs locally without a GPU.

Minimum Requirements

CPU: 1 core
Memory: 4.0 GB
Storage: 20 GB

Pricing

Starting from
$9.00/mo
$0.0123/hour

Tags

llm, ai, openai, local
Docker Image
quay.io/go-skynet/local-ai:latest
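Since the image exposes an OpenAI-compatible HTTP API (LocalAI listens on port 8080 by default), a minimal local run can be sketched as follows. This is a sketch under assumptions: the `./models` host directory and the `ggml-gpt4all-j` model name are placeholders; substitute a model you have actually downloaded (listed by `GET /v1/models` on your instance).

```shell
# Run the image; the LocalAI API listens on port 8080 by default.
# Assumption: ./models is a host directory holding downloaded model files.
docker run -d --name local-ai \
  -p 8080:8080 \
  -v "$PWD/models:/build/models" \
  quay.io/go-skynet/local-ai:latest

# Query the OpenAI-compatible chat completions endpoint.
# "ggml-gpt4all-j" is a placeholder model name; substitute one reported
# by GET /v1/models on your instance.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```

Because the API is wire-compatible with OpenAI's, existing OpenAI client libraries can generally be pointed at this instance by changing only the base URL.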