LocalAI

Run open-source LLMs locally as a drop-in OpenAI API replacement

About

A self-hosted, drop-in replacement for the OpenAI API that runs open-source LLMs locally.

Commands

local-ai

Examples

Start the LocalAI server on the default port (8080):
$ local-ai start
Query a locally running model through the OpenAI-compatible API:
$ curl http://localhost:8080/v1/chat/completions -H 'Content-Type: application/json' -d '{"model":"gpt-4","messages":[{"role":"user","content":"Hello"}]}'
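The raw response is a JSON document; a common follow-up is to extract just the assistant's reply. A minimal sketch, assuming a server is already listening on localhost:8080 and `jq` is installed (the model name depends on what you have loaded):

```shell
# Query the server and print only the reply text.
# The jq filter pulls the first choice's message content out of the
# OpenAI-style response: {"choices":[{"message":{"content":"..."}}]}
curl -s http://localhost:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"model":"gpt-4","messages":[{"role":"user","content":"Hello"}]}' \
  | jq -r '.choices[0].message.content'
```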
Start LocalAI with a custom port and model:
$ local-ai start --port 9000 --model mistral
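To check which models a running server actually exposes, you can hit the OpenAI-compatible `/v1/models` endpoint. A sketch, assuming the server from the first example on localhost:8080 and `jq` available:

```shell
# List the IDs of models the server currently serves.
# The response mirrors OpenAI's format: {"data":[{"id":"..."}, ...]}
curl -s http://localhost:8080/v1/models | jq -r '.data[].id'
```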