LocalAI
Portkey provides a robust and secure gateway to facilitate the integration of various Large Language Models (LLMs) into your applications, including locally hosted models through LocalAI.
First, ensure that your LocalAI API is externally accessible. If you're running the API on `http://localhost`, consider using a tool like ngrok to create a public URL. Then instantiate the Portkey client by adding your LocalAI URL (along with the version identifier) to the `customHost` property, and set the provider name to `openai`.

Note: Don't forget to include the version identifier (e.g., `/v1`) in the `customHost` URL.
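A minimal client setup might look like the following sketch. The ngrok URL and the API key are placeholders, not real values:

```typescript
import Portkey from 'portkey-ai';

// Point Portkey at your externally reachable LocalAI deployment.
// The ngrok URL below is a placeholder -- substitute your own public URL.
const portkey = new Portkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',          // placeholder Portkey API key
  provider: 'openai',                      // LocalAI exposes an OpenAI-compatible API
  customHost: 'https://abc123.ngrok.app/v1' // note the /v1 version identifier
});
```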
Use the Portkey SDK to invoke chat completions from your LocalAI model, just as you would with any other provider.
- `/chat/completions` (Chat, Vision, Tools support)
- `/images/generations`
- `/embeddings`
- `/audio/transcriptions`
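With the client configured against your LocalAI URL, a chat completion call follows the same shape as any other Portkey provider. This is a sketch; the model name is a placeholder for whatever model your LocalAI instance actually serves:

```typescript
import Portkey from 'portkey-ai';

// Placeholder URL and key -- substitute your own values.
const portkey = new Portkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  provider: 'openai',
  customHost: 'https://abc123.ngrok.app/v1'
});

async function main() {
  const completion = await portkey.chat.completions.create({
    // Use the name of a model loaded in your LocalAI instance.
    model: 'llama-3-8b-instruct',
    messages: [{ role: 'user', content: 'Say this is a test' }]
  });
  console.log(completion.choices[0].message.content);
}

main();
```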
Explore the complete list of features supported in the SDK, and see the relevant sections of the Portkey documentation for more information.