# LocalAI
Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs) into your applications, including locally hosted models served through LocalAI.
## Portkey SDK Integration with LocalAI
### 1. Install the Portkey SDK

```sh
npm install --save portkey-ai
```
### 2. Initialize Portkey with the LocalAI URL

First, ensure that your LocalAI API is externally accessible. If you're running it on `http://localhost`, consider using a tool like ngrok to create a public URL. Then instantiate the Portkey client, setting the `customHost` property to your LocalAI URL (including the version identifier) and the provider name to `openai`.

> **Note:** Don't forget to include the version identifier (e.g., `/v1`) in the `customHost` URL.
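Because a missing `/v1` suffix is an easy mistake, you may want to normalize the base URL before passing it to `customHost`. The `withVersion` helper below is a hypothetical convenience function, not part of the Portkey SDK:

```typescript
// Hypothetical helper (not part of the Portkey SDK): ensure a base URL
// carries the version identifier that customHost expects.
function withVersion(baseUrl: string, version = "v1"): string {
  const trimmed = baseUrl.replace(/\/+$/, ""); // drop any trailing slashes
  return trimmed.endsWith(`/${version}`) ? trimmed : `${trimmed}/${version}`;
}

console.log(withVersion("https://7cc4-3-235-157-146.ngrok-free.app"));
// → "https://7cc4-3-235-157-146.ngrok-free.app/v1"
console.log(withVersion("https://7cc4-3-235-157-146.ngrok-free.app/v1"));
// → "https://7cc4-3-235-157-146.ngrok-free.app/v1" (unchanged)
```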
```js
import Portkey from 'portkey-ai'

const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
    provider: "openai",
    customHost: "https://7cc4-3-235-157-146.ngrok-free.app/v1" // your LocalAI ngrok URL, including /v1
})
```
### 3. Invoke Chat Completions
Use the Portkey SDK to invoke chat completions from your LocalAI model, just as you would with any other provider.
```js
const chatCompletion = await portkey.chat.completions.create({
    messages: [{ role: 'user', content: 'Say this is a test' }],
    model: 'ggml-koala-7b-model-q4_0-r2.bin', // the model name as configured in LocalAI
});

console.log(chatCompletion.choices);
```
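LocalAI can also stream responses; with the Portkey SDK you would pass `stream: true` to `chat.completions.create` and iterate over the resulting chunks, mirroring the OpenAI SDK's streaming interface. Below is a minimal sketch of the consumption loop, using a stubbed async generator in place of a live call (a real stream requires a running LocalAI server):

```typescript
// Stub standing in for `portkey.chat.completions.create({ ..., stream: true })`,
// which yields OpenAI-style chunks carrying `choices[0].delta.content`.
async function* fakeStream() {
  for (const piece of ["This ", "is ", "a ", "test"]) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}

// Accumulate the streamed deltas into the full completion text.
async function collect(
  stream: AsyncIterable<{ choices: { delta: { content?: string } }[] }>
): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}

collect(fakeStream()).then(console.log); // → "This is a test"
```

With a live client, the loop body stays the same; only the stub is replaced by the awaited `create(...)` call.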
## LocalAI Endpoints Supported
## Next Steps

Explore the complete list of features supported in the SDK. You'll find more information in the relevant sections.