Ollama
Portkey provides a robust and secure gateway for integrating a wide range of Large Language Models (LLMs) into your applications, including models you host locally through Ollama.
Portkey SDK Integration with Ollama Models
Portkey provides a consistent API to interact with models from various providers. To integrate Ollama with Portkey:
1. Install the Portkey SDK
Install the Portkey SDK in your application to interact with your Ollama API through Portkey.
npm install --save portkey-ai
2. Initialize Portkey with Ollama URL
Instantiate the Portkey client by setting your publicly exposed Ollama URL on the customHost property.
import Portkey from 'portkey-ai'

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
  provider: "ollama",
  customHost: "https://7cc4-3-235-157-146.ngrok-free.app" // Your Ollama ngrok URL
})
Note: Requests made to a localhost Ollama endpoint will fail, because Portkey's hosted gateway cannot reach your local machine. To route requests through Portkey and observe them, expose your local Ollama server publicly with a tunneling service such as ngrok.
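For example, assuming Ollama is serving on its default port 11434, you can expose it with ngrok and pass the forwarding URL it prints as your customHost:

ngrok http 11434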
3. Invoke Chat Completions with Ollama
Use the Portkey SDK to invoke chat completions from your Ollama model, just as you would with any other provider.
const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'llama3',
});

console.log(chatCompletion.choices);
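The SDK also supports OpenAI-style streaming, which is useful for surfacing tokens as your local model generates them. Here is a minimal sketch, assuming the portkey client from step 2 and that llama3 has already been pulled locally (ollama pull llama3):

const stream = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Write a haiku about local models' }],
  model: 'llama3',
  stream: true, // receive incremental chunks instead of one response
});

for await (const chunk of stream) {
  // Each chunk carries an OpenAI-style delta with the next token(s)
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}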
Next Steps
Explore the complete list of features supported in the SDK:
SDK