Azure OpenAI
Azure OpenAI is a great option for accessing the best models, including GPT-4 and more, in your private environments. Portkey provides complete support for Azure OpenAI.
With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through Portkey's virtual key system.
Portkey provides a consistent API to interact with models from various providers. Here's a step-by-step guide to integrating Azure OpenAI with Portkey:
Select your Foundation Model from the dropdown on the modal.
Now, in Azure OpenAI Studio, go to any playground (chat or completions) and click on "View code". Note down the API version and API key from here. (These will be your Azure API Version and Azure API Key.)
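To see how the details collected in these steps fit together, here is a minimal sketch of how a typical Azure OpenAI endpoint URL is assembled from your resource name, deployment name, and API version (the values shown are placeholders, not real deployments):

```python
def azure_chat_url(resource: str, deployment: str, api_version: str) -> str:
    """Assemble the typical Azure OpenAI chat-completions endpoint URL."""
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

# Placeholder values for illustration only
url = azure_chat_url("my-resource", "gpt-4-deployment", "2024-02-01")
print(url)
```

This is exactly the set of details (resource, deployment, version, key) that Portkey stores for you behind a virtual key.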
Now, let's make a request using this virtual key!
Add the Portkey SDK to your application to interact with Azure OpenAI's API through Portkey's gateway.
Use the Portkey instance to send requests to your Azure deployments. You can also override the virtual key directly in the API call if needed.
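As a sketch of what such a request can look like with the `portkey-ai` Python SDK (the environment-variable names are placeholders, and the live call only runs when credentials are present):

```python
import os

# Placeholder env var names -- substitute your own credentials.
PORTKEY_API_KEY = os.environ.get("PORTKEY_API_KEY", "")
AZURE_VIRTUAL_KEY = os.environ.get("AZURE_VIRTUAL_KEY", "")

# The payload follows the OpenAI-compatible chat format that
# Portkey forwards to your Azure deployment.
request_body = {
    "messages": [{"role": "user", "content": "Say this is a test"}],
    "max_tokens": 64,
}

if PORTKEY_API_KEY and AZURE_VIRTUAL_KEY:
    from portkey_ai import Portkey  # pip install portkey-ai

    portkey = Portkey(api_key=PORTKEY_API_KEY, virtual_key=AZURE_VIRTUAL_KEY)
    completion = portkey.chat.completions.create(**request_body)
    print(completion.choices[0].message.content)
```

Because the virtual key already encodes your Azure resource, deployment, and API version, the request body stays identical to a standard OpenAI-style call.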
Once you're ready with your prompt, you can use the portkey.prompts.completions.create interface to use the prompt in your application.
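A minimal sketch of that call with the Python SDK, assuming you have a saved prompt in Portkey (the prompt ID and variable names here are hypothetical, and the live call only runs when credentials are present):

```python
import os

# Hypothetical prompt ID and template variable for illustration.
PROMPT_ID = os.environ.get("PORTKEY_PROMPT_ID", "")
variables = {"user_input": "What is the capital of France?"}

if PROMPT_ID and os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey

    portkey = Portkey(api_key=os.environ["PORTKEY_API_KEY"])
    response = portkey.prompts.completions.create(
        prompt_id=PROMPT_ID,
        variables=variables,
    )
    print(response)
```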
Portkey supports multiple modalities for Azure OpenAI and you can make image generation requests through Portkey's AI Gateway the same way as making completion calls.
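For instance, an image generation request can be sketched like this with the Python SDK (payload values are placeholders, and the live call only runs when credentials are present):

```python
import os

# Image payload in the OpenAI-compatible format that Portkey forwards.
image_request = {
    "prompt": "Lucy in the sky with diamonds",
    "size": "1024x1024",
}

if os.environ.get("PORTKEY_API_KEY") and os.environ.get("AZURE_VIRTUAL_KEY"):
    from portkey_ai import Portkey

    portkey = Portkey(
        api_key=os.environ["PORTKEY_API_KEY"],
        virtual_key=os.environ["AZURE_VIRTUAL_KEY"],
    )
    image = portkey.images.generate(**image_request)
```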
Portkey's fast AI gateway captures information about each request on your Portkey Dashboard. On your logs screen, you'll be able to see this request along with its response.
Here's how you can pass your Azure OpenAI details & secrets directly without using the Virtual Keys feature.
In a typical Azure OpenAI request, these details are passed as follows, depending on whether you use the JS SDK, the Python SDK, or REST headers:

| Detail | JS SDK | Python SDK | REST Header |
| --- | --- | --- | --- |
| Azure Resource Name | azureResourceName | azure_resource_name | x-portkey-azure-resource-name |
| Azure Deployment Name | azureDeploymentId | azure_deployment_id | x-portkey-azure-deployment-id |
| API Version | azureApiVersion | azure_api_version | x-portkey-azure-api-version |
| Azure API Key | Authorization: "Bearer + {API_KEY}" | Authorization = "Bearer + {API_KEY}" | Authorization |
| Azure Model Name | azureModelName | azure_model_name | x-portkey-azure-model-name |
If you have configured fine-grained access for Azure OpenAI and need to use a JSON Web Token (JWT) in the Authorization header instead of the regular API key, you can use the forwardHeaders parameter to do this.
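A sketch of this with the Python SDK, where forward_headers is assumed to be the Python-SDK spelling of the forwardHeaders parameter and the env var names are placeholders:

```python
import os

# Headers to pass through to Azure unchanged. forward_headers is
# assumed to be the Python-SDK spelling of forwardHeaders.
FORWARDED_HEADERS = ["Authorization"]

jwt_token = os.environ.get("AZURE_AD_TOKEN", "")  # placeholder env var
if jwt_token and os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey

    portkey = Portkey(
        api_key=os.environ["PORTKEY_API_KEY"],
        provider="azure-openai",
        Authorization=f"Bearer {jwt_token}",
        forward_headers=FORWARDED_HEADERS,
    )
```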
For further questions on custom Azure deployments or fine-grained access tokens, reach out to us at support@portkey.ai
The complete list of features supported in the SDK is available at the link below.
You'll find more information in the relevant sections:
Request access to Azure OpenAI.
Create a resource in the Azure portal. (This will be your Resource Name)
Deploy a model in Azure OpenAI Studio. (This will be your Deployment Name)
When you input these details, the foundation model will be auto-populated.
If you do not want to add your Azure details to Portkey's vault, you can also pass them directly while instantiating the Portkey client.
Set up Portkey with your virtual key as part of the initialization configuration. You can create a virtual key for Azure in the Portkey UI.
You can manage all prompts to Azure OpenAI in Portkey's prompt library. All the current models of OpenAI are supported, and you can easily start testing different prompts.
More information is available in the image generation documentation.