What is Portkey?
Teams use Portkey to monitor and improve the cost, performance, and accuracy of their Gen AI apps.
It takes about two minutes to integrate. Once connected, Portkey monitors all of your LLM requests and makes your app more resilient, secure, performant, and accurate.
Here's a product walkthrough (3 mins).
Integrate in 3 Lines of Code
# pip install portkey-ai
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="OPENAI_API_KEY",          # your OpenAI key (or set the OPENAI_API_KEY env var)
    base_url=PORTKEY_GATEWAY_URL,      # route every request through the Portkey gateway
    default_headers=createHeaders(
        provider="openai",
        api_key="PORTKEY_API_KEY"      # your Portkey API key
    )
)

chat_complete = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Say this is a test"}],
)

print(chat_complete.choices[0].message.content)
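Because the gateway speaks the OpenAI API format, the same client can be pointed at other providers by changing the provider passed to createHeaders. The sketch below is a minimal, illustrative example, assuming the upstream provider's key is supplied as the client's api_key and forwarded by the gateway; Anthropic and the model name are used purely as an example.

from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

# Same OpenAI SDK, different upstream provider behind the gateway (assumption for illustration).
anthropic_client = OpenAI(
    api_key="ANTHROPIC_API_KEY",       # provider key, forwarded by the gateway
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        provider="anthropic",          # tell the gateway which provider to route to
        api_key="PORTKEY_API_KEY"
    )
)

response = anthropic_client.chat.completions.create(
    model="claude-3-opus-20240229",    # illustrative model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Say this is a test"}],
)
print(response.choices[0].message.content)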
While you're here, why not give us a star? It helps us a lot!
Languages Supported
Client libraries are available for JavaScript, Java, Rust, and Ruby, alongside the Python SDK shown above.
AI Providers Supported
Portkey is multimodal by default: along with chat and text models, we also support audio, vision, and image generation models.
Provider coverage includes, among many others:
AzureML: partially supported
ZhipuAI (ChatGLM): fully supported (public)
Deepbricks: fully supported (public)
SiliconFlow: fully supported (public)
View all the supported integration guides.
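As an illustration of the multimodal claim above, here is a minimal sketch of an image-generation call routed through the gateway. It assumes the gateway proxies the OpenAI images endpoint unchanged; the model name, prompt, and size are illustrative.

from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="OPENAI_API_KEY",
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(provider="openai", api_key="PORTKEY_API_KEY")
)

# Image generation goes through the same gateway URL as chat completions.
image = client.images.generate(
    model="dall-e-3",              # illustrative model name
    prompt="A watercolor painting of a lighthouse at dawn",
    n=1,
    size="1024x1024",
)
print(image.data[0].url)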
Frameworks Supported
Native integrations are available for popular Python and TypeScript frameworks; see the integration guides above for the full list.
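As one example of a framework integration, the sketch below points LangChain's ChatOpenAI client at the gateway. It assumes the langchain-openai package is installed and reuses the same header helper as the quickstart.

# pip install langchain-openai portkey-ai
from langchain_openai import ChatOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

llm = ChatOpenAI(
    api_key="OPENAI_API_KEY",
    base_url=PORTKEY_GATEWAY_URL,      # send LangChain's OpenAI calls through Portkey
    default_headers=createHeaders(provider="openai", api_key="PORTKEY_API_KEY"),
    model="gpt-4",
)

print(llm.invoke("Say this is a test").content)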