
Portkey: An open-source AI gateway for simple LLM orchestration

Implementing load balancing:

from portkey_ai import Portkey
import os

# Weighted load-balancing config across OpenAI and Groq
lb_config = {
    "strategy": { "mode": "loadbalance" },
    "targets": [{
        "provider": 'openai',
        "api_key": os.environ["OPENAI_API_KEY"],
        "weight": 0.1
    },{
        "supplier": 'groq',
        "api_key": os.environ["GROQ_API_KEY"],
        "weight": 0.9,
        "override_params": {
            "mannequin": 'llama3-70b-8192'
        },
    }],
}

client = Portkey(config=lb_config)

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the meaning of life?"}],
    mannequin="gpt-4o-mini"
)

print(response.choices[0].message.content)
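
With these weights, roughly 90% of requests are routed to Groq and served by llama3-70b-8192 (set through override_params), while the remaining 10% go to OpenAI using the gpt-4o-mini model specified in the request.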

Implementing conditional routing:


from portkey_ai import Portkey
import os

openai_api_key = os.environ["OPENAI_API_KEY"]
groq_api_key = os.environ["GROQ_API_KEY"]

# Conditional routing config: choose a target based on request metadata
pk_config = {
    "strategy": {
        "mode": "conditional",
        "conditions": [
            {
                "query": {"metadata.user_plan": {"$eq": "pro"}},
                "then": "openai"
            },
            {
                "query": {"metadata.user_plan": {"$eq": "basic"}},
                "then": "groq"
            }
        ],
        "default": "groq"
    },
    "targets": [
        {
            "name": "openai",
            "provider": "openai",
            "api_key": openai_api_key
        },
        {
            "name": "groq",
            "provider": "groq",
            "api_key": groq_api_key,
            "override_params": {
                "model": "llama3-70b-8192"
            }
        }
    ]
}

# Attached to every request made with this client; "pro" matches the first
# condition above, so requests are routed to the OpenAI target
metadata = {
    "user_plan": "pro"
}

client = Portkey(config=pk_config, metadata=metadata)

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the meaning of life?"}],
    model="gpt-4o-mini"  # used when the request is routed to the OpenAI target
)
print(response.choices[0].message.content)

The above example uses the metadata value user_plan to determine which model should be used for the query. This is useful for SaaS providers who offer AI through a freemium plan.
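
As a rough sketch of that freemium pattern (assuming pk_config from the snippet above is in scope, and using a hypothetical get_user_plan helper that is not part of Portkey), a client can be built per request with the caller's plan in its metadata:


from portkey_ai import Portkey

# Hypothetical helper: look up the caller's subscription tier ("pro" or "basic")
def get_user_plan(user_id: str) -> str:
    return "basic"  # placeholder; a real app would query its billing system

def answer_for_user(user_id: str, question: str) -> str:
    # Reuses pk_config from the conditional-routing example; the metadata value
    # decides whether the request goes to OpenAI (pro) or Groq (basic/default)
    client = Portkey(
        config=pk_config,
        metadata={"user_plan": get_user_plan(user_id)},
    )
    response = client.chat.completions.create(
        messages=[{"role": "user", "content": question}],
        model="gpt-4o-mini",  # applies when the OpenAI target is selected
    )
    return response.choices[0].message.content

print(answer_for_user("user-123", "What's the meaning of life?"))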

Harnessing Portkey AI Gateway for LLM integration

Portkey represents a significant innovation in LLM integration. It addresses key challenges in managing multiple providers and optimizing performance. By offering an open-source framework that enables seamless interaction with various LLM providers, the project fills a notable gap in current AI development workflows.

The project thrives on community collaboration, welcoming contributions from developers worldwide. With an active GitHub community and open issues, Portkey encourages developers to participate in expanding its capabilities. The project's transparent development approach and open-source licensing make it accessible to both individual developers and enterprise teams.


