OpenAI Proxy

OpenAI Proxy Entry Point

The DataEngine now provides an OpenAI Proxy entry point. It sends the request to Azure OpenAI if the access credentials have been entered in the admin area; otherwise the request goes to the regular OpenAI API.

A POST request should be sent to the URL …?entryPoint=OpenAIProxy.

Example: https://xxx.marini.systems/index.php?entryPoint=OpenAIProxy

Example POST payload:

[
    'model' => 'gpt-35-turbo',
    'message' => 'Hi there, i have a question ...',
    'max_tokens' => 500,
    'temperature' => 0.7,
    'api_version' => '2024-02-15-preview'
]
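The payload above might be assembled and sent from a client like this. This is only a sketch: the host name is the placeholder from the example URL, the payload keys come from this documentation, and JSON encoding of the body is an assumption.

```python
import json
import urllib.request

def build_proxy_request(base_url: str, payload: dict) -> urllib.request.Request:
    """Build a POST request for the OpenAIProxy entry point (sketch)."""
    url = base_url + "?entryPoint=OpenAIProxy"
    data = json.dumps(payload).encode("utf-8")  # body encoding is assumed to be JSON
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}, method="POST"
    )

req = build_proxy_request(
    "https://xxx.marini.systems/index.php",  # placeholder host from the example above
    {
        "model": "gpt-35-turbo",                        # mandatory
        "message": "Hi there, i have a question ...",   # mandatory
        "max_tokens": 500,
        "temperature": 0.7,
        "api_version": "2024-02-15-preview",  # mandatory when Azure OpenAI is used
    },
)
# Sending would be: urllib.request.urlopen(req) -- omitted here to stay self-contained.
```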

The mandatory parameters are “model” and “message”. (If Azure OpenAI is to be used, “api_version” is also mandatory.)

Optional parameters that can also be sent: “top_p”, “frequency_penalty”, “presence_penalty”, “debug”.
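The mandatory/optional split above can be checked on the client side before sending. The parameter names are taken from this documentation; the validator itself is a hypothetical helper, not part of the proxy.

```python
# Parameter names as listed in the documentation.
MANDATORY = {"model", "message"}
OPTIONAL = {"max_tokens", "temperature", "api_version", "top_p",
            "frequency_penalty", "presence_penalty", "debug"}

def validate_payload(payload: dict, azure: bool = False) -> list:
    """Return a list of problems; an empty list means the payload looks sendable."""
    required = MANDATORY | ({"api_version"} if azure else set())
    problems = [f"missing mandatory parameter: {k}"
                for k in sorted(required - payload.keys())]
    problems += [f"unknown parameter: {k}"
                 for k in sorted(payload.keys() - MANDATORY - OPTIONAL)]
    return problems
```

For example, a payload without “message” fails validation, and an Azure OpenAI payload additionally needs “api_version”.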

The endpoint and API key no longer need to be entered; the entry point takes care of them.

If “debug” is enabled, the actual error message is returned in the event of a problem; otherwise an empty string is returned.
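Given that behavior, a client only sees error details when “debug” was sent. A minimal sketch of interpreting a failed response (the body is assumed to arrive as a plain string; the helper is hypothetical):

```python
def describe_failure(response_body: str) -> str:
    """Turn a failed proxy response into a human-readable message (sketch)."""
    # Per the proxy's error handling: with "debug" enabled the body carries
    # the actual error message; otherwise it is an empty string.
    if response_body:
        return f"proxy error: {response_body}"
    return "proxy error: no details (send 'debug' to get the actual message)"
```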
