• Hi, I have this error with gpt-5-mini-2025-08-07:

    Error: OpenAI API returned status 400: Unsupported value: 'temperature' does not support 0 with this model. Only the default (1) value is supported.

    Can you add a setting so we can change this based on the model?

  • Plugin Author Tim W

    (@timwhitlock)

    Thanks for the heads-up.

    I'm trying to find time to fix various parts of this component, including support for other model vendors.

    Plugin Author Tim W

    (@timwhitlock)

    Quick follow-up to say that the current dev version has a fix (more of a hack) that allows you to use gpt-5. It simply sends a temperature of 1.0 instead of the usual 0.0.

    In my brief experience of testing this, the new range of models is very slow compared with gpt-4, and I'd question whether that extra power produces proportionately better translations. I'll be interested to hear.

    Going forwards, the current API configuration is inadequate for the various things needed to enhance these new AI features; it needs a full review.

    In the meantime, I've enabled more functionality via filters. The (currently undocumented) loco_api_provider_openai hook lets you override the temperature.
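
    A minimal sketch of using that hook, assuming it behaves as a standard WordPress filter and passes the provider settings as an associative array with a 'temperature' key (the exact array shape is an assumption, so check the plugin source before relying on it):

        <?php
        /*
         * Sketch only: assumes loco_api_provider_openai is a standard WordPress
         * filter whose value is the OpenAI provider configuration as an
         * associative array, and that a 'temperature' key is honoured.
         */
        add_filter( 'loco_api_provider_openai', function ( array $config ) {
            // gpt-5 models reject a temperature of 0 and only accept the default of 1
            $config['temperature'] = 1;
            return $config;
        } );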

    You can even use Gemini or OpenRouter by modifying the loco_api_providers array and adding 'vendor'=>'gemini' or 'vendor'=>'openrouter'. Obviously change the model and key accordingly.
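
    As a rough sketch of that, assuming loco_api_providers is exposed as a WordPress filter over an array of provider entries (only the 'vendor' key is confirmed above; the 'model' and 'key' field names and values here are placeholders to illustrate the idea):

        <?php
        /*
         * Sketch only: assumes loco_api_providers is a WordPress filter over an
         * array of provider entries. The 'vendor' value comes from the reply above;
         * the 'model' and 'key' field names and values are hypothetical.
         */
        add_filter( 'loco_api_providers', function ( array $providers ) {
            $providers[] = array(
                'vendor' => 'openrouter',        // or 'gemini'
                'model'  => 'example-model-id',  // hypothetical model identifier
                'key'    => 'example-api-key',   // hypothetical API key for that vendor
            );
            return $providers;
        } );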
