OpenRouter: “No allowed providers are available for the selected model” despite valid model and providers #1255

@doc1005

Description

I’m seeing repeated 404 errors from OpenRouter when using Agent Zero with various OpenRouter models, even though the models are valid and I’ve enabled the suggested providers on the OpenRouter side.

Environment

Agent Zero: latest public release (self‑hosted)

LLM provider: OpenRouter

Working baseline model: mistralai/mistral-large-2407 (this works fine)

Additional OpenRouter providers allowed: Mistral, Cloudflare, plus others I tried such as OpenInference

I have multiple API keys and per‑key limits configured in OpenRouter; the “production” key with Mistral Large 2407 is stable.

Problem

When I change the model in Agent Zero to use newer or cheaper OpenRouter models, I get this error:
litellm.exceptions.NotFoundError: litellm.NotFoundError: NotFoundError: OpenrouterException - {"error":{"message":"No allowed providers are available for the selected model.","code":404,"metadata":{"available_providers":["google-ai-studio","liquid","arcee-ai","stepfun","open-inference","venice","nvidia"],"requested_providers":["openai","mistral","cloudflare"]}}}

I’ve also seen a similar error when trying openai/gpt-4.1-mini and openrouter/free.
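For reference, the failing call can be reproduced outside Agent Zero by hitting OpenRouter's chat-completions endpoint directly with a forced provider list. This is a minimal sketch: the endpoint URL is OpenRouter's documented one, but the use of a `provider.only` field to restrict routing is my assumption about what LiteLLM is effectively sending as `requested_providers`.

```python
import json

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key, model, providers=None):
    """Build an OpenRouter chat-completions request.

    When `providers` is given, it is sent as provider.only, which
    restricts routing to those providers -- mirroring what the
    traceback above suggests is being forced to
    ["openai", "mistral", "cloudflare"].
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }
    if providers:
        body["provider"] = {"only": providers}
    return OPENROUTER_URL, headers, body

# Sending this with requests.post(url, headers=headers, data=json.dumps(body))
# should reproduce the 404 when none of the listed providers serve the model.
```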

Key points:

OpenRouter’s response shows available_providers for the chosen model as:
["google-ai-studio","liquid","arcee-ai","stepfun","open-inference","venice","nvidia"]

Agent Zero (via LiteLLM) is sending requested_providers:
["openai","mistral","cloudflare"]

There’s no overlap between those lists, so OpenRouter returns the 404 with "No allowed providers are available for the selected model."

From my side in the OpenRouter dashboard I’ve:

Allowed the providers suggested by the error (Mistral, Cloudflare, OpenInference, and others).
Verified that the same account and keys work fine directly with mistralai/mistral-large-2407.

Expected

When I configure Agent Zero with a valid OpenRouter model ID (e.g., openai/gpt-4.1-mini or openrouter/free) and a valid API key, the connector should either:

Let OpenRouter pick any available provider for that model, or

Respect the available_providers list (e.g., google-ai-studio, venice, open-inference, etc.) rather than forcing ["openai","mistral","cloudflare"].
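Either expected behavior could be sketched as a small routing helper (hypothetical function name; the provider lists are the ones from the error above):

```python
def effective_providers(requested, available):
    """Return the provider filter to send to OpenRouter.

    - If nothing is requested, return None so no filter is sent and
      OpenRouter picks any available provider (first option above).
    - Otherwise keep only the requested providers the model actually
      has (second option). An empty intersection is exactly the 404
      case above, so fall back to None instead of forcing a failure.
    """
    if not requested:
        return None
    overlap = [p for p in requested if p in available]
    return overlap or None

available = ["google-ai-studio", "liquid", "arcee-ai", "stepfun",
             "open-inference", "venice", "nvidia"]
requested = ["openai", "mistral", "cloudflare"]
# No overlap between the two lists, so no filter should be sent:
print(effective_providers(requested, available))  # None
```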

Actual

Agent Zero appears to hard‑code or override requested_providers to ["openai","mistral","cloudflare"], which leads to a 404 for many newer OpenRouter models whose providers are different.

Request

Please update the OpenRouter integration so that:

It does not force requested_providers when calling OpenRouter, or

It aligns requested_providers with the model’s actual provider list from OpenRouter, or makes this configurable in the UI/settings.

A short doc/example showing the correct model string format for newer OpenRouter models (e.g., openai/gpt-4.1-mini) in Agent Zero would also help.
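As a starting point for such a doc: LiteLLM generally routes via a provider prefix on the model string, so my assumption is that an OpenRouter model ID just gets an `openrouter/` prefix before being passed to Agent Zero's model setting (hypothetical helper; the double-prefix result for openrouter/free looks odd but follows from that convention):

```python
def to_litellm_model(openrouter_model_id: str) -> str:
    """Prefix an OpenRouter model ID for LiteLLM's provider routing.

    Assumption: LiteLLM treats everything after "openrouter/" as the
    verbatim OpenRouter model ID, e.g.
      "openai/gpt-4.1-mini" -> "openrouter/openai/gpt-4.1-mini"
      "openrouter/free"     -> "openrouter/openrouter/free"
    """
    return f"openrouter/{openrouter_model_id}"

# The result would then be used as the model= argument, e.g.
# litellm.completion(model=to_litellm_model("openai/gpt-4.1-mini"), ...)
```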

I’m happy to test any patch or config change you suggest; I already have a working baseline with mistralai/mistral-large-2407 to compare against.
