Inconsistent behavior in models list #3116

@bparees

System Info

Using llama-stack 0.2.17; I don't think any other dependencies or versions come into play in what I've observed.

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

The /v1/models API has inconsistent behavior with respect to populating the identifier field.

I'm running llama-stack standalone with this launch command:

uv run llama stack run run.yaml

and this run.yaml:

models:
  - model_id: my_llm
    provider_id: openai
    model_type: llm
    provider_model_id: gpt-4o-mini
  - model_id: gpt-4-turbo
    provider_id: openai
    model_type: llm
    provider_model_id: gpt-4-turbo

and the /v1/models API endpoint reports the following models (I'm truncating the list; in reality it shows many other OpenAI models as well):

{
  "data": [
    {
      "identifier": "my_llm",
      "provider_resource_id": "gpt-4o-mini",
      "provider_id": "openai",
      "type": "model",
      "metadata": {},
      "model_type": "llm"
    },
    {
      "identifier": "openai/gpt-4-turbo",
      "provider_resource_id": "gpt-4-turbo",
      "provider_id": "openai",
      "type": "model",
      "metadata": {},
      "model_type": "llm"
    },
    {
      "identifier": "openai/gpt-3.5-turbo-0125",
      "provider_resource_id": "gpt-3.5-turbo-0125",
      "provider_id": "openai",
      "type": "model",
      "metadata": {},
      "model_type": "llm"
    },

    .........................
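For reference, the mismatch is easy to spot by listing identifier next to provider_id/provider_resource_id with a short script like the one below. This is just a convenience sketch using plain requests rather than the llama-stack client, and it assumes the server is listening on localhost:8321 (adjust host/port to match your run.yaml).

import requests

# Query the models endpoint of the running stack (host/port are assumptions here).
resp = requests.get("http://localhost:8321/v1/models")
resp.raise_for_status()

# Print identifier next to provider_id/provider_resource_id so the
# naming inconsistency is easy to see.
for model in resp.json()["data"]:
    print(f'{model["identifier"]:40} <- {model["provider_id"]}/{model["provider_resource_id"]}')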

The inconsistency is that there seem to be three ways the identifier gets populated:

  1. For custom models, the identifier comes from run.yaml exactly as-is (my_llm here).
  2. For custom models whose name matches an existing provider_resource_id, the identifier is prefixed with the provider id (openai/gpt-4-turbo here) instead of using the name I provided in run.yaml as-is (gpt-4-turbo).
  3. For models that were auto-populated, the identifier is $provider_id/$provider_resource_id (openai/gpt-3.5-turbo-0125).

I'm not sure whether prepending the provider_id to the identifier for auto-populated models is the right thing to do, but scenario (2) definitely seems wrong: the user-provided name is silently mutated.
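To summarize, the identifier derivation appears to behave roughly like the sketch below. This is inferred purely from the output above, not taken from the llama-stack source; the function and argument names are hypothetical.

def observed_identifier(model_id, provider_id, provider_model_id, provider_model_ids):
    # Case 3: auto-populated models (no user alias) get provider_id/provider_resource_id.
    if model_id is None:
        return f"{provider_id}/{provider_model_id}"
    # Case 2: a user alias that happens to match an existing provider model id
    # is silently rewritten to provider_id/alias.
    if model_id in provider_model_ids:
        return f"{provider_id}/{model_id}"
    # Case 1: any other user alias is used verbatim.
    return model_id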

Error logs

See the description above for the behavior I'm seeing. There is no "error" per se.

Expected behavior

I would expect the "identifier" in the models response to reflect only the model_id from run.yaml (or the provider_resource_id from the provider, in the case of auto-populated model entries), not a concatenation of provider_id + model_id. Uniqueness can be enforced/referenced via the [provider_id, identifier] tuple.

But if concatenation is desired as a way to build a single-field unique key, then I'd expect the prefix to always be applied, regardless of whether the identifier was drawn from the provider_resource_id or from a user-provided model_id in run.yaml, and regardless of whether the user-provided alias matches the provider_resource_id.
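Either expectation could be expressed roughly like this (again a hypothetical sketch, not existing code), where always_prefix selects between the two options above:

def expected_identifier(model_id, provider_id, provider_model_id, always_prefix=False):
    # Use the user-supplied alias, or the provider's own id for auto-populated entries.
    name = model_id if model_id is not None else provider_model_id
    if always_prefix:
        # Single-field unique key: prefix consistently, whether the name came
        # from run.yaml or from the provider.
        return f"{provider_id}/{name}"
    # Otherwise return the name untouched; uniqueness is the
    # (provider_id, identifier) tuple's job.
    return name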

Metadata

Assignees: no one assigned
Labels: bug (Something isn't working), stale