[Bug]: Wave doesn't work with GPT-OSS (Ollama) #2682

@w4mhi

Description

Current Behavior

Hi,
I want to use Wave with GPT-OSS model(s).

When I try to ask anything, I get:
Failed to post message: failed to store message: model mismatch: expected gpt-5.1, got gpt-oss:20b

Is this supported, or is it a bug? If it is supported, could you let me know what I am missing?
Thank you!

Expected Behavior

I expect Wave to work with any local model I have installed in Ollama.

Steps To Reproduce

The configuration:
{
  "ollama-llama": {
    "display:name": "Ollama - GPT OSS",
    "display:order": 1,
    "display:icon": "microchip",
    "display:description": "Local GPT OSS 20B model via Ollama",
    "ai:apitype": "openai-responses",
    "ai:model": "gpt-oss:20b",
    "ai:thinkinglevel": "medium",
    "ai:endpoint": "http://localhost:11434/v1/chat/completions",
    "ai:apitoken": "ollama"
  }
}
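Since the error is a model-name mismatch, it can help to confirm what model name Ollama itself reports outside of Wave. Below is a minimal sketch, assuming a standard Ollama OpenAI-compatible server at the endpoint from the config above; the helper names (`build_chat_request`, `ask_ollama`) are illustrative, not part of Wave or Ollama.

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str) -> dict:
    """Build a minimal OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask_ollama(endpoint: str, payload: dict) -> dict:
    """POST the payload to the endpoint and return the parsed JSON response."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama accepts any token value for its OpenAI-compatible API
            "Authorization": "Bearer ollama",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Usage (with Ollama running locally):
#   payload = build_chat_request("gpt-oss:20b", "Say hi")
#   reply = ask_ollama("http://localhost:11434/v1/chat/completions", payload)
#   print(reply.get("model"))  # model name Ollama reports back
```

Comparing the `model` field in the response against the name Wave is expecting may narrow down where the mismatch originates.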

Wave Version

latest

Platform

macOS

OS Version/Distribution

No response

Architecture

arm64

Anything else?

No response

Questionnaire

  • I'm interested in fixing this myself but don't know where to start
  • I would like to fix and I have a solution
  • I don't have time to fix this right now, but maybe later

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working), triage (Needs triage)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests