Description
Current Behavior
Hi,
I want to use Wave with GPT-OSS model(s).
When I try to ask anything, I get:
Failed to post message: failed to store message: model mismatch: expected gpt-5.1, got gpt-oss:20b
Is this supported, or is it a bug? If it is supported, could you let me know what I am missing?
Thank you!
Expected Behavior
I expect Wave to work with any local model I have installed in Ollama.
Steps To Reproduce
The configuration:
{
  "ollama-llama": {
    "display:name": "Ollama - GPT OSS",
    "display:order": 1,
    "display:icon": "microchip",
    "display:description": "Local GPT OSS 20B model via Ollama",
    "ai:apitype": "openai-responses",
    "ai:model": "gpt-oss:20b",
    "ai:thinkinglevel": "medium",
    "ai:endpoint": "http://localhost:11434/v1/chat/completions",
    "ai:apitoken": "ollama"
  }
}
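For reference, the request that the configured endpoint expects can be sketched outside Wave. This is a minimal, hypothetical payload in the OpenAI chat-completions shape, reusing the preset's "ai:model" value; the prompt text is illustrative, not from the original report:

```python
import json

# Hypothetical sketch: build the chat-completions payload that would be
# POSTed to the preset's "ai:endpoint"
# (http://localhost:11434/v1/chat/completions), using the same model
# string as the preset's "ai:model".
payload = {
    "model": "gpt-oss:20b",  # matches "ai:model" in the config above
    "messages": [
        {"role": "user", "content": "hello"},  # illustrative prompt
    ],
}

body = json.dumps(payload)
print(body)
```

Note that the model name sent in the request ("gpt-oss:20b") is the one the error message reports as unexpected, while the error says it expected "gpt-5.1".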
Wave Version
latest
Platform
macOS
OS Version/Distribution
No response
Architecture
arm64
Anything else?
No response
Questionnaire
- I'm interested in fixing this myself but don't know where to start
- I would like to fix and I have a solution
- I don't have time to fix this right now, but maybe later