[Bug]: Telemetry Disabled does not allow using local AI model / key #2713

@MassimoRovitti

Description

Current Behavior

As per the 0.13.1 release notes:

BYOK Without Telemetry - Wave AI now works with bring-your-own-key and local models without requiring telemetry to be enabled

but when opening Wave AI, it shows the "Enable Telemetry and continue" prompt without offering any option to use the local Ollama models installed and running on localhost (Windows 11).

When telemetry is enabled, the local models are accessible.

As a note, in Settings -> General I have configured ai:preset to a local model (this was working fine with the previous AI widget).

Expected Behavior

Wave AI does not allow the use of local models without enabling telemetry, despite the statement in the release notes for version 0.13.1.

Steps To Reproduce

1 - Wave Terminal 0.13.1 installed on a Windows 11 machine
2 - Ollama installed on the same machine, up and running (models are accessible via other tools)
3 - When opening Wave AI, it is not possible to choose and use any local models unless telemetry is enabled
4 - A default local model is configured via ai:preset in settings.json
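
For reference, step 4 refers to a configuration along these lines. This is a sketch only: the preset key `ai@ollama`, the model name `llama3`, and the exact option names follow Wave's AI preset documentation but should be verified against the current schema for 0.13.1.

```json
{
  "ai@ollama": {
    "display:name": "Ollama - llama3",
    "ai:baseurl": "http://localhost:11434/v1",
    "ai:model": "llama3",
    "ai:apitoken": "ollama"
  }
}
```

with the preset selected in settings.json via `"ai:preset": "ai@ollama"`. The base URL assumes Ollama's OpenAI-compatible endpoint on its default port 11434.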

Wave Version

0.13.1

Platform

Windows

OS Version/Distribution

Windows 11 25H2

Architecture

x64

Anything else?

No response

Questionnaire

  • I'm interested in fixing this myself but don't know where to start
  • I would like to fix and I have a solution
  • I don't have time to fix this right now, but maybe later

Metadata

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working), triage (Needs triage)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests

    Issue actions