Summary
Please add native ollama provider support to memos-local-openclaw for both:
- embedding
- summarizer
This would make true fully-local MemOS deployments much easier.
Why this matters
memos-local-openclaw is positioned as a local memory solution, so users naturally expect to run both:
- local embedding model
- local summarizer model
In practice, Ollama is one of the most common local model runtimes for this use case.
Right now, the local plugin supports:
- embedding providers such as `local`, `openai_compatible`, etc.
- summarizer providers such as `openai`, `openai_compatible`, `anthropic`, `gemini`, etc.
But there is no native ollama provider.
That means users who want a real fully-local setup cannot configure Ollama directly in the official plugin.
Expected support
Embedding
Support Ollama embedding via:
`POST /api/embed`
Example:
```json
{
  "model": "bge-m3:latest",
  "input": "some text"
}
```

Summarizer
Support Ollama text generation via:
`POST /api/generate`
Example:
```json
{
  "model": "llama3.1:latest",
  "prompt": "summarize this text...",
  "stream": false,
  "options": {
    "temperature": 0
  }
}
```

Suggested config shape
Embedding
```json
{
  "embedding": {
    "provider": "ollama",
    "endpoint": "http://127.0.0.1:11434",
    "model": "bge-m3:latest"
  }
}
```

Summarizer
```json
{
  "summarizer": {
    "provider": "ollama",
    "endpoint": "http://127.0.0.1:11434",
    "model": "llama3.1:latest",
    "temperature": 0
  }
}
```

Viewer / Settings UI
It would also be helpful if the viewer settings page supported:
- provider = `ollama`
- endpoint
- model
- temperature (for summarizer)
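To make the mapping concrete, here is a rough sketch of how the suggested config shape could translate into Ollama requests. This is an illustration, not the plugin's actual code: the function names (`build_embed_request`, `build_generate_request`, `post_json`) are hypothetical, and only the endpoint paths and payload fields come from Ollama's documented API.

```python
# Hypothetical sketch: how an "ollama" provider could turn the suggested
# config shape into Ollama API requests. Names are illustrative only.
import json
import urllib.request


def build_embed_request(cfg: dict, text: str) -> tuple[str, dict]:
    """Map the suggested embedding config onto POST /api/embed."""
    url = cfg["endpoint"].rstrip("/") + "/api/embed"
    payload = {"model": cfg["model"], "input": text}
    return url, payload


def build_generate_request(cfg: dict, prompt: str) -> tuple[str, dict]:
    """Map the suggested summarizer config onto POST /api/generate."""
    url = cfg["endpoint"].rstrip("/") + "/api/generate"
    payload = {
        "model": cfg["model"],
        "prompt": prompt,
        "stream": False,  # ask for one JSON response instead of a stream
        "options": {"temperature": cfg.get("temperature", 0)},
    }
    return url, payload


def post_json(url: str, payload: dict) -> dict:
    """Send a request to a locally running Ollama (requires a live server)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

For example, calling `build_embed_request` with the embedding config above and the text `"some text"` produces exactly the `POST /api/embed` payload shown in the Expected support section.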
Validation
We locally patched the plugin and verified that this works in practice with:
- embedding: `bge-m3:latest`
- summarizer: `llama3.1:latest`
The patched flow successfully completed:
- summarize
- embed
- write to MemOS SQLite
- retrieve via search / recall
So this is not just a theoretical request — it already works as a practical implementation direction.
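For anyone exploring the same direction, the response side is also straightforward to consume. The field names below (`embeddings` for `/api/embed`, `response` for `/api/generate`) follow Ollama's current API, but the helper names are hypothetical and this is a sketch, not our patch:

```python
# Hypothetical helpers for unpacking Ollama responses. Field names follow
# Ollama's API (verify against the version you run); helper names are made up.

def extract_embedding(resp: dict) -> list:
    # /api/embed returns {"embeddings": [[...], ...]} -- one vector per
    # input, so a single-input request yields a single inner list.
    return resp["embeddings"][0]


def extract_summary(resp: dict) -> str:
    # /api/generate with "stream": false returns the whole completion in
    # the "response" field of a single JSON object.
    return resp["response"]
```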
Benefits
- true fully-local MemOS deployment
- better privacy
- less dependency on external APIs
- better alignment with the “local memory” positioning of the plugin
If maintainers are interested, I can also help provide a patch / PR direction for the Ollama provider integration.