
1ru.5: Integrate llama.cpp as a first-class local agent runtime#12

Draft
penso wants to merge 1 commit into main from beads-polyphony-1ru.5

Conversation


@penso penso commented Mar 16, 2026

Automated handoff for 1ru.5.

Issue:
Base branch: main
Head branch: beads-polyphony-1ru.5
Commit: ff95ce6


penso commented Mar 16, 2026

Summary

This branch adds a new llama.cpp agent runtime and reuses the existing OpenAI-compatible chat loop for it. The overall shape is reasonable, but I found a few correctness gaps in wiring and config handling that are likely to surface in real use. I reviewed the diff and surrounding code only; I did not run the full Rust test/lint suite.

Risks

Recommended human checks

  • Exercise a real llama.cpp run that emits a tool call and confirm the event stream contains ToolCallCompleted, not unsupported_tool_call.
  • Try three config variants (kind: llama, kind: llama_cpp, and an explicit transport: llama_cpp) and verify all three select the new runtime.
  • Test both startup modes explicitly: a pre-running remote/local server with no spawn command, and local auto-spawn via llama-server with and without a configured model path.

