Replies: 5 comments 2 replies
-
I have the Kimi and DeepSeek official providers configured, but the model list is still empty.
-
Same for me. No models available, although I have many configured.
-
Restart the API server?
-
API server restarted, still no models. I'm on Cherry 1.7.1, macOS Tahoe 26.1, MacBook Pro M1, 16 GB.
-
For the time being, you can use LiteLLM in proxy mode to create a local Anthropic-API-compatible endpoint pointing to whatever model you want. My example creates a litellm-config.yaml:
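The original poster's configuration did not survive in this thread, so the following is only a minimal sketch of what a LiteLLM proxy config of this kind could look like. The alias `my-claude-alias` and the DeepSeek backend are assumptions for illustration, not the poster's actual settings:

```yaml
# litellm-config.yaml — hypothetical sketch, not the original reply's config.
# LiteLLM's proxy serves an Anthropic-compatible /v1/messages endpoint,
# so any model listed here can be reached by an Anthropic-style client.
model_list:
  - model_name: my-claude-alias          # placeholder name the client will request
    litellm_params:
      model: deepseek/deepseek-chat      # provider/model in LiteLLM's naming scheme
      api_key: os.environ/DEEPSEEK_API_KEY  # read the key from the environment
```

You would then start the proxy with `litellm --config litellm-config.yaml` and point the client's Anthropic base URL at the proxy (port 4000 by default).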
-
Is the "Assistant Library" in the interface still exactly the same as the original?