api_error communicating with ollama #3
Originally created by @ard0gg on GitHub (Oct 12, 2025).
Peeling the onion. After implementing the update from #3, I now get an API error when trying to communicate with Ollama. This also occurs in 0.2.2 if the vector dimensions are left at the default. See log below
@ard0gg commented on GitHub (Oct 13, 2025):
This may be an issue with my Ollama instance. I am running intelanalytics/ipex-llm-inference-cpp-xpu:latest with an Intel Arc B580. I think this is the log from my Ollama instance associated with one of the "async_openai::client: Server error" messages above.

@perstarkse commented on GitHub (Oct 13, 2025):
The cause of the error seems to be an error in Ollama. At this point in the ingestion process it's sending an embedding request to the AI backend. I'm not too experienced at reading those logs, but could it be that you're running the embeddinggemma model?
In the Minne application you have to set the embedding dimensions to match your model's output. Some newer models can produce output at any dimension within a supported range.
During testing yesterday I had success with the nomic-embed-text model for embeddings.
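One way to check for this kind of mismatch is to request a single embedding from Ollama and compare the vector length against the dimensions configured in Minne. The sketch below is illustrative rather than Minne's actual ingestion code: it assumes Ollama's OpenAI-compatible endpoint at http://localhost:11434/v1 and uses the async-openai crate, the same client that appears in the "Server error" messages above.

```rust
// Minimal sketch (not Minne's actual code): ask Ollama for one embedding
// and print its dimensionality, to compare against the vector dimensions
// configured in Minne.
//
// Assumed Cargo.toml dependencies:
//   async-openai = "0.24"
//   tokio = { version = "1", features = ["full"] }
use async_openai::{config::OpenAIConfig, types::CreateEmbeddingRequestArgs, Client};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Ollama exposes an OpenAI-compatible API under /v1; the API key is
    // ignored by Ollama, but the client requires one to be set.
    let config = OpenAIConfig::new()
        .with_api_base("http://localhost:11434/v1")
        .with_api_key("ollama");
    let client = Client::with_config(config);

    let request = CreateEmbeddingRequestArgs::default()
        .model("nomic-embed-text") // the model that worked in this thread
        .input("dimension check")
        .build()?;

    let response = client.embeddings().create(request).await?;
    let dims = response.data[0].embedding.len();
    println!("model returns {dims}-dimensional vectors");
    // This number must match the embedding dimensions set in Minne's config.
    Ok(())
}
```

If the printed length differs from the value configured in Minne, or if the backend returns a server error before producing a vector at all (as the IPEX-LLM Ollama build appears to do here with embeddinggemma), ingestion fails at this step.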
@perstarkse commented on GitHub (Oct 13, 2025):
And I really do appreciate you trying this out. I think the configuration documentation has room for improvement, and a FAQ/troubleshooting section could be good.
@ard0gg commented on GitHub (Oct 13, 2025):
Good catch! At some point in my experimentation I had switched from nomic-embed-text to embeddinggemma for the embedding model. Switching back to nomic-embed-text seemed to solve the issue I was having.
Thank you for your responses. I'm excited to have gotten this working and look forward to seeing how it works for me.