Description
I have been using Docker Model Runner for GGUF-based models for some time now. I have been using the following URL to access it:
http://192.168.2.42:12434/engines/llama.cpp/v1/
This URL has been working well using WebUI as well as my own chat application.
I recently installed the diffusers runner and stable-diffusion on my machine. Unfortunately, since installing them, no application has been able to connect to my Docker Model Runner instance.
The only way I can access DMR is with a curl command provided to me by @ilopezluna:
curl -s -X POST http://localhost:12434/engines/diffusers/v1/images/generations -H "Content-Type: application/json" -d '{
"model": "ai/stable-diffusion",
"prompt": "A cat sitting on a couch",
"size": "512x512"
}'
I believe this returns an image (it does return a lot of data), though I cannot confirm from curl alone that the data is actually an image.
This doesn't matter much, because if I move to another machine and run the same curl command against my machine:
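One way to check whether the curl output really is an image, assuming the diffusers engine returns an OpenAI-style Images API payload with a base64-encoded `b64_json` field (an assumption; I have not confirmed DMR's actual response schema), is to decode the bytes and check for the PNG file signature. The sample response below is a hypothetical stand-in, not real DMR output:

```python
import base64
import json

# Hypothetical response shaped like the OpenAI Images API;
# DMR's diffusers engine may use a different schema.
sample_response = json.dumps({
    "data": [
        {"b64_json": base64.b64encode(
            b"\x89PNG\r\n\x1a\n" + b"\x00" * 16  # fake PNG payload for illustration
        ).decode("ascii")}
    ]
})

resp = json.loads(sample_response)
img_bytes = base64.b64decode(resp["data"][0]["b64_json"])

# PNG files always start with the 8-byte signature \x89PNG\r\n\x1a\n
is_png = img_bytes.startswith(b"\x89PNG\r\n\x1a\n")
print(is_png)
```

In practice you would pipe the real curl output into a script like this (or into `jq -r '.data[0].b64_json' | base64 -d > out.png`) and open the resulting file.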
curl -s -X POST http://192.168.2.42:12434/engines/diffusers/v1/images/generations -H "Content-Type: application/json" -d '{
"model": "ai/stable-diffusion",
"prompt": "A cat sitting on a couch",
"size": "512x512"
}'
I still cannot connect to the machine.
I have also tried running WebUI on the same machine, using:
http://localhost:12434/engines/llama.cpp/v1/
and
http://localhost:12434/engines/diffusers/v1/images/generations
and I still get connection errors.
I suspect this is a bug in DMR's runner. At this point, I cannot use it at all.