OpenWebUI doesn't connect to FastMLX #35
Comments
I haven't had a chance to submit proper PRs back to this repo, but in the meantime you can check my fork, where I've implemented OpenWebUI support and some other things I needed.
@SwagMuffinMcYoloPants thanks for bringing this up. I'm working on a major release of MLX-VLM, and this weekend I will be updating FastMLX with lots of goodies. I can add OpenWebUI support.
@viljark feel free to propose the changes you want and open a PR with the OpenWebUI support from your fork.
@Blaizzy hey, any update on the goodies 😊
Not yet. I started scoping it, but porting Florence-2 to MLX-VLM had higher priority. If you can help me with an initial PR, I would appreciate it and take it from there.
I was messing with LMStudio's local server and connecting it to OpenWebUI, and it was really convenient to connect the two. I don't really need the LMStudio application and would love to just use FastMLX instead. However, when I try to add FastMLX as an OpenAI connection in OpenWebUI, it doesn't return the list of models. Would it be possible to get FastMLX working with OpenWebUI?
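For context on the failure above: when OpenWebUI is pointed at an OpenAI-compatible server, it discovers models by calling `GET /v1/models` and parsing the OpenAI list format. A minimal sketch of the payload such an endpoint would need to return is below; the function name, the `owned_by` value, and the example model ID are illustrative assumptions, not FastMLX's actual code.

```python
# Hypothetical sketch (not FastMLX's actual implementation): the JSON
# body an OpenAI-compatible GET /v1/models endpoint returns, which
# OpenWebUI parses to populate its model dropdown.

def models_payload(model_ids):
    """Build an OpenAI-style model list from a list of model IDs."""
    return {
        "object": "list",
        "data": [
            # Each entry must at least carry "id" and "object": "model";
            # "created" and "owned_by" values here are placeholders.
            {"id": mid, "object": "model", "created": 0, "owned_by": "fastmlx"}
            for mid in model_ids
        ],
    }

# Example with an illustrative MLX community model name:
print(models_payload(["mlx-community/Meta-Llama-3-8B-Instruct-4bit"]))
```

In a FastAPI app like FastMLX, this payload would simply be returned from a route registered at `/v1/models`, so OpenWebUI's model discovery request succeeds.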