
[Question]:how to set default LLM and chat models for Ragflow? #4614

Open
AstronautKkkkkkkk opened this issue Jan 23, 2025 · 1 comment
Labels
question Further information is requested

Comments

@AstronautKkkkkkkk

Describe your problem

Currently, every time a new account is created, I have to reconfigure the LLM and chat model settings. I want to add a system-wide default so that newly registered users can start using it immediately. How should I implement this? I noticed that the documentation only covers API- and key-based configurations, but my models are deployed locally with Ollama. How can I add those?

@AstronautKkkkkkkk AstronautKkkkkkkk added the question Further information is requested label Jan 23, 2025
@KevinHuSh
Collaborator

Besides the configuration, you may also need to modify the code here.

[Image: screenshot of the referenced code]
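For reference, a default model for new accounts can likely be supplied through RAGFlow's service configuration rather than per-user settings. A minimal sketch, assuming the `user_default_llm` section of `docker/service_conf.yaml` accepts an Ollama factory and base URL (field names and values here are assumptions, not verified against this version of the codebase):

```yaml
# Hypothetical default-model configuration for newly registered accounts.
# Section and field names are assumptions based on RAGFlow's service_conf.yaml.
user_default_llm:
  factory: 'Ollama'        # provider backing the default models (assumed factory name)
  api_key: 'ollama'        # Ollama requires no real key; placeholder value
  base_url: 'http://host.docker.internal:11434'  # where the local Ollama server listens
```

After editing the file, the RAGFlow server would need a restart for new registrations to pick up the default; as noted in the reply, already-created accounts may additionally require a code change.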
