Self Checks
I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
Please do not modify this template :) and fill in all the required fields.
Dify version
v0.15.1
Cloud or Self Hosted
Self Hosted (Source)
Steps to reproduce
I created a chatflow via Dify, but the LLM node doesn't work properly. Specifically, I encountered two issues:
1. The memory function is not working properly. With the memory function turned on, an error about the "result_text" variable is raised on the second turn of the chat. With it off, everything is fine. I have pasted the error from the Dify API service below.
2. Some of the LLM nodes don't fetch a response from the LLM. The chatflow originally contained one LLM node (the one mentioned in the first issue), which worked well except for the memory function. When I added another LLM node using the same LLM provider, it didn't get a response properly: sometimes part of the stream data comes back from the LLM followed by an error, and sometimes nothing comes back and an error is raised directly. No error detail is given in either case. Even when I set up the new node with the same prompt, parameters, and input data as the first node, it still didn't work. I tried deleting the model provider and adding it again, but nothing changed. I checked the vLLM logs (the model is a self-hosted OpenAI-compatible service based on vLLM) but found nothing unusual. I have pasted the error shown in the web frontend and the vLLM logs below. I was asking the LLM to extract the keyword from my query about the weather.
✔️ Expected Behavior
The memory function performs normally, and all LLM nodes sharing the same model provider work normally.
❌ Actual Behavior
The memory function stops working from the second turn of a chat onward, and only one specific LLM node works properly.
Memory Function Error: The error related to the "result_text" variable suggests that it might not be assigned a value before being accessed. This could be a bug in the workflow script where the memory function is not properly handling the variable assignment. You might want to check the specific lines in graph_engine.py, node.py in the base directory, and node.py in the llm directory where the error is occurring.
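As a rough sketch of the kind of bug that produces an error like this (this is not Dify's actual source; only the `result_text` name comes from the report), a variable assigned solely inside a streaming loop raises `UnboundLocalError` when the loop body never runs, e.g. when the upstream memory/history handling yields an empty stream on the second turn:

```python
# Hypothetical illustration of the suspected pattern, not Dify's code.
# "result_text" is the variable named in the reported error; everything
# else here is a stand-in.

def collect_buggy(stream):
    for chunk in stream:
        result_text = chunk   # assigned only if the stream yields at least once
    return result_text        # UnboundLocalError when the stream was empty

def collect_fixed(stream):
    result_text = ""          # defensive default before the loop
    for chunk in stream:
        result_text += chunk
    return result_text

print(collect_fixed([]))      # "" — safe on an empty stream
print(collect_buggy([]))      # raises UnboundLocalError
```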
LLM Node Response Issue: When multiple LLM nodes use the same model provider, there might be conflicts or issues with how the nodes are configured or how they interact with the model provider. Since the error message is "[openai_api_compatible] Error" and no detailed error is provided, it could be related to the configuration of the self-hosted OpenAI compatible service based on vLLM. Double-check the configuration settings and ensure that the model provider is correctly set up to handle multiple requests.
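To rule the endpoint itself in or out, it may also help to stream from the vLLM server directly with the openai client, bypassing Dify entirely. The base_url, api_key, and model name below are placeholders for your own deployment:

```python
# Quick check of the vLLM OpenAI-compatible endpoint outside Dify.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # your vLLM server address
    api_key="EMPTY",                       # vLLM accepts any key by default
)

stream = client.chat.completions.create(
    model="your-model-name",               # as served by vLLM
    messages=[{"role": "user", "content": "Extract the keyword: what's the weather in Berlin?"}],
    stream=True,
)

# Print the streamed deltas as they arrive.
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```

If this streams cleanly for several consecutive requests, the problem is more likely in the node or provider configuration on the Dify side than in the vLLM service.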
For both issues, reviewing the error handling strategies in Dify might help manage these exceptions more effectively. You can configure error handling to retry on failure or execute a fail branch to continue the workflow despite errors [2].
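For intuition only (the real configuration lives in Dify's node settings, not in code), here is a minimal Python sketch of what the two strategies amount to: a retry re-attempts the failing call a few times, while a fail branch routes to fallback logic so the workflow can continue despite the error:

```python
import time

# Illustrative sketch of the two error-handling strategies; names and
# parameters here are assumptions, not Dify's API.

def run_with_retry(call, max_retries=3, interval=1.0):
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise                # out of retries, propagate the error
            time.sleep(interval)     # wait before the next attempt

def run_with_fail_branch(call, fallback):
    try:
        return call()                # normal branch
    except Exception:
        return fallback()            # fail branch keeps the workflow going
```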