Replies: 2 comments
-
Same question. If I want to use an Ollama service right now, I have to rewrite llmchain, because Ollama returns its output in chunks and things only work when stream=false is set. Adding that one parameter means rewriting llmchain. How should I go about this? (For what the non-streaming call looks like, see the sketch below.)
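A minimal sketch of that non-streaming call against Ollama's HTTP API, assuming a default local install; the URL and model name are placeholders, not anything from this thread:

```python
import requests

# Minimal sketch: call a local Ollama server with stream=False so the
# whole completion comes back as one JSON object instead of chunks.
# The URL and model name are assumptions; adjust to your deployment.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ollama_generate(prompt: str, model: str = "llama2") -> str:
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # With stream=False the full text arrives in the "response" field.
    return resp.json()["response"]

print(ollama_generate("Why is the sky blue?"))
```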
-
All mainstream open-source inference frameworks are now supported~
-
Does it support calling a local Ollama service (without rt installed)? How should the skill be configured? I'm using customllm + conversationchain, but the chat in the frontend just keeps spinning, and docker logs bisheng-backend reports the error below:


Ollama itself is fine (it works with fastgpt).
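For reference, a minimal sketch of a custom LangChain LLM wrapper that forces stream=false so it can be dropped into a ConversationChain. The class, endpoint, and model name here are my own placeholders under the classic langchain LLM interface, not bisheng's actual customllm configuration:

```python
from typing import Any, List, Optional

import requests
from langchain.llms.base import LLM
from langchain.chains import ConversationChain


class OllamaLLM(LLM):
    """Sketch of a custom LLM that calls a local Ollama server with
    stream=False so the whole completion arrives in one response.
    Endpoint and model are assumed values; change them as needed."""

    endpoint: str = "http://localhost:11434/api/generate"
    model: str = "llama2"

    @property
    def _llm_type(self) -> str:
        return "ollama_non_streaming"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        **kwargs: Any,
    ) -> str:
        resp = requests.post(
            self.endpoint,
            json={"model": self.model, "prompt": prompt, "stream": False},
            timeout=300,
        )
        resp.raise_for_status()
        # Non-streaming responses carry the full text in "response".
        return resp.json()["response"]


# Usage: the wrapper plugs into ConversationChain like any other LLM.
chain = ConversationChain(llm=OllamaLLM())
print(chain.predict(input="hello"))
```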