
[Bug] Lobe does not support deep thinking feature for ollama deepseek r1 models #5591

Open
ShinChven opened this issue Jan 25, 2025 · 2 comments
Labels
🐛 Bug Something isn't working | 缺陷 ollama Relative to Ollama Provider and ollama models

Comments

@ShinChven

📦 Environment

Docker

📌 Version

v1.48.0

💻 Operating System

macOS

🌐 Browser

Chrome

🐛 Bug Description

When running deepseek-r1:8b via Ollama, Lobe displays the <think> tag as plain text instead of triggering the deep thinking feature, as it does when calling the official DeepSeek API. Additionally, there is a padding issue at the top of the message box when the <think> tag is displayed, as shown in the screenshot below.

[Screenshot: <think> tag rendered as plain text, with extra padding at the top of the message box]
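The behavior described above suggests the raw Ollama output arrives with the reasoning wrapped in literal <think>…</think> tags, which the client then has to split out before rendering. A minimal illustration of that split on a completed response (the function name and return shape are hypothetical, not Lobe Chat's actual code):

```typescript
// Illustrative sketch only: separates the reasoning wrapped in
// <think>…</think> from the remaining answer text.
interface ParsedResponse {
  reasoning: string | null; // content of the <think> block, if present
  answer: string;           // response text with the <think> block removed
}

function splitThinkTag(raw: string): ParsedResponse {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) return { reasoning: null, answer: raw.trim() };
  return {
    reasoning: match[1].trim(),
    answer: raw.replace(match[0], '').trim(),
  };
}
```

A client that does this can route `reasoning` to a collapsible "deep thinking" panel and `answer` to the normal message body, instead of printing the tag verbatim.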

📷 Recurrence Steps

No response

🚦 Expected Behavior

No response

📝 Additional Information

No response

@ShinChven ShinChven added the 🐛 Bug Something isn't working | 缺陷 label Jan 25, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Jan 25, 2025
@lobehubbot (Member)

👀 @ShinChven

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


dosubot bot commented Jan 25, 2025

The issue you're experiencing with the <think> tag not triggering the 'deep thinking' feature for the Ollama DeepSeek R1 model might be due to differences in how the feature is implemented or integrated for different AI providers. The <think> tag is rendered by the Render component in the src/features/Conversation/components/MarkdownElements/LobeThinking/Render.tsx file, which interacts with the chat interface to manage the display of the tag's content [1].

For the padding issue at the top of the message box, the padding is defined in the ActionBar component within the src/features/ChatInput/ActionBar/index.tsx file. You can adjust the padding by changing the value of the padding prop when using the ActionBar component [2].

To resolve the issue with the 'deep thinking' feature, you may need to check the integration logic for the Ollama DeepSeek R1 model and ensure it aligns with how the feature is triggered for the official DeepSeek API. This might involve reviewing the configuration files or settings that define how different AI models and their features are integrated into Lobe, which are located in the src/config/aiModels/index.ts file [3].
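Since Ollama responses are usually streamed, any such fix also has to cope with a <think> tag that is split across chunk boundaries. One way to do that is a small stateful filter that buffers just enough text to classify it. This is an illustrative sketch only; the class and its fields are hypothetical, not part of Lobe Chat:

```typescript
// Illustrative sketch: a stateful filter for streamed chunks that routes
// text inside <think>…</think> to `reasoning` and the rest to `answer`,
// even when a tag is split across chunk boundaries.
class ThinkTagStream {
  private buffer = '';
  private inThink = false;
  reasoning = '';
  answer = '';

  push(chunk: string): void {
    this.buffer += chunk;
    for (;;) {
      if (!this.inThink) {
        const open = this.buffer.indexOf('<think>');
        if (open === -1) {
          // Emit all but a tail that could still be a partial opening tag.
          const safe = Math.max(0, this.buffer.length - '<think>'.length + 1);
          this.answer += this.buffer.slice(0, safe);
          this.buffer = this.buffer.slice(safe);
          return;
        }
        this.answer += this.buffer.slice(0, open);
        this.buffer = this.buffer.slice(open + '<think>'.length);
        this.inThink = true;
      } else {
        const close = this.buffer.indexOf('</think>');
        if (close === -1) {
          // Emit all but a tail that could still be a partial closing tag.
          const safe = Math.max(0, this.buffer.length - '</think>'.length + 1);
          this.reasoning += this.buffer.slice(0, safe);
          this.buffer = this.buffer.slice(safe);
          return;
        }
        this.reasoning += this.buffer.slice(0, close);
        this.buffer = this.buffer.slice(close + '</think>'.length);
        this.inThink = false;
      }
    }
  }

  // Flush whatever remains once the stream ends.
  end(): void {
    if (this.inThink) this.reasoning += this.buffer;
    else this.answer += this.buffer;
    this.buffer = '';
  }
}
```

The held-back tail is what prevents a chunk ending in `<thi` from being flushed to the visible answer before the rest of the tag arrives.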


@dosubot dosubot bot added the ollama Relative to Ollama Provider and ollama models label Jan 25, 2025
@arvinxx arvinxx marked this as a duplicate of #5530 Jan 26, 2025