
Using GPT 4, but Assistent says its GPT 3 #2575

Open
Creveoolus opened this issue Jan 16, 2025 · 6 comments
Assignees
Labels
bug Something isn't working

Comments

@Creveoolus

Bug description

  1. Use any GPT-4 model (4, 4o, 4o-mini)
  2. Ask it for its full model name
  3. It says that it is based on GPT-3.5, sometimes GPT-3
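
The steps above can be reproduced with a short script. This is a minimal sketch assuming g4f's OpenAI-compatible `Client` interface; the helper names and prompt wording are illustrative, not part of the report.

```python
# Minimal repro sketch (hedged): assumes g4f exposes an OpenAI-compatible
# Client; helper names and prompt wording are illustrative only.

PROBE = "What is your full model name and version?"

def probe_messages(prompt: str = PROBE) -> list:
    """Build the chat payload used to ask the model about itself."""
    return [{"role": "user", "content": prompt}]

def ask_model_identity(model: str = "gpt-4o") -> str:
    """Send the identity probe and return the model's answer."""
    # Imported lazily so the payload helper works without g4f installed.
    from g4f.client import Client

    client = Client()
    response = client.chat.completions.create(
        model=model,
        messages=probe_messages(),
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Network call; per the report, the answer often claims GPT-3.5.
    print(ask_model_identity("gpt-4o"))
```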

Screenshots

(screenshot attached)

Environment

  • python version: Python 3.11.9 (tags/v3.11.9:de54cf5, Apr 2 2024, 10:12:12) [MSC v.1938 64 bit (AMD64)] on win32
  • location: Russia

Additional context

@Creveoolus Creveoolus added the bug Something isn't working label Jan 16, 2025
@Creveoolus
Author

(screenshot attached)

@TheFirstNoob

@Creveoolus Hi, it is normal for the model to respond like this, because new models are trained with a layered approach. In fact, the model has new capabilities and an updated knowledge base and works as stated, namely GPT-4 plus the "o" variant.

@depis13

depis13 commented Jan 25, 2025

I think the openai-sentinel-turnstile-token header is not being generated correctly. That is why only GPT-3 is used now.

@hlohaus
Collaborator

hlohaus commented Jan 25, 2025

@depis13

I can't confirm that there is an issue with the models in the OpenAIChat provider. It's possible that the language model is providing inaccurate information (sometimes referred to as "hallucinating") or lacks sufficient knowledge on the topic. However, when I use the O1 model, the response format includes clear reasoning, which aligns with the expected behavior of the O1 model.
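
Since a chat model's self-description is often a hallucination, a less ambiguous signal is the `model` field that OpenAI-style response objects carry, which reports what the backend actually served. A hedged sketch; it assumes the response object follows the usual OpenAI conventions (`model` attribute plus `choices[0].message.content`):

```python
def identity_report(response) -> dict:
    """Contrast the backend-reported model id with the model's own claim.

    Assumes an OpenAI-style response object: a `model` attribute plus
    `choices[0].message.content`. Both are conventions, not guarantees.
    """
    return {
        "served_model": getattr(response, "model", None),
        "self_description": response.choices[0].message.content,
    }
```

If `served_model` names a GPT-4 variant while `self_description` claims GPT-3.5, that points to a self-report hallucination rather than the wrong model being served.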

@depis13

depis13 commented Jan 25, 2025

(screenshot attached)
@hlohaus I don't use this repository directly, but I reused its authentication-header generation in my own project. There I noticed that I can no longer receive links from ChatGPT. In the ChatGPT GUI it looks like the attached screenshot: the first request was made with the library, the second with a browser without automation. It is possible, of course, that I ported the code from this repository incorrectly, but up to a certain point my code was able to get links.

@hlohaus
Collaborator

hlohaus commented Jan 27, 2025

The issue is not with gpt4free; our responses include embedded links and a summarized list at the conclusion. It's possible your service provider is filtering these links from the response. @depis13
