
[Question]: Can't add LLM API #4431

Open
ChengYan-Huang opened this issue Jan 9, 2025 · 3 comments
Labels
question Further information is requested

Comments

@ChengYan-Huang

Describe your problem

Fail to access embedding model (embedding-2) using this api key. ERROR: Request Timeout
Fail to access embedding model (embedding-3) using this api key. ERROR: Request Timeout
Fail to access model (glm-3-turbo) using this api key. ERROR: Request Timeout
Fail to access model (glm-4) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-0520) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-9b) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-air) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-airx) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-flash) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-flashx) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-long) using this api key. ERROR: Request Timeout
Fail to access model (glm-4-plus) using this api key. ERROR: Request Timeout
Could this be related to Docker's bridge networking, preventing the container from accessing the network?
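
A quick way to test that hypothesis is to compare the subnets Docker assigned to its networks with the intranet address range; when they overlap, containers cannot reach intranet hosts. A minimal sketch, assuming a standard compose deployment (the network names here are assumptions; check `docker network ls` for the real ones):

```bash
# List the networks Docker created; a compose deployment adds one per project.
docker network ls

# Print the subnet of the default bridge (normally 172.17.0.0/16).
docker network inspect bridge --format '{{range .IPAM.Config}}{{.Subnet}} {{end}}'

# "docker_ragflow" is an assumed compose network name; substitute the real one.
docker network inspect docker_ragflow --format '{{range .IPAM.Config}}{{.Subnet}} {{end}}'

# If any of these subnets overlaps the intranet address range, that alone can
# explain timeouts from inside the containers.
```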

@ChengYan-Huang ChengYan-Huang added the question Further information is requested label Jan 9, 2025
@JinHai-CN
Contributor

We intend to create an international community, so we encourage using English for communication.

@JinHai-CN JinHai-CN changed the title [Question]: After deploying with Docker and logging into the page, attempting to add the LLM API fails with a timeout [Question]: Can't add LLM API Jan 9, 2025
@JinHai-CN
Contributor

  1. How do you deploy RAGFlow and the model?
  2. Can you access the model directly?
  3. Check whether you can access the model from inside the Docker container (see the sketch below).
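
For points 2 and 3, running the same request on the Docker host and again inside the RAGFlow server container usually isolates the problem. The container name and endpoint below are assumptions; substitute the real values from `docker ps` and your model deployment:

```bash
# On server B (the Docker host): can the host reach the model API on server A?
# <server-A-ip> and the port/path are placeholders for your GLM-4 endpoint.
curl -m 10 http://<server-A-ip>:8000/v1/models

# The same request from inside the RAGFlow server container
# ("ragflow-server" is an assumed container name).
docker exec -it ragflow-server curl -m 10 http://<server-A-ip>:8000/v1/models

# Host succeeds but container times out => container networking (bridge subnet
# overlap, FORWARD firewall rules, or DNS), not the API key, is the problem.
```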

@ChengYan-Huang
Author

  1. First of all, I deployed Zhipu's GLM-4 model on server A following the official documentation and exposed its API interface. I then deployed RAGFlow on server B, also following the official documentation.
  2. I tested the API on server A and confirmed that I can call it to access GLM-4.
  3. I can enter the container, but curl cannot reach any network address from inside it (see the sketch after this comment).
     In addition, both servers are on the intranet, and the model API can be called successfully from both servers.
     (screenshot attached)
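
If curl fails for every address inside the container while both hosts can reach the model directly, the usual causes are a Docker subnet that collides with the intranet range or host firewall rules dropping forwarded container traffic. A hedged sketch of checks and one possible remedy follows; the address pool below is only an example, pick a range that does not overlap your intranet:

```bash
# IP forwarding must be enabled on the Docker host for bridged containers.
sysctl net.ipv4.ip_forward

# Look for firewall rules that drop traffic forwarded from the Docker bridge.
sudo iptables -L FORWARD -n -v | head -n 20

# If the Docker subnets overlap the intranet, move newly created networks to a
# different pool via /etc/docker/daemon.json (example values only):
#   { "default-address-pools": [ { "base": "192.168.200.0/24", "size": 24 } ] }
# Then restart Docker and recreate the compose stack (run from the RAGFlow
# compose directory) so the project network is rebuilt:
sudo systemctl restart docker
docker compose down && docker compose up -d
```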
