
LLM error #190


Open
whale-fall-0803 opened this issue May 9, 2025 · 2 comments

Comments

@whale-fall-0803

Error: reading streaming LLM response: doRequest: error sending request: Post "https://generativelanguage.googleapis.com//v1beta/models/gemini-2.5-pro-preview-03-25:streamGenerateContent?alt=sse": dial tcp 142.251.215.234:443: i/o timeout

How should I handle the above issue?
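(For anyone hitting the same error: "dial tcp ... i/o timeout" means the TCP connection to generativelanguage.googleapis.com never completed, so this is a connectivity/firewall/proxy problem rather than an API error. A quick reachability check, assuming curl is available; the 10-second timeout is an arbitrary choice:)

```shell
# Verify the API host is reachable at all from this machine.
# A hang or timeout here reproduces the problem outside kubectl-ai.
curl -v --max-time 10 https://generativelanguage.googleapis.com/
```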

@rsea2z

rsea2z commented May 9, 2025

Same here, and the log seems normal:
/tmp/kubectl-ai.log

Log file created at: 2025/05/09 18:28:37
Running on machine: areszz
Binary: Built with gc go1.24.1 for linux/amd64
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
I0509 18:28:37.896898 3382599 main.go:327] "Application started" pid=3382599
I0509 18:28:37.903974 3382599 conversation.go:78] "Created temporary working directory" workDir="/tmp/agent-workdir-510886778"
I0509 18:29:11.741031 3382599 conversation.go:133] "Starting chat loop for query:" query="list /root"
I0509 18:29:11.741097 3382599 conversation.go:146] "Starting iteration" iteration=0

By the way, I run it through proxychains (proxychains ./kubectl-ai). That may cause problems, but it works most of the time. I tried:

proxychains curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=GEMINI_API_KEY" \
-H 'Content-Type: application/json' \
-X POST \
-d '{
  "contents": [{
    "parts":[{"text": "Explain how AI works"}]
    }]
   }'

without any problem
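One possible explanation for the intermittent failures: proxychains works by LD_PRELOAD-hooking libc socket calls, and Go binaries are often statically linked and bypass libc, so interception can be unreliable. Go's net/http honors the standard proxy environment variables, so a sketch of an alternative worth trying, assuming a local HTTP proxy at 127.0.0.1:7890 (hypothetical address; adjust to your setup):

```shell
# Go's net/http reads HTTPS_PROXY/HTTP_PROXY, so routing kubectl-ai
# through the proxy does not require proxychains at all.
# 127.0.0.1:7890 is a placeholder for your local proxy address.
export HTTPS_PROXY=http://127.0.0.1:7890
export HTTP_PROXY=http://127.0.0.1:7890
./kubectl-ai
```

If this works where proxychains times out, that points at the LD_PRELOAD interception, not at kubectl-ai itself.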

@droot
Member

droot commented May 9, 2025

Can you please try without proxychains and confirm whether it works for you?

Curious about the benefits of running with proxychains.
