
"TypeError: fetch failed" when trying to use ollama #401


Open
temujin9 opened this issue May 7, 2025 · 1 comment

temujin9 commented May 7, 2025

mycoder -i --githubMode false --provider ollama --model medragondot/Sky-T1-32B-Preview:latest --logLevel verbose 
Detected 3 browsers on the system
Available browsers:
- google-chrome-stable (chromium) at /usr/bin/google-chrome-stable
- google-chrome (chromium) at /usr/bin/google-chrome
- Firefox (firefox) at /usr/bin/firefox

Type your request below or 'help' for usage information. Use Ctrl+C to exit.

> describe this repo

MyCoder v1.6.0 - AI-powered coding assistant
LLM: ollama/medragondot/Sky-T1-32B-Preview:latest

Interactive correction mode enabled. Press Ctrl+M to send a correction to the agent.
An error occurred:
TypeError: fetch failed
TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11360:11)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at post (file:///usr/local/lib/node_modules/mycoder/node_modules/ollama/dist/browser.mjs:131:20)
    at Ollama.processStreamableRequest (file:///usr/local/lib/node_modules/mycoder/node_modules/ollama/dist/browser.mjs:277:22)
    at OllamaProvider.generateText (file:///usr/local/lib/node_modules/mycoder/node_modules/mycoder-agent/src/core/llm/providers/ollama.ts:102:42)
    at toolAgent (file:///usr/local/lib/node_modules/mycoder/node_modules/mycoder-agent/src/core/toolAgent/toolAgentCore.ts:142:45)
    at executePrompt (file:///usr/local/lib/node_modules/mycoder/src/commands/$default.ts:187:20)
    at Object.handler (file:///usr/local/lib/node_modules/mycoder/src/commands/$default.ts:280:5)
0
[Token Usage Total] Root: input: 0 cache-writes: 0 cache-reads: 0 output: 0 COST: $0.00
Forcing exit after 5000ms timeout
node --version
v18.19.1
ollama list
NAME                                     ID              SIZE     MODIFIED    
medragondot/Sky-T1-32B-Preview:latest    80155ae4b31c    19 GB    2 hours ago    

bhouston commented May 7, 2025

Issue Triage Report: "TypeError: fetch failed" with Ollama

Issue Classification

  • Type: Bug
  • Recommended Labels: bug, ollama

Initial Assessment

The issue provides sufficient information to understand the problem. The user is experiencing a "TypeError: fetch failed" error when attempting to use MyCoder with Ollama and the medragondot/Sky-T1-32B-Preview:latest model.

Investigation Findings

Potential Causes

  1. Network Connectivity Issues:

    • The fetch error indicates a failure to connect to the Ollama server (a minimal connectivity probe is sketched after this list)
    • This could be due to Ollama not running, network restrictions, or firewall rules
  2. Node.js Fetch Compatibility:

    • The user is running Node.js v18.19.1
    • Node.js 18 ships a built-in fetch (undici), which is what throws here according to the stack trace; there may be compatibility issues between it and the Ollama package's fetch usage
    • The Ollama npm package version used in MyCoder is 0.5.14, while the latest is 0.5.15
  3. Ollama Server Status:

    • The user verified that the model is available (ollama list shows it)
    • However, the Ollama server may still be failing to accept or handle requests
  4. Large Model Issues:

    • Sky-T1-32B-Preview is a large model (19 GB)
    • Loading or running a model of this size can hit resource constraints or timeouts
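
A minimal connectivity probe to separate cause 1 from the others: this is a sketch, assuming the default Ollama host http://localhost:11434 and Node.js ≥ 18 (global fetch via undici); the OLLAMA_BASE_URL variable name is taken from the suggestion below and may not match MyCoder's actual configuration.

```typescript
// Connectivity probe for an Ollama server: a sketch, assuming the default
// host http://localhost:11434 and Node.js >= 18 (global fetch via undici).
const host = process.env.OLLAMA_BASE_URL ?? 'http://localhost:11434';

async function probeOllama(): Promise<void> {
  try {
    // /api/tags lists installed models and is a cheap liveness check.
    const res = await fetch(`${host}/api/tags`);
    console.log(`Ollama reachable at ${host} (HTTP ${res.status})`);
  } catch (err) {
    // undici wraps the underlying network error: a bare "TypeError: fetch
    // failed" carries the real reason (ECONNREFUSED, ETIMEDOUT, ...) in err.cause.
    console.error('Ollama unreachable:', (err as Error).cause ?? err);
  }
}

probeOllama();
```

If this prints ECONNREFUSED in the cause, the server is simply not listening where MyCoder is looking, which matches the bare "fetch failed" in the report.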

Duplication Check

No similar issues were found specifically for Ollama fetch errors.

Recommended Next Steps

  1. For the User:

    • Verify the Ollama server is running and accessible: curl http://localhost:11434/api/tags
    • Check Ollama server logs for any errors
    • Try a smaller model to check whether the issue is specific to this 19 GB model
    • Try setting the OLLAMA_BASE_URL environment variable explicitly
  2. For Development:

    • Update the Ollama npm package to the latest version (0.5.15)
    • Improve error handling in the OllamaProvider class to provide more informative error messages
    • Add a health check for the Ollama server before attempting to use it (a sketch follows this list)
    • Consider adding documentation on troubleshooting Ollama connectivity issues
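
One way the health check could look: a hedged sketch, not the actual mycoder-agent code. checkOllamaServer and its wiring into OllamaProvider are assumptions; only `new Ollama({ host })` and `client.list()` come from the documented ollama npm package API.

```typescript
// Hedged sketch of a pre-flight health check for the Ollama provider.
// checkOllamaServer is a hypothetical helper, not existing MyCoder code.
import { Ollama } from 'ollama';

async function checkOllamaServer(host: string): Promise<void> {
  const client = new Ollama({ host });
  try {
    // list() is a cheap request that fails fast if the server is down.
    await client.list();
  } catch (err) {
    // Surface the underlying network error instead of a bare "fetch failed".
    throw new Error(
      `Cannot reach Ollama at ${host}. Is the Ollama server running ` +
        `("ollama serve")? Underlying error: ${(err as Error).cause ?? err}`,
    );
  }
}
```

Running this before the first generateText call would turn the opaque "TypeError: fetch failed" into an actionable message.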

Complexity Assessment

This issue appears to be of medium complexity. While the root cause is likely network connectivity or Node.js fetch compatibility, resolving it may also require changes to the Ollama provider implementation for better error handling and diagnostics.

Estimated Timeline

  • Investigation and fix: 1-2 days
  • Testing: 1 day

Please let us know if you can provide any additional information that might help with resolving this issue.
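
In particular, a standalone reproduction outside MyCoder would help isolate whether the failure is in MyCoder's provider code or in the ollama package itself. A sketch, assuming an ESM context (for top-level await) and using only the documented ollama-js chat API with the model tag from your ollama list output:

```typescript
// Standalone reproduction attempt outside MyCoder, using the documented
// ollama npm package API. Run in an ESM context (top-level await).
import { Ollama } from 'ollama';

const client = new Ollama({ host: 'http://localhost:11434' });

const response = await client.chat({
  model: 'medragondot/Sky-T1-32B-Preview:latest',
  messages: [{ role: 'user', content: 'Say hello.' }],
});
console.log(response.message.content);
```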

bhouston added the bug (Something isn't working) and ollama (Issues related to Ollama integration) labels on May 7, 2025