Fix Ollama fetch errors and improve error handling #402

Open
bhouston opened this issue May 7, 2025 · 0 comments
Labels: enhancement (New feature or request), ollama (Issues related to Ollama integration)

bhouston commented May 7, 2025

Ollama Fetch Error Improvements

Background

This is a follow-up to issue #401 where a user reported "TypeError: fetch failed" errors when using MyCoder with Ollama.

Proposed Changes

  1. Update the Ollama npm package to the latest version (currently using 0.5.14, latest is 0.5.15)
  2. Improve error handling in the OllamaProvider class to provide more informative error messages:
    • Add specific error handling for network connectivity issues
    • Provide clearer error messages when the Ollama server is unreachable
    • Handle large model loading failures more gracefully
  3. Add a health check for the Ollama server before attempting to use it
  4. Add documentation on troubleshooting Ollama connectivity issues
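The error-message improvements in change 2 could be factored into a small pure helper that maps low-level failure strings to actionable guidance. This is a hedged sketch, not code from the repository; the function name and message wording are illustrative:

```typescript
// Hypothetical helper for proposed change 2: translate low-level fetch/network
// errors into user-friendly, actionable messages. Names are illustrative.
export function classifyOllamaError(error: unknown): string {
  // Narrow the unknown value before reading .message
  const message = error instanceof Error ? error.message : String(error);

  if (message.includes('fetch failed') || message.includes('ECONNREFUSED')) {
    return 'Failed to connect to Ollama server. Please ensure Ollama is running and accessible.';
  }
  if (message.includes('ETIMEDOUT') || message.includes('timeout')) {
    return 'Connection to the Ollama server timed out. The server may be overloaded or still loading a large model.';
  }
  // Fall back to surfacing the original message
  return `Ollama request failed: ${message}`;
}
```

Keeping the classification in one pure function would also make the "error messages are user-friendly" item in the testing plan easy to unit-test without a running server.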

Implementation Plan

  1. Update the Ollama dependency in packages/agent/package.json
  2. Enhance the OllamaProvider class with better error handling:
    try {
      // Make the API request using the Ollama client
      const response: OllamaChatResponse = await this.client.chat({
        ...requestOptions,
        stream: false,
      });
      // Process response...
    } catch (error) {
      // Enhanced error handling: narrow the unknown catch variable before
      // reading .message
      const message = error instanceof Error ? error.message : String(error);
      if (message.includes('fetch failed')) {
        throw new Error(
          'Failed to connect to Ollama server. Please ensure Ollama is running and accessible.',
        );
      }
      // Handle other specific errors...
      throw error;
    }
  3. Add a health check method to verify Ollama server status
  4. Update documentation with troubleshooting tips
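Step 3's health check could be a lightweight probe of the Ollama HTTP API before any chat request is made. A minimal sketch, assuming the default Ollama endpoint (`http://localhost:11434`) and its `/api/tags` route (which lists installed models and is cheap to call); the function name and timeout default are hypothetical:

```typescript
// Hypothetical health check for implementation step 3: verify the Ollama
// server is reachable before attempting a chat request.
export async function checkOllamaHealth(
  baseUrl = 'http://localhost:11434',
  timeoutMs = 2000,
): Promise<boolean> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    // /api/tags lists installed models; a 200 means the server is up
    const response = await fetch(`${baseUrl}/api/tags`, {
      signal: controller.signal,
    });
    return response.ok;
  } catch {
    return false; // unreachable, connection refused, or timed out
  } finally {
    clearTimeout(timer);
  }
}
```

Running this check up front lets the provider raise the friendly "ensure Ollama is running" message immediately instead of surfacing a raw "TypeError: fetch failed" mid-request.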

Testing Plan

  • Test with Ollama server running/not running
  • Test with various network configurations
  • Test with different model sizes
  • Verify error messages are user-friendly and actionable

Related Issues

  • #401 - "TypeError: fetch failed" errors when using MyCoder with Ollama