Ollama.cs

Ollama API Integration

Overview

The Ollama class interfaces with an Ollama API endpoint to retrieve model details, such as the context window size. It also provides a static helper method for determining whether a given collection of models is served by an Ollama provider.

Classes

Ollama

  • Namespace: LLMHelperFunctions
  • Purpose: Facilitates interactions with the Ollama API to fetch model details.

Constructor

  • Ollama(Uri endpoint)
    Initializes a new instance of the Ollama class.
    • Parameters:
      • endpoint: The base URI of the Ollama API endpoint.
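
A minimal construction sketch, assuming a locally running Ollama server on its default port (11434); substitute your own endpoint URI as needed:

using System;
using LLMHelperFunctions;

// Assumption: Ollama is serving locally on its default port (11434).
Uri endpoint = new Uri("http://localhost:11434/");
var ollama = new Ollama(endpoint);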

Public Methods

  • Method: Show(string modelName, bool verbose = false)
    Retrieves details about a specified model from the Ollama API.

    • Parameters:
      • modelName: The name of the model to query.
      • verbose: When set to true, requests more detailed information (default is false).
    • Returns: A task that resolves to a dictionary mapping string keys to objects that describe the model.
    • Exceptions:
      • HttpRequestException if the HTTP request fails.
  • Static Method: IsOllama(OpenAIModelCollection availableModels)
    Determines whether the provided collection of OpenAI models indicates that the underlying provider is Ollama.

    • Parameters:
      • availableModels: A collection of OpenAI models.
    • Returns:
      • true if the provider is detected as Ollama (the first model’s OwnedBy property equals "library"); otherwise, false.
    • Exceptions:
      • OllamaException if the collection is empty.
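
A hedged usage sketch for IsOllama. It assumes OpenAIModelCollection comes from the OpenAI .NET library's OpenAI.Models namespace and that the collection has already been obtained from whichever OpenAI-compatible client the application uses; the ProviderCheck class and ReportProvider method below are illustrative, not part of this library.

using System;
using LLMHelperFunctions;
using OpenAI.Models;

public static class ProviderCheck
{
    // availableModels is assumed to come from an OpenAI-compatible client's model-listing call.
    public static void ReportProvider(OpenAIModelCollection availableModels)
    {
        try
        {
            bool isOllama = Ollama.IsOllama(availableModels);
            Console.WriteLine(isOllama
                ? "Provider detected as Ollama (first model is owned by \"library\")."
                : "Provider does not appear to be Ollama.");
        }
        catch (OllamaException ex)
        {
            // Thrown when the collection is empty.
            Console.Error.WriteLine($"Could not determine provider: {ex.Message}");
        }
    }
}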

OllamaException

  • Namespace: LLMHelperFunctions
  • Purpose: Represents errors that occur during interactions with the Ollama API.

Constructors

  • OllamaException(string message)
    Initializes a new instance with a specified error message.

  • OllamaException(string message, Exception innerException)
    Initializes a new instance with a specified error message and a reference to the inner exception.
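
A brief sketch showing how the inner-exception constructor might be used to wrap a lower-level failure; the PingAsync helper below is hypothetical and not part of this library:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using LLMHelperFunctions;

public static class OllamaExceptionExample
{
    // Hypothetical helper: wraps an HTTP failure in an OllamaException so callers
    // only have to handle the library's own exception type.
    public static async Task PingAsync(Uri endpoint)
    {
        using var client = new HttpClient();
        try
        {
            using var response = await client.GetAsync(endpoint);
            response.EnsureSuccessStatusCode();
        }
        catch (HttpRequestException ex)
        {
            // Preserve the original failure as the inner exception.
            throw new OllamaException($"Failed to reach the Ollama API at {endpoint}.", ex);
        }
    }
}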

Usage Examples

Retrieving Model Details from Ollama

using System;
using System.Net.Http;
using System.Threading.Tasks;
using LLMHelperFunctions;

public class OllamaExample
{
    public async Task Run()
    {
        // Base URI for your Ollama API endpoint
        Uri endpoint = new Uri("https://your-ollama-api-endpoint.com/");
        string modelName = "your-model-name";

        // Create an Ollama instance
        var ollama = new Ollama(endpoint);

        try
        {
            // Retrieve model details (with verbose output)
            var modelDetails = await ollama.Show(modelName, verbose: true);

            // Process the model details as needed
            Console.WriteLine("Model details received:");
            foreach (var kvp in modelDetails)
            {
                Console.WriteLine($"{kvp.Key}: {kvp.Value}");
            }
        }
        catch (HttpRequestException ex)
        {
            Console.Error.WriteLine($"HTTP error while retrieving model details: {ex.Message}");
        }
        catch (OllamaException ex)
        {
            Console.Error.WriteLine($"Error with Ollama API: {ex.Message}");
        }
    }
}