Class OllamaChatModel

java.lang.Object
org.springframework.ai.ollama.OllamaChatModel
All Implemented Interfaces:
org.springframework.ai.chat.model.ChatModel, org.springframework.ai.chat.model.StreamingChatModel, org.springframework.ai.model.Model<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>, org.springframework.ai.model.StreamingModel<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>

public class OllamaChatModel extends Object implements org.springframework.ai.chat.model.ChatModel
ChatModel implementation for Ollama. Ollama allows developers to run large language models and generate embeddings locally. It supports open-source models available on [Ollama AI Library](...) and on Hugging Face. Please refer to the official Ollama website for the most up-to-date information on available models.
Since:
1.0.0
Author:
Christian Tzolov, luocongqiu, Thomas Vitale, Jihoon Kim, Alexandros Pappas, Ilayaperumal Gopinathan, Sun Yuhan
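Instances are typically obtained through the static builder() method rather than the constructors directly. The following is a minimal usage sketch, not a definitive recipe: the ollamaApi/defaultOptions builder methods, the OllamaApi and OllamaOptions builders, the model name, and the accessor chain on ChatResponse are assumptions about the surrounding Spring AI API, and a local Ollama server is assumed to be running on the default port.

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;

public class OllamaChatModelExample {

    public static void main(String[] args) {
        // Assumed: OllamaApi exposes a builder with a baseUrl(..) setter;
        // http://localhost:11434 is the default Ollama endpoint.
        OllamaApi ollamaApi = OllamaApi.builder()
                .baseUrl("http://localhost:11434")
                .build();

        // Assumed builder method names (ollamaApi, defaultOptions); the
        // model name "llama3.2" is only an illustration.
        OllamaChatModel chatModel = OllamaChatModel.builder()
                .ollamaApi(ollamaApi)
                .defaultOptions(OllamaOptions.builder()
                        .model("llama3.2")
                        .temperature(0.7)
                        .build())
                .build();

        // call(..) performs a blocking request and returns a ChatResponse.
        ChatResponse response = chatModel.call(new Prompt("Why is the sky blue?"));
        System.out.println(response.getResult().getOutput().getText());
    }
}
```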
  • Nested Class Summary

    Nested Classes
    Modifier and Type
    Class
    Description
    static final class
    OllamaChatModel.Builder
     
  • Constructor Summary

    Constructors
    Constructor
    Description
    OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions)
     
    OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions, org.springframework.ai.model.tool.ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate, org.springframework.retry.support.RetryTemplate retryTemplate)
     
  • Method Summary

    Modifier and Type
    Method
    Description
    static OllamaChatModel.Builder
    builder()
     
    org.springframework.ai.chat.model.ChatResponse
    call(org.springframework.ai.chat.prompt.Prompt prompt)
     
    org.springframework.ai.chat.prompt.ChatOptions
    getDefaultOptions()
     
    void
    setObservationConvention(org.springframework.ai.chat.observation.ChatModelObservationConvention observationConvention)
    Use the provided convention for reporting observation data.
    reactor.core.publisher.Flux<org.springframework.ai.chat.model.ChatResponse>
    stream(org.springframework.ai.chat.prompt.Prompt prompt)
     

    Methods inherited from class java.lang.Object

    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

    Methods inherited from interface org.springframework.ai.chat.model.ChatModel

    call, call

    Methods inherited from interface org.springframework.ai.chat.model.StreamingChatModel

    stream, stream
  • Constructor Details

    • OllamaChatModel

      public OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions)
    • OllamaChatModel

      public OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions, org.springframework.ai.model.tool.ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate, org.springframework.retry.support.RetryTemplate retryTemplate)
  • Method Details

    • builder

      public static OllamaChatModel.Builder builder()
    • call

      public org.springframework.ai.chat.model.ChatResponse call(org.springframework.ai.chat.prompt.Prompt prompt)
      Specified by:
      call in interface org.springframework.ai.chat.model.ChatModel
      Specified by:
      call in interface org.springframework.ai.model.Model<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>
    • stream

      public reactor.core.publisher.Flux<org.springframework.ai.chat.model.ChatResponse> stream(org.springframework.ai.chat.prompt.Prompt prompt)
      Specified by:
      stream in interface org.springframework.ai.chat.model.ChatModel
      Specified by:
      stream in interface org.springframework.ai.chat.model.StreamingChatModel
      Specified by:
      stream in interface org.springframework.ai.model.StreamingModel<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>
    • getDefaultOptions

      public org.springframework.ai.chat.prompt.ChatOptions getDefaultOptions()
      Specified by:
      getDefaultOptions in interface org.springframework.ai.chat.model.ChatModel
    • setObservationConvention

      public void setObservationConvention(org.springframework.ai.chat.observation.ChatModelObservationConvention observationConvention)
      Use the provided convention for reporting observation data.
      Parameters:
      observationConvention - the convention to use
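Where call(..) blocks for the full response, stream(..) returns a reactive Flux that emits partial ChatResponse chunks as the model generates them. A self-contained streaming sketch under the same assumptions as above (builder method names, a local Ollama server on the default port, and the ChatResponse accessor chain are not confirmed by this page):

```java
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;

public class OllamaStreamingExample {

    public static void main(String[] args) {
        // Assumed builder methods; the constructors above show the full set
        // of collaborators (tool-calling manager, observation registry, ...)
        // that can also be supplied.
        OllamaChatModel chatModel = OllamaChatModel.builder()
                .ollamaApi(OllamaApi.builder()
                        .baseUrl("http://localhost:11434")
                        .build())
                .build();

        // Each emitted ChatResponse carries an incremental piece of the
        // generation; mapNotNull skips chunks without text content.
        chatModel.stream(new Prompt("Tell me a short joke."))
                .mapNotNull(r -> r.getResult() == null
                        ? null
                        : r.getResult().getOutput().getText())
                .doOnNext(System.out::print)
                .blockLast(); // blocking here only to keep the demo alive
    }
}
```

In application code you would normally subscribe to the Flux instead of calling blockLast(), letting the surrounding reactive pipeline manage backpressure and lifecycle.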