Package org.springframework.ai.ollama
Class OllamaChatModel
java.lang.Object
org.springframework.ai.ollama.OllamaChatModel
- All Implemented Interfaces:
org.springframework.ai.chat.model.ChatModel, org.springframework.ai.chat.model.StreamingChatModel, org.springframework.ai.model.Model<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>, org.springframework.ai.model.StreamingModel<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>
ChatModel implementation for Ollama. Ollama allows developers to run
large language models and generate embeddings locally. It supports open-source models
available on [Ollama AI Library](...) and on
Hugging Face. Please refer to the official Ollama
website for the most up-to-date information on available models.
- Since:
- 1.0.0
- Author:
- Christian Tzolov, luocongqiu, Thomas Vitale, Jihoon Kim, Alexandros Pappas, Ilayaperumal Gopinathan, Sun Yuhan
-
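The description above can be sketched as a minimal usage example. This is a sketch, not official documentation: it assumes a local Ollama server on the default port (http://localhost:11434) and that the `mistral` model has already been pulled; the base URL and model name are illustrative.

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;

public class OllamaChatExample {
    public static void main(String[] args) {
        // Build the chat model via its fluent builder.
        // Base URL and model name are illustrative assumptions.
        OllamaChatModel chatModel = OllamaChatModel.builder()
                .ollamaApi(OllamaApi.builder().baseUrl("http://localhost:11434").build())
                .defaultOptions(OllamaOptions.builder().model("mistral").build())
                .build();

        // Blocking, single-response call
        ChatResponse response = chatModel.call(new Prompt("Tell me a joke"));
        System.out.println(response.getResult().getOutput().getText());
    }
}
```

The builder is the recommended construction path; the constructors listed below are available when finer control over tool calling, observation, or retry behavior is needed.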
Nested Class Summary
Nested Classes:
- static class OllamaChatModel.Builder
Constructor Summary
Constructors:
- OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions)
- OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions, org.springframework.ai.model.tool.ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate, org.springframework.retry.support.RetryTemplate retryTemplate)
Method Summary
- static OllamaChatModel.Builder builder()
- org.springframework.ai.chat.model.ChatResponse call(org.springframework.ai.chat.prompt.Prompt prompt)
- org.springframework.ai.chat.prompt.ChatOptions getDefaultOptions()
- void setObservationConvention(org.springframework.ai.chat.observation.ChatModelObservationConvention observationConvention): Use the provided convention for reporting observation data
- reactor.core.publisher.Flux<org.springframework.ai.chat.model.ChatResponse> stream(org.springframework.ai.chat.prompt.Prompt prompt)
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.springframework.ai.chat.model.ChatModel:
call, call
Methods inherited from interface org.springframework.ai.chat.model.StreamingChatModel:
stream, stream
-
Constructor Details
-
OllamaChatModel
public OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions) -
OllamaChatModel
public OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, org.springframework.ai.model.tool.ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions, org.springframework.ai.model.tool.ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate, org.springframework.retry.support.RetryTemplate retryTemplate)
-
-
Method Details
-
builder
public static OllamaChatModel.Builder builder()
-
call
public org.springframework.ai.chat.model.ChatResponse call(org.springframework.ai.chat.prompt.Prompt prompt)
- Specified by:
call in interface org.springframework.ai.chat.model.ChatModel
- Specified by:
call in interface org.springframework.ai.model.Model<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>
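A sketch of calling this method with per-request options, assuming an already-constructed OllamaChatModel named `chatModel`; the model name and temperature are illustrative assumptions:

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.api.OllamaOptions;

// Options passed in the Prompt override the model's defaults (illustrative values)
Prompt prompt = new Prompt(
        "Summarize the Ollama project in one sentence.",
        OllamaOptions.builder().model("llama3").temperature(0.4).build());

ChatResponse response = chatModel.call(prompt);
System.out.println(response.getResult().getOutput().getText());
```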
-
stream
public reactor.core.publisher.Flux<org.springframework.ai.chat.model.ChatResponse> stream(org.springframework.ai.chat.prompt.Prompt prompt)
- Specified by:
stream in interface org.springframework.ai.chat.model.ChatModel
- Specified by:
stream in interface org.springframework.ai.chat.model.StreamingChatModel
- Specified by:
stream in interface org.springframework.ai.model.StreamingModel<org.springframework.ai.chat.prompt.Prompt,org.springframework.ai.chat.model.ChatResponse>
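A streaming sketch, assuming an already-constructed OllamaChatModel named `chatModel` and Reactor on the classpath; the prompt text is illustrative:

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import reactor.core.publisher.Flux;

// Tokens arrive incrementally as a Flux of partial ChatResponse chunks
Flux<ChatResponse> flux = chatModel.stream(new Prompt("Write a haiku about Java"));

flux.mapNotNull(r -> r.getResult() != null ? r.getResult().getOutput().getText() : null)
    .doOnNext(System.out::print)
    .blockLast(); // blocking is for demo purposes; prefer reactive composition in real code
```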
-
getDefaultOptions
public org.springframework.ai.chat.prompt.ChatOptions getDefaultOptions()
- Specified by:
getDefaultOptions in interface org.springframework.ai.chat.model.ChatModel
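A brief sketch of inspecting the defaults configured at construction time, assuming an already-constructed OllamaChatModel named `chatModel`:

```java
import org.springframework.ai.chat.prompt.ChatOptions;

// Returns the default options supplied when the model was built
ChatOptions defaults = chatModel.getDefaultOptions();
System.out.println(defaults.getModel());
```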
-
setObservationConvention
public void setObservationConvention(org.springframework.ai.chat.observation.ChatModelObservationConvention observationConvention)
Use the provided convention for reporting observation data.
- Parameters:
observationConvention - the provided convention
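A sketch of supplying a convention, assuming an already-constructed OllamaChatModel named `chatModel` and that the framework's default convention class is acceptable as a starting point:

```java
import org.springframework.ai.chat.observation.DefaultChatModelObservationConvention;

// Replace the convention used when reporting observation data to Micrometer
chatModel.setObservationConvention(new DefaultChatModelObservationConvention());
```

In practice a custom ChatModelObservationConvention implementation would be passed here to control observation names and key-values.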
-