For Ollama installation and usage, see: Deepseek Windows Installation and Getting Started
Note: this setup requires Spring Boot 3.x + JDK 17.
Core dependencies to select when creating the project:
spring-boot-starter-web
spring-ai-ollama-spring-boot-starter
For details, see the snippet below (versions change over time, so update accordingly):
<properties>
    <java.version>17</java.version>
    <spring-ai.version>1.0.0-M6</spring-ai.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <optional>true</optional>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
</dependencies>
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>${spring-ai.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
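Note: milestone builds such as `1.0.0-M6` are published to the Spring milestone repository rather than Maven Central. If the BOM fails to resolve, adding the repository may help (a sketch; unnecessary once you move to a GA release):

```xml
<repositories>
    <repository>
        <id>spring-milestones</id>
        <name>Spring Milestones</name>
        <url>https://repo.spring.io/milestone</url>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>
</repositories>
```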
Configure the Ollama server address and port, plus the default model:
spring:
  application:
    name: demo-boot-ollama
  ai:
    ollama:
      init:
        pull-model-strategy: never # by default, do not pull models that are missing locally
      base-url: http://192.168.31.162:11434
      chat:
        options:
          model: deepseek-r1:8b
For more parameters, see the Base Properties section of Ollama Chat :: Spring AI Reference.
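If you would rather have Spring AI download a missing model automatically at startup, the pull strategy can be switched (a sketch; `when_missing` pulls only models not already present on the Ollama host, `always` re-pulls every time):

```yaml
spring:
  ai:
    ollama:
      init:
        pull-model-strategy: when_missing
```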
import jakarta.annotation.Resource;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class OllamaController {

    @Resource
    OllamaChatModel ollamaChatModel;

    @GetMapping(value = "/ai/ollama/default")
    public Object ollama(@RequestParam(value = "msg") String msg) {
        String called = ollamaChatModel.call(msg);
        System.out.println(called);
        return called;
    }

    @GetMapping(value = "/ai/ollama/custom")
    public Object ollamaConfig(@RequestParam(value = "msg") String msg) {
        ChatResponse chatResponse = ollamaChatModel.call(new Prompt(msg,
                OllamaOptions.builder()
                        .model("deepseek-r1:14b") // which model to use
                        .temperature(0.4) // temperature: higher values give more random output, lower values more deterministic output
                        .build()
        ));
        System.out.println(chatResponse.getResult().getOutput().getText());
        return chatResponse.getResult().getOutput().getText();
    }
}
Two simple chat implementations: the first talks to the default model configured in application.yml; the second sets the model and other options per request.
For more Spring Boot Ollama configuration, see: Ollama Chat :: Spring AI Reference
Browser access test:
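Assuming the app runs on Spring Boot's default port 8080 (an assumption; host and port depend on your setup), the endpoints can be hit from a browser or any HTTP client. Note that non-ASCII query text must be URL-encoded; a small sketch of building a valid test URL:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class UrlDemo {
    // Build a test URL for the default chat endpoint; the msg query
    // parameter is percent-encoded so Chinese text survives the request.
    static String encodedUrl(String msg) {
        return "http://localhost:8080/ai/ollama/default?msg="
                + URLEncoder.encode(msg, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(encodedUrl("你好"));
        // → http://localhost:8080/ai/ollama/default?msg=%E4%BD%A0%E5%A5%BD
    }
}
```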
Another demo:
import java.util.Map;
import jakarta.annotation.Resource;
import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class ChatController {

    /**
     * Inject the model configured in application.yml
     */
    @Resource
    OllamaChatModel chatModel;

    @GetMapping("/ai/generate")
    public Map<String, String> generate(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return Map.of("generation", this.chatModel.call(message));
    }

    // Streaming response
    @GetMapping("/ai/generateStream")
    public Flux<ChatResponse> generateStream(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        Prompt prompt = new Prompt(new UserMessage(message));
        return this.chatModel.stream(prompt);
    }
}
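When `generateStream` is requested with `Accept: text/event-stream`, WebFlux serves the `Flux` as server-sent events, one `data:` line per chunk. A minimal JDK-only sketch of pulling those payloads out of a raw SSE body (the payload contents shown are illustrative, not the exact `ChatResponse` JSON):

```java
import java.util.ArrayList;
import java.util.List;

public class SsePayloads {
    // Extract payloads from a raw SSE stream body: each event carries a
    // line starting with "data:", and blank lines separate events.
    static List<String> payloads(String raw) {
        List<String> out = new ArrayList<>();
        for (String line : raw.split("\n")) {
            if (line.startsWith("data:")) {
                out.add(line.substring(5).trim());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        String raw = "data: {\"text\":\"Hello\"}\n\ndata: {\"text\":\" world\"}\n";
        System.out.println(payloads(raw));
    }
}
```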
Chat from a unit test, requesting structured (JSON-formatted) output:
@Resource
OllamaChatModel ollamaChatModel;

@Test
public void chat() throws JsonProcessingException {
    String jsonSchema = """
            {
                "type": "object",
                "properties": {
                    "steps": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "explanation": { "type": "string" },
                                "output": { "type": "string" }
                            },
                            "required": ["explanation", "output"],
                            "additionalProperties": false
                        }
                    },
                    "final_answer": { "type": "string" }
                },
                "required": ["steps", "final_answer"],
                "additionalProperties": false
            }
            """;
    Prompt prompt = new Prompt("8x + 7 = -23, solve for x",
            OllamaOptions.builder()
                    // .model(OllamaModel.LLAMA3_2.getName())
                    .model("deepseek-r1:8b")
                    .format(new ObjectMapper().readValue(jsonSchema, Map.class))
                    .build());
    ChatResponse response = this.ollamaChatModel.call(prompt);
    System.out.println(response.getResult().getOutput().getText());
}
Sample execution output:
The final computed answer is correct.
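The model's answer is easy to verify by hand:

```java
public class CheckAnswer {
    public static void main(String[] args) {
        // 8x + 7 = -23  →  8x = -30  →  x = -30 / 8
        double x = (-23 - 7) / 8.0;
        System.out.println(x); // -3.75
    }
}
```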
http://blog.xqlee.com/article/2502181214317191.html