Showing posts from December, 2024

Spring AI Structured Output API

LLMs generate natural-language responses that humans understand easily, but applications cannot reliably communicate with each other in natural language. These responses must therefore be converted into machine-readable formats such as JSON, CSV, XML, or POJOs. Some LLM providers, including OpenAI, Google PaLM, and Anthropic (Claude), offer models that can understand an output schema and generate responses accordingly. We'll learn how Spring AI helps integrate with them through its framework.

Prerequisites

For this article, we'll use OpenAI's LLM services, so we'll need an active OpenAI subscription to invoke its APIs. Furthermore, we must import the Maven dependency that integrates the Spring Boot application with OpenAI: <dependency> <groupId>org.springframework.ai</groupId> <artifactId>spring-ai-openai-spring-boot-starter</artif...
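
As a quick taste of what the full post covers, here is a minimal sketch of Spring AI's structured output support. It assumes a Spring AI version where ChatClient and its entity(...) mapping are available; the MovieRecommendation record, the service class, and the prompt text are illustrative examples, not code taken from the post.

// Minimal sketch: map an LLM response onto a Java record via Spring AI.
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.stereotype.Service;

import java.util.List;

@Service
public class MovieRecommendationService {

    // Target type: Spring AI derives format instructions from it and appends
    // them to the prompt so the model returns matching JSON.
    public record MovieRecommendation(String title, int year, List<String> genres) {}

    private final ChatClient chatClient;

    public MovieRecommendationService(ChatModel chatModel) {
        // The auto-configured OpenAI ChatModel is injected by the starter.
        this.chatClient = ChatClient.builder(chatModel).build();
    }

    public MovieRecommendation recommend(String mood) {
        // entity(...) uses a structured output converter under the hood and
        // deserializes the model's JSON response into the record.
        return chatClient.prompt()
                .user("Recommend one movie for someone feeling " + mood)
                .call()
                .entity(MovieRecommendation.class);
    }
}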

Building AI Assistance Using Spring AI's Function Calling API

Building an AI assistant into existing legacy applications is gaining a lot of momentum. An AI assistant such as a chatbot can give users a unified experience and let them perform functionality across multiple modules through a single interface. In this article, we'll see how to leverage Spring AI to build an AI assistant, and we'll demonstrate how to seamlessly reuse existing application services and functions alongside LLM capabilities.

Function Calling Concept

An LLM can respond to an application request in multiple ways:

- The LLM responds from its training data
- The LLM looks for information provided in the prompt to answer the query
- The LLM has callback function information in the prompt that can help get the response

Let's try to understand the third option, Spring AI's Function Calling ...
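
To make the third option concrete, here is a minimal sketch of the function-calling idea, assuming a Spring AI milestone where a java.util.function.Function bean can be registered and referenced by name from ChatClient via functions(...) (newer releases move toward a tool-calling API). The names orderStatusFunction, OrderStatusRequest, OrderStatusResponse, and OrderAssistant are hypothetical and only for illustration.

// Minimal sketch: expose an existing application capability as a callable function.
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;
import org.springframework.stereotype.Service;

import java.util.function.Function;

@Configuration
class OrderAssistantConfig {

    record OrderStatusRequest(String orderId) {}
    record OrderStatusResponse(String orderId, String status) {}

    // The @Description text is what the model sees when deciding whether to
    // call this function; the body would normally delegate to an existing service.
    @Bean
    @Description("Get the current status of a customer order by its id")
    Function<OrderStatusRequest, OrderStatusResponse> orderStatusFunction() {
        return request -> new OrderStatusResponse(request.orderId(), "SHIPPED"); // stubbed lookup
    }
}

@Service
class OrderAssistant {

    private final ChatClient chatClient;

    OrderAssistant(ChatModel chatModel) {
        this.chatClient = ChatClient.builder(chatModel).build();
    }

    String ask(String userQuestion) {
        // The function name matches the bean name; the model decides whether to
        // invoke it, and the framework handles the request/response round trip.
        return chatClient.prompt()
                .user(userQuestion)
                .functions("orderStatusFunction")
                .call()
                .content();
    }
}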