Ollama4j is a lightweight Java library that simplifies interaction with an Ollama server, letting developers integrate locally hosted large language models into Java applications with minimal friction. It provides a clean, developer-friendly API that abstracts the underlying REST communication, so users can focus on building features rather than low-level networking details.

The library covers the core Ollama capabilities: chat-based interactions, text generation, embeddings, and model management. This makes it suitable for both simple applications and more advanced AI-driven systems. It integrates into Java projects as a Maven or Gradle dependency and runs on standard Java environments from JDK 11 onward. Ollama4j also supports advanced features such as tool (function) calling, which lets models interact with external systems and execute more complex workflows.
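Since the library is consumed as a Maven or Gradle dependency, adding it typically looks like the following sketch. The coordinates shown (`io.github.ollama4j:ollama4j`) are those published on Maven Central, but verify them and substitute the latest release version for the placeholder:

```xml
<!-- Goes in the <dependencies> section of pom.xml.
     Replace the version placeholder with the latest ollama4j release. -->
<dependency>
    <groupId>io.github.ollama4j</groupId>
    <artifactId>ollama4j</artifactId>
    <version>LATEST_VERSION</version>
</dependency>
```

The equivalent Gradle coordinate would be `implementation 'io.github.ollama4j:ollama4j:LATEST_VERSION'`.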
Features
- Clean Java API for interacting with an Ollama server
- Support for chat, text generation, and embeddings
- Model lifecycle management including pull and delete
- Tool and function calling for advanced workflows
- Multimodal support with image-based interactions
- Seamless integration via Maven or Gradle dependencies
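To illustrate the shape of the API described above, here is a minimal text-generation sketch. It is not authoritative: the class name `OllamaAPI`, the `generate` signature, and the `OllamaResult` response type follow recent ollama4j releases, but exact signatures vary between versions, so consult the documentation for the release you depend on. It also assumes an Ollama server running locally on the default port with the `llama3` model already pulled.

```java
// Sketch of basic text generation with ollama4j.
// Names follow recent ollama4j releases but may differ in your version;
// an Ollama server must be running locally for this to work.
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;

public class QuickStart {
    public static void main(String[] args) throws Exception {
        // Point the client at the local Ollama server (default port 11434).
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

        // Optional: check that the server is reachable before issuing requests.
        if (!ollamaAPI.ping()) {
            System.err.println("Ollama server is not reachable");
            return;
        }

        // Generate a completion from a locally available model.
        OllamaResult result = ollamaAPI.generate(
                "llama3",                      // model name (must be pulled already)
                "Why is the sky blue?",        // prompt
                new OptionsBuilder().build()); // default generation options

        System.out.println(result.getResponse());
    }
}
```

The same client object is the entry point for the other features listed above (chat, embeddings, model pull/delete, and tool calling), each exposed through its own set of methods.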