docker-model-runner-langchain4j

A minimal Java + HTML + JS chat app that shows how to use Docker Model Runner as a local LLM backend — fully integrated with LangChain4j.

Runs locally, uses your GPU (if available), and doesn't depend on any cloud APIs. It looks like this:

Chat App Screenshot


Features

  • 🧠 LangChain4j + local LLMs via Docker
  • 🐳 Uses Docker Model Runner (lets you pull gemma3, phi4, DeepSeek, ...)
  • ⚡ GPU acceleration (if available)
  • 🔂 Chat memory retains 15 messages
  • 🍃 Pure Java + vanilla JS + basic HTML
  • 🧰 Minimal dependencies: Jackson + LangChain4j only

Prerequisites

  • Docker Desktop with Model Runner support
  • A JDK
  • Maven or Gradle


🚀 Quickstart

1. Enable Docker Model Runner

Go to Docker Desktop settings and enable the following:

Settings → Features in Development → Enable Model Runner
Enable host-side TCP support (default port: 12434)

Model Runner Settings
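
Once both toggles are on, Model Runner's OpenAI-compatible API is reachable from the host on port 12434. If you want to sanity-check it from Java before wiring up the app, a minimal sketch like the one below should do (the /models path follows the OpenAI API convention and is not part of this repo):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Standalone sanity check (not part of this repo): lists the models that
// Docker Model Runner exposes on its OpenAI-compatible endpoint.
public class ModelRunnerCheck {
    public static void main(String[] args) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:12434/engines/llama.cpp/v1/models"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode()); // 200 means the endpoint is reachable
        System.out.println(response.body());       // JSON list of pulled models
    }
}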


2. Pull a Model

docker model pull ai/gemma3

or any other model from the Docker Hub AI catalog.

Once a model is pulled, it can be run at any time. Verify with:

docker model list

3. Build & Run

Maven

mvn clean compile exec:java

Gradle

./gradlew run

Now open http://localhost:8080 and start chatting.
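
The app itself is a small Java server that serves the HTML/JS chat page and forwards messages to the model. Purely as an illustration of what a zero-dependency server on port 8080 can look like (the repo's actual ChatServer.java is structured differently, and index.html is an assumed file name):

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;

// Illustrative sketch only: a dependency-free Java server on port 8080
// that serves a static chat page.
public class StaticPageSketch {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/", exchange -> {
            byte[] page = Files.readAllBytes(Path.of("index.html")); // assumed file name
            exchange.getResponseHeaders().add("Content-Type", "text/html; charset=utf-8");
            exchange.sendResponseHeaders(200, page.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(page);
            }
        });
        server.start();
        System.out.println("Chat UI available at http://localhost:8080");
    }
}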


💡 Where the Integration Happens

If you're curious where the connection to the local model is made, it's in ChatServer.java:

OpenAiChatModel.builder()
    .apiKey("not needed")
    .baseUrl("http://localhost:12434/engines/llama.cpp/v1")
    .modelName("ai/gemma3") // or any pulled model
    .build();

It uses the Docker Model Runner's OpenAI-compatible endpoint (on port 12434) to route requests to your local model.
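
To see how this fits together with the 15-message chat memory listed in the features, here is a minimal sketch assuming a recent LangChain4j 1.x (older versions name some builder methods differently, e.g. chatLanguageModel instead of chatModel). The Assistant interface and its chat method are illustrative, not the repo's exact code:

import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

// Sketch only: wires the local model into an AI service with a 15-message
// sliding-window memory, as described in the features list above.
public class LocalChatSketch {

    // Illustrative interface; the repo's actual service definition may differ.
    interface Assistant {
        String chat(String userMessage);
    }

    public static void main(String[] args) {
        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey("not needed")                                   // Model Runner ignores the key
                .baseUrl("http://localhost:12434/engines/llama.cpp/v1") // OpenAI-compatible endpoint
                .modelName("ai/gemma3")                                 // or any pulled model
                .build();

        Assistant assistant = AiServices.builder(Assistant.class)
                .chatModel(model)                                        // chatLanguageModel(...) in older versions
                .chatMemory(MessageWindowChatMemory.withMaxMessages(15)) // keep the last 15 messages
                .build();

        System.out.println(assistant.chat("Hello, local model!"));
    }
}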


Changing Models

To use a different model:

docker model pull ai/llama3.2:latest

Then update the code:

.modelName("ai/llama3.2:latest")

And run the app again.
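
If you switch models often, one option is to read the model name from an environment variable instead of editing the code each time. This is only a suggestion; MODEL_NAME is not something the repo defines:

// Optional convenience (not in the repo): pick the model via an environment
// variable, falling back to ai/gemma3 when MODEL_NAME is not set.
String modelName = System.getenv().getOrDefault("MODEL_NAME", "ai/gemma3");

OpenAiChatModel model = OpenAiChatModel.builder()
        .apiKey("not needed")
        .baseUrl("http://localhost:12434/engines/llama.cpp/v1")
        .modelName(modelName)
        .build();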

DevoxxGenie usage

DevoxxGenie Setup

Install the DevoxxGenie plugin in IntelliJ IDEA via the JetBrains Marketplace.
