Commit 146e8eb

Merge pull request #533 from matysek/lcore-399-docs

LCORE-399: Update README how to create custom image

2 parents 5e2bc4e + a066855

File tree: 1 file changed, +77 −0

README.md: 77 additions & 0 deletions
@@ -51,6 +51,7 @@ The service includes comprehensive user data collection capabilities for various
* [Llama-Stack as Separate Service (Server Mode)](#llama-stack-as-separate-service-server-mode)
* [Llama-Stack as Library (Library Mode)](#llama-stack-as-library-library-mode)
* [Verify it's running properly](#verify-its-running-properly)
* [Custom Container Image](#custom-container-image)
* [Endpoints](#endpoints)
* [OpenAPI specification](#openapi-specification)
* [Readiness Endpoint](#readiness-endpoint)
@@ -693,6 +694,82 @@ A simple sanity check:

```
curl -H "Accept: application/json" http://localhost:8080/v1/models
```

## Custom Container Image

The lightspeed-stack container image bundles many Python dependencies for common
Llama-Stack providers (when using Llama-Stack in library mode).

Follow these instructions when you need to bundle additional configuration
files or extra dependencies (e.g. `lightspeed-stack-providers`).

To include more dependencies in the base image, create an upstream pull request to update
[the pyproject.toml file](https://github.com/lightspeed-core/lightspeed-stack/blob/main/pyproject.toml).

1. Create a `pyproject.toml` file in your top-level directory with content like:
```toml
[project]
name = "my-customized-chatbot"
version = "0.1.0"
description = "My very Awesome Chatbot"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
    "lightspeed-stack-providers==TODO",
]
```
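The `Containerfile` in the next step installs exactly the dependencies declared here, parsed with a small `tomllib` one-liner. To preview locally what that step will install, you can run the same one-liner yourself (Python 3.11+ ships `tomllib`):

```bash
# Prints the space-separated dependency list that the Containerfile pipes into `uv pip install --no-deps`
python3 -c "import tomllib; print(' '.join(tomllib.load(open('pyproject.toml','rb'))['project']['dependencies']))"
```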
2. Create a `Containerfile` in your top-level directory like the following, and update it as needed:
```dockerfile
# Latest dev image built from the git main branch (consider pinning a digest for reproducibility)
FROM quay.io/lightspeed-core/lightspeed-stack:dev-latest

ARG APP_ROOT=/app-root
WORKDIR /app-root

# Add additional files
# (avoid accidental inclusion of local directories, env files, or credentials)
COPY pyproject.toml LICENSE.md README.md ./

# Bundle our own configuration files
COPY lightspeed-stack.yaml run.yaml ./

# Add only project-specific dependencies without pulling in their transitive
# dependencies, so the dependencies of the base image are not broken.
ENV UV_COMPILE_BYTECODE=0 \
    UV_LINK_MODE=copy \
    UV_PYTHON_DOWNLOADS=0 \
    UV_NO_CACHE=1
# The list of dependencies is first parsed from pyproject.toml and then installed.
RUN python -c "import tomllib, sys; print(' '.join(tomllib.load(open('pyproject.toml','rb'))['project']['dependencies']))" \
    | xargs uv pip install --no-deps
# Install the project itself
RUN uv pip install . --no-deps && uv cache clean

USER 0

# Bundle additional rpm packages
RUN microdnf install -y --nodocs --setopt=keepcache=0 --setopt=tsflags=nodocs TODO1 TODO2 \
    && microdnf clean all \
    && rm -rf /var/cache/dnf

# This directory is checked by the ecosystem-cert-preflight-checks task in Konflux
COPY LICENSE.md /licenses/

# Add executables from .venv to the system PATH
ENV PATH="/app-root/.venv/bin:$PATH"

# Run the application
EXPOSE 8080
ENTRYPOINT ["python3.12", "src/lightspeed_stack.py"]
USER 1001
```
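As the comment at the top of the `Containerfile` notes, the `dev-latest` tag moves with the main branch. For reproducible rebuilds you can pin the base image by digest instead; a minimal sketch (the digest below is a placeholder, resolve the real value from the registry, e.g. with `skopeo inspect docker://quay.io/lightspeed-core/lightspeed-stack:dev-latest`):

```dockerfile
# Placeholder digest: replace <digest> with the actual sha256 value from the registry
FROM quay.io/lightspeed-core/lightspeed-stack@sha256:<digest>
```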
3. Optionally, create customized configuration files `lightspeed-stack.yaml` and `run.yaml`, for example:
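For the library-mode setup that the base image targets, a minimal `lightspeed-stack.yaml` could look roughly like the sketch below. The key names here are assumptions for illustration only and must be checked against the configuration reference of the lightspeed-stack version you use; `run.yaml` is the usual Llama-Stack run configuration.

```yaml
# Illustrative sketch only; verify key names against your lightspeed-stack version.
name: my-awesome-chatbot
service:
  host: 0.0.0.0
  port: 8080
llama_stack:
  # Run Llama-Stack in library mode with the bundled run.yaml
  use_as_library_client: true
  library_client_config_path: run.yaml
```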
4. Now try to build your image:
```bash
podman build -t "my-awesome-chatbot:latest" .
```
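Once the build succeeds, a quick smoke test is to run the image and repeat the sanity check from above (this assumes the service keeps listening on the exposed port 8080; depending on your `run.yaml` you may also need to pass provider credentials, e.g. via `-e` options):

```bash
# Start the customized image locally
podman run --rm -p 8080:8080 my-awesome-chatbot:latest

# In another terminal: the same sanity check as for the stock image
curl -H "Accept: application/json" http://localhost:8080/v1/models
```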

# Endpoints