feat: Debian-based full-stack Opencode Docker image (incl. OpenTelemetry) with GitHub build workflow for … #9560
Issue fixed: #9562
How did you verify your code works?
The image build executed successfully on my clone of this repo in the branch used to submit this PR: see https://github.com/didier-durand/opencode/actions/runs/21161964252/job/60858222994
The container registry was successfully created on my clone: https://github.com/didier-durand/opencode/pkgs/container/opencode.
The built image runs fine on my macOS laptop:

What does this PR do?
This PR primarily delivers a full-stack Docker image (based on the official Debian Trixie image) that lets Opencode operate
consistently across multiple environments: developer laptop, on-prem K8s pod, or a cloud service such as AWS ECS or GCP Cloud Run functions. Having such an image avoids setup issues in new environments and fosters repeatability across them.
Running in Docker, Opencode still works fine with Ghostty or Alacritty.
To try the image interactively, just run in Ghostty:

```shell
docker run -it ghcr.io/didier-durand/opencode-vibe:latest opencode --log-level INFO --model opencode/glm-4.7-free --prompt 'who are you?'
```

To try the build locally:

```shell
docker build --no-cache --tag opencode:latest .
```

To run the locally built image:

```shell
docker run -it opencode:latest opencode --log-level INFO --model opencode/glm-4.7-free --prompt 'who are you?'
```

This image is fully validated in the GitHub workflow after its build, with a very similar command using
`opencode run` and the same free model. You can see multiple executions of this workflow in the repo opencode-vibe.
The image embeds an OpenTelemetry collector to make the Opencode agents fully observable (logs, trace spans,
metrics), as enabled by Vercel's AI SDK, which Opencode leverages. In our own use cases, we rely heavily on
OpenTelemetry observability with the Dash0 service as backend.
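For readers wiring the image up to their own backend, a minimal collector configuration of the kind such an image typically embeds might look like the sketch below. This is *not* the file shipped in this PR; the endpoint is a placeholder and the pipeline layout is an assumption based on standard OpenTelemetry Collector conventions:

```yaml
# Minimal OTLP pass-through: receive logs/traces/metrics from Opencode
# and forward them to an observability backend (e.g. Dash0).
receivers:
  otlp:
    protocols:
      grpc:
      http:

exporters:
  otlphttp:
    # Placeholder: replace with your backend's OTLP ingest URL and auth headers.
    endpoint: "https://otlp-ingest.example.invalid"

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [otlphttp]
    traces:
      receivers: [otlp]
      exporters: [otlphttp]
    metrics:
      receivers: [otlp]
      exporters: [otlphttp]
```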
Here is an example with Opencode logs
The deliverables of this PR are:
The Dockerfile will:
- realize that running a generated ad hoc script will prove more efficient than a ton of tool calls
- allow the upload of Opencode log files to an OpenTelemetry backend (see screenshot above)
- give the `opencode` user a proper home directory to host the `.opencode` directory and other config files
- define a `WITH_OTEL` env var, to be supplied at run time by the `docker run` command, to activate OpenTelemetry or not

The GitHub workflow will:
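To make the `opencode` user and the `WITH_OTEL` toggle concrete, here is a hedged Dockerfile sketch. It is not the Dockerfile delivered by this PR; the base image tag, shell, and paths are assumptions:

```dockerfile
# Sketch only -- not the Dockerfile shipped in this PR.
FROM debian:trixie-slim

# Give the opencode user a real home directory so that ~/.opencode
# and other config files have a proper place to live.
RUN useradd --create-home --shell /bin/bash opencode

# Run-time toggle: activating OpenTelemetry is decided by docker run,
# e.g. `docker run -e WITH_OTEL=true ...`, not baked into the image.
ENV WITH_OTEL=false

USER opencode
WORKDIR /home/opencode
```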
- lint the Dockerfile with Hadolint and validate the Opencode config file against the official Opencode schema via `check-jsonschema`
- publish the image to the repository's container registry (`ghcr.io/anomalyco/opencode` in this case)

Finally, the workflow will:
- validate the built image by running `opencode` via `docker exec "$CONTAINER_NAME" opencode run --log-leve`
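As an illustration of those lint-and-validate steps, a GitHub Actions fragment could look like the sketch below. Job and step names, action versions, and the schema/config file paths are invented for illustration; only Hadolint, `check-jsonschema`, and the `docker exec … opencode run` validation come from this PR:

```yaml
# Hedged sketch -- not the workflow shipped in this PR.
jobs:
  build-and-validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Lint Dockerfile with Hadolint
        uses: hadolint/hadolint-action@v3.1.0
        with:
          dockerfile: Dockerfile
      - name: Validate Opencode config against its schema
        # File names are placeholders for the PR's actual schema and config.
        run: pipx run check-jsonschema --schemafile opencode-schema.json opencode.json
      - name: Build and smoke-test the image
        run: |
          docker build --tag opencode:latest .
          CONTAINER_NAME=opencode-test
          docker run -d --name "$CONTAINER_NAME" opencode:latest sleep infinity
          docker exec "$CONTAINER_NAME" opencode run --log-level INFO \
            --model opencode/glm-4.7-free 'who are you?'
```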