Running OpenCode AI using Docker

Posted on 2026-01-14 by Rico

A Practical, Reproducible, and Enterprise-Ready Implementation Guide

AI coding tools such as OpenCode AI can significantly improve developer productivity. However, installing them directly on developer machines quickly becomes problematic in team or enterprise environments:

  • Inconsistent setups
  • Credentials scattered across laptops
  • No governance or auditability
  • Difficult upgrades and rollbacks

This guide shows how to run OpenCode AI inside a Docker container, following best practices for security, reproducibility, and enterprise governance.


1. Design Goals

Before implementation, define clear goals:

  • OpenCode AI must not be installed directly on the host
  • The runtime environment must be reproducible
  • No API tokens or secrets baked into images
  • Run as a non-root user
  • Easy to rebuild, upgrade, and roll back
  • Ready for team-wide or enterprise adoption

2. Project Structure

Create a clean project directory:

opencode-docker/
├── Dockerfile
├── build.sh
└── run.sh
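
If you prefer to script this step, here is a minimal sketch that creates the exact layout shown above:

#!/bin/bash
set -e

# Create the project skeleton shown above
mkdir -p opencode-docker
cd opencode-docker
touch Dockerfile build.sh run.sh
chmod +x build.sh run.sh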

3. Dockerfile (Core Implementation)

Below is a production-ready Dockerfile aligned with security best practices.

# Pin the base image for reproducible builds (avoid the moving "latest" tag)
FROM ubuntu:24.04

ENV DEBIAN_FRONTEND=noninteractive

# Install required system tools
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl \
    ca-certificates \
    git \
    openssh-client \
    sudo \
 && rm -rf /var/lib/apt/lists/*

# Create a non-root user; ubuntu:24.04 already ships a default "ubuntu" user,
# so only create it if it is missing, then grant passwordless sudo
RUN if ! id -u ubuntu >/dev/null 2>&1; then useradd -m -s /bin/bash ubuntu; fi \
 && echo "ubuntu ALL=(ALL) NOPASSWD: ALL" > /etc/sudoers.d/ubuntu \
 && chmod 0440 /etc/sudoers.d/ubuntu

USER ubuntu
WORKDIR /home/ubuntu

# Prepare SSH configuration
RUN mkdir -p /home/ubuntu/.ssh \
 && touch /home/ubuntu/.ssh/known_hosts

# Preload GitHub host keys (non-interactive Git usage)
RUN ssh-keyscan -T 5 github.com 2>/dev/null >> /home/ubuntu/.ssh/known_hosts || true

# Install OpenCode AI (official binary installer)
RUN curl -fsSL https://opencode.ai/install | bash

Key Security Notes

  • No secrets in the image
  • Runs as non-root
  • Disposable container by design

4. Build the Image

Create build.sh:

#!/bin/bash
set -e

docker build -t opencode-ai:latest .

Run:

chmod +x build.sh
./build.sh
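
A quick sanity check after the build is worthwhile. The commands below assume the image tag used above and confirm that the image exists, that the default user is non-root, and that no secrets were recorded in the build steps:

docker image ls opencode-ai:latest

# Should print "ubuntu", not "root"
docker run --rm opencode-ai:latest id -un

# Review the recorded build steps; no tokens or secrets should appear here
docker history --no-trunc opencode-ai:latest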

5. Running the Container (Critical Step)

OpenCode AI requires authentication data.
Credentials must remain on the host and be mounted at runtime.


5.1 Host Preparation (One-Time)

After authenticating with OpenCode AI on the host, the following paths typically exist:

~/.local/share/opencode/auth.json
~/.config/opencode/

These must never be baked into the image.
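
Before the first containerized run, it helps to verify that these paths actually exist on the host. A minimal check, assuming the default locations listed above:

# Warn if the expected OpenCode AI credential/config paths are missing
for p in "$HOME/.local/share/opencode/auth.json" "$HOME/.config/opencode"; do
  if [ -e "$p" ]; then
    echo "OK:      $p"
  else
    echo "MISSING: $p  (authenticate with OpenCode AI on the host first)"
  fi
done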


5.2 run.sh (Standard Usage)

#!/bin/bash

docker run --rm -it \
  -v "$HOME/.local/share/opencode:/home/ubuntu/.local/share/opencode" \
  -v "$HOME/.config/opencode:/home/ubuntu/.config/opencode" \
  -v "$PWD:/workspace" \
  -w /workspace \
  opencode-ai:latest \
  opencode

Run:

chmod +x run.sh
./run.sh
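
Since the image ships git, openssh-client, and preloaded GitHub host keys, a common variant is to also expose the host's Git identity and SSH agent to the container. The sketch below is an assumption-laden example (Linux host, a running ssh-agent, host UID matching the container's ubuntu user), not part of the baseline setup:

#!/bin/bash

# run-git.sh: like run.sh, but additionally mounts the host Git config
# (read-only) and forwards the host ssh-agent socket so that git push/pull
# inside the container uses the host's keys without copying them into it.
docker run --rm -it \
  -v "$HOME/.local/share/opencode:/home/ubuntu/.local/share/opencode" \
  -v "$HOME/.config/opencode:/home/ubuntu/.config/opencode" \
  -v "$HOME/.gitconfig:/home/ubuntu/.gitconfig:ro" \
  -v "$SSH_AUTH_SOCK:/ssh-agent" \
  -e SSH_AUTH_SOCK=/ssh-agent \
  -v "$PWD:/workspace" \
  -w /workspace \
  opencode-ai:latest \
  opencode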

6. What Happens at Runtime

Inside the container:

  • You run opencode normally
  • You operate on the current project directory
  • You use your own credentials
  • No state is preserved inside the container (demonstrated below)

This follows the cloud-native principle:

Containers are disposable; data and credentials are external.
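
A quick way to see this principle in action, assuming the image from section 4: a file written outside the mounted workspace does not survive the container.

# First run: create a file in the container's home directory
docker run --rm opencode-ai:latest bash -c 'touch /home/ubuntu/scratch && ls -l /home/ubuntu/scratch'

# Second run: the file is gone, because the previous container was disposed of
docker run --rm opencode-ai:latest ls -l /home/ubuntu/scratch   # fails: No such file or directory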


7. Why This Approach Works for Enterprises

✔ IT Operations

  • Centralized image versioning
  • Easy upgrades and rollbacks
  • Compatible with security scanning tools (see the sketch at the end of this section)

✔ Security & Compliance

  • Credentials never enter the image
  • Non-root execution
  • Reduced supply-chain risk

✔ Engineering Teams

  • Identical environment for all users
  • No host pollution
  • Low onboarding cost
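
As a concrete illustration of the points above, here is a hedged sketch of how an IT team might version, scan, and distribute the image. The registry name, the 1.0.0 tag, and the use of Trivy are placeholders chosen for the example, not requirements:

#!/bin/bash
set -e

VERSION="1.0.0"                      # placeholder version tag
REGISTRY="registry.example.com/ai"   # placeholder internal registry

# Build with an explicit version tag instead of only "latest"
docker build -t "opencode-ai:${VERSION}" .

# Scan the image before distribution (Trivy shown as one example scanner)
trivy image "opencode-ai:${VERSION}"

# Push to the internal registry so every developer pulls the same image;
# rolling back means pointing run.sh at a previous tag
docker tag "opencode-ai:${VERSION}" "${REGISTRY}/opencode-ai:${VERSION}"
docker push "${REGISTRY}/opencode-ai:${VERSION}"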

8. Extensions and Next Steps

This setup can naturally evolve into:

  • A shared enterprise OpenCode AI image
  • Integration with CI/CD pipelines
  • AI-assisted code review or refactoring workflows
  • Internal RAG-augmented coding environments

Conclusion

The value of AI coding tools is not just speed—it’s safe, repeatable, and governed adoption.

By running OpenCode AI in Docker, organizations can turn a personal productivity tool into a managed, enterprise-grade capability.

This approach ensures AI adoption that is:

  • Secure
  • Scalable
  • Sustainable
