November 12, 2025
Reinventing the wheel, or flexing the brain?
Building a CI/CD Pipeline Runner from Scratch in Python
Solo dev rebuilds the “factory line” for code; internet splits between applause and eye-rolls
TLDR: A developer built a homegrown code pipeline tool in Python for an offline network and showed how standard runners work. Comments erupted into a build-versus-buy brawl: some praised the learning and control, others said “just use air‑gapped Jenkins or Argo,” flagging maintenance, reliability, and security trade-offs.
A developer just rebuilt a mini version of those click-and-go code pipelines—think the factory line that tests and ships apps—entirely in Python. They did it for an air-gapped setup (a strictly offline network) and broke down the guts: stages, jobs, artifacts, and even a first version that runs a job in a container. The crowd? Instantly divided. One loud camp called it classic “reinventing the wheel,” pointing to proven, offline-friendly tools like Jenkins and Argo Workflows. Another camp cheered the learn-it-yourself approach, arguing you only truly trust the machine once you’ve seen the gears spin. The hottest thread questioned the premise: if GitHub Actions or GitLab CI aren’t allowed, why not self-host a mature tool instead of shipping a homebrew runner? Supporters countered with control, security, and simplicity in locked-down networks. The memes flew: “YAML is the new assembly,” “Yet Another YAML Orchestrator,” and “Next up: Kubernetes in Excel.” Skeptics warned about pager-duty nightmares and maintenance debt; fans said the code demystifies how runners actually orchestrate work. It’s the timeless build-vs-buy cage match—with extra spice from the air-gapped twist and Python purists vs. pragmatists. Drama level: high; emoji count: tasteful.
Key Points
- The article defines CI/CD pipelines as staged workflows where jobs in the same stage run in parallel and stages run sequentially.
- It outlines runner responsibilities: parse config, build a dependency graph, execute jobs in containers, stream logs, pass artifacts, and report status.
- Pipeline components are explained: stages (ordering), jobs (isolated execution with scripts and dependencies), and artifacts (files passed between jobs).
- The tutorial builds a Python-based pipeline runner implementing stages, parallelism, job dependencies, and artifact passing.
- Version 1 demonstrates a single-job executor using Docker, with Job and JobExecutor classes, defaulting to the python:3.11 image and combining the job's script commands into one execution.
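The scheduling semantics described above (stages run in order; jobs within a stage run in parallel; a failed stage stops the pipeline) can be sketched in a few lines. This is an illustrative shape, not the article's exact API; the names `run_pipeline` and `run_job` are assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(stages, run_job):
    """Run stages sequentially; jobs within a stage run in parallel.

    `stages` is an ordered list of (stage_name, jobs) pairs, and
    `run_job` executes one job, returning True on success.
    """
    for stage_name, jobs in stages:
        # All jobs in the same stage are submitted at once and run
        # concurrently; the pipeline waits for the whole stage.
        with ThreadPoolExecutor() as pool:
            results = list(pool.map(run_job, jobs))
        if not all(results):
            # A failed job fails its stage, and later stages never start.
            print(f"stage '{stage_name}' failed; aborting pipeline")
            return False
    return True
```

A real runner would also topologically sort explicit `needs:` dependencies, but sequential stages with parallel jobs already capture the core model the article describes.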
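The v1 single-job executor can be sketched as follows. The `Job` and `JobExecutor` class names, the python:3.11 default, and the combined-script approach come from the article; the exact fields and the `&&`-joined `docker run` invocation are assumptions about how such a v1 would typically look:

```python
import subprocess
from dataclasses import dataclass, field

@dataclass
class Job:
    name: str
    script: list[str]                                # shell commands, in order
    image: str = "python:3.11"                       # default image per the article
    needs: list[str] = field(default_factory=list)   # names of prerequisite jobs

class JobExecutor:
    """Runs a single job inside a throwaway Docker container."""

    def build_command(self, job: Job) -> list[str]:
        # Combine the script lines into one shell invocation so the whole
        # job runs in a single container; `&&` stops at the first failure.
        combined = " && ".join(job.script)
        return ["docker", "run", "--rm", job.image, "sh", "-c", combined]

    def run(self, job: Job) -> int:
        # Inherit stdout/stderr so logs stream live; return the exit code
        # so the runner can decide whether the stage may proceed.
        return subprocess.run(self.build_command(job)).returncode
```

Separating `build_command` from `run` keeps the Docker invocation testable without a daemon, which matters in an air-gapped environment where the test machine may not mirror production.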