
Aider runner

@conduit-harness/conduit-runner-aider invokes aider as a subprocess inside the issue’s worktree. Aider is an open-source agentic coding tool that uses search/replace diffs, which keeps it usable with smaller local models — making this runner the recommended path for fully self-hosted setups against Ollama.

npm install -g @conduit-harness/conduit-runner-aider

This runner has three external dependencies you’ll also need on PATH:

For aider, the one-line installer (macOS and Linux) is the simplest route:

curl -LsSf https://aider.chat/install.sh | sh

Windows (PowerShell):

powershell -ExecutionPolicy ByPass -c "irm https://aider.chat/install.ps1 | iex"

Other options (uv, pipx, manual pip) are documented at aider.chat/docs/install.html. All of them produce an aider binary on your PATH.

For Ollama, download and install from ollama.com/download. After installation, Ollama runs as a background service and exposes an HTTP API at http://localhost:11434.
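If you want to confirm the service is up before pointing the runner at it, a small probe against Ollama's API works. This sketch is illustrative and not part of the runner; /api/tags is Ollama's standard route for listing pulled models:

```python
# Illustrative sketch: probe whether an Ollama server answers at an endpoint.
# /api/tags is Ollama's model-listing route; a healthy server returns JSON.
import json
import urllib.error
import urllib.request

def ollama_reachable(endpoint, timeout=2.0):
    try:
        with urllib.request.urlopen(f"{endpoint}/api/tags", timeout=timeout) as resp:
            json.load(resp)  # parse to confirm we got a JSON body, not an error page
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False
```

With Ollama running locally, `ollama_reachable("http://localhost:11434")` should return True.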

Pull a model into Ollama. qwen2.5-coder is a good default for this runner because it works well with aider’s search/replace edit format:

ollama pull qwen2.5-coder

For larger machines, qwen2.5-coder:14b or qwen2.5-coder:32b will produce better edits at the cost of more VRAM. The setup wizard recommends a model based on the hardware you select.

A minimal workflow configuration:

agent:
  kind: aider
  max_concurrent_agents: 1
  aider:
    model: ollama_chat/qwen2.5-coder
    ollama_endpoint: http://localhost:11434
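The ollama_endpoint and api_key settings end up as environment variables in the aider subprocess. A sketch of that mapping (the helper name is illustrative; OLLAMA_API_BASE and OPENAI_API_KEY are the variables documented in the options reference):

```python
# Sketch: map runner config values into aider's subprocess environment.
# The helper itself is hypothetical; the env var names are the documented ones.
import os

def build_aider_env(ollama_endpoint, api_key=None):
    env = dict(os.environ)
    env["OLLAMA_API_BASE"] = ollama_endpoint   # tells aider where Ollama lives
    if api_key:                                # only needed for non-Ollama backends
        env["OPENAI_API_KEY"] = api_key
    return env
```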

The runner writes the prompt to a temp file in the worktree and passes it via --message-file, so each issue dispatch is a single non-interactive aider session.
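The dispatch step above can be sketched as follows. The helper name is illustrative, but the flag list mirrors the runner's documented default command:

```python
# Sketch of a single non-interactive dispatch: write the prompt to a temp
# file inside the worktree, then assemble one aider invocation around it.
import os
import tempfile
from pathlib import Path

def build_aider_cmd(worktree, prompt, model):
    fd, msg_path = tempfile.mkstemp(suffix=".md", dir=worktree)
    os.close(fd)                          # we only need the path, not the fd
    Path(msg_path).write_text(prompt)     # prompt travels via --message-file
    return [
        "aider", "--yes-always", "--no-pretty", "--no-stream",
        "--no-show-model-warnings", "--no-detect-urls", "--no-check-update",
        "--model", model,
        "--message-file", msg_path,
    ]
```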

On startup the runner verifies that the configured aider binary is on PATH; if it isn’t, Conduit fails fast with the install command rather than silently dispatching agents that would all error out.
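A fail-fast check like this can be sketched with shutil.which; the error message and install hint here are illustrative, not the runner's exact output:

```python
# Sketch of a startup PATH check: resolve the binary or fail immediately
# with an actionable install hint, instead of letting every dispatch error out.
import shutil

def ensure_on_path(binary):
    path = shutil.which(binary)
    if path is None:
        raise RuntimeError(
            f"{binary} not found on PATH. Install it first, e.g.: "
            "curl -LsSf https://aider.chat/install.sh | sh"
        )
    return path
```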

All options live under the aider: key in the workflow configuration:

| Option | Default | Description |
| --- | --- | --- |
| model | ollama_chat/qwen2.5-coder:14b | Model identifier passed to aider via --model. Use the ollama_chat/<name> prefix for Ollama-served models. |
| ollama_endpoint | http://localhost:11434 | Set as OLLAMA_API_BASE in aider’s environment. |
| command | aider --yes-always --no-pretty --no-stream --no-show-model-warnings --no-detect-urls --no-check-update | Override only if you need different aider flags. The runner appends --model and --message-file automatically. |
| extra_args | (empty) | Extra arguments appended after the auto-added flags. Useful for --no-git, --no-auto-commits, etc. |
| api_key | (none) | Optional. Set as OPENAI_API_KEY in aider’s environment for non-Ollama backends. |
| turn_timeout_ms | 3600000 (1 hour) | Hard cap on the whole aider run. |
| stall_timeout_ms | 300000 (5 min) | Kill aider if it produces no output for this long. |
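The interaction between the two timeouts can be sketched like this: the turn timeout caps the whole run, while the stall timeout fires when the child goes silent. The real runner's mechanism may differ; this is a minimal model of the behavior:

```python
# Sketch: run a child process under a hard turn timeout and a stall timeout.
# A reader thread feeds output lines into a queue; the main loop tracks how
# long the child has been silent and kills it when either limit is exceeded.
import queue
import subprocess
import threading
import time

def run_with_timeouts(cmd, turn_timeout_s, stall_timeout_s):
    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
    )

    def pump(stream, q):
        for line in stream:
            q.put(line)

    lines = queue.Queue()
    threading.Thread(target=pump, args=(proc.stdout, lines), daemon=True).start()

    deadline = time.monotonic() + turn_timeout_s
    last_output = time.monotonic()
    while proc.poll() is None:
        try:
            lines.get(timeout=0.1)            # child produced a line of output
            last_output = time.monotonic()
        except queue.Empty:
            pass                              # no output this tick; check limits
        now = time.monotonic()
        if now >= deadline or now - last_output >= stall_timeout_s:
            proc.kill()
            proc.wait()
            return "timed out" if now >= deadline else "stalled"
    return "finished"
```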

Source: packages/conduit-runner-aider