Orchestrating AI Code Review at scale

Cloudflare engineers built an AI code review platform on OpenCode.

They split GitLab integration, model providers, prompts, and policy into separate plugins. A coordinator assigns up to seven domain reviewers across security, performance, code quality, documentation, release checks, and AGENTS.md compliance.
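A minimal TypeScript sketch of that coordinator fan-out. The domain list and the seven-reviewer cap come from the post; the type names, function signature, and merge-request identifier are illustrative assumptions, not Cloudflare's actual API.

```typescript
// Review domains named in the post; the seventh reviewer slot is left
// open, since the post says "up to seven" across these areas.
type Domain =
  | "security"
  | "performance"
  | "code-quality"
  | "documentation"
  | "release-checks"
  | "agents-md-compliance";

interface ReviewTask {
  mergeRequestId: string;
  domain: Domain;
}

const ALL_DOMAINS: Domain[] = [
  "security",
  "performance",
  "code-quality",
  "documentation",
  "release-checks",
  "agents-md-compliance",
];

// Coordinator step: assign up to `maxReviewers` domain reviewers
// for one merge request (hypothetical helper, for illustration).
function assignReviewers(
  mergeRequestId: string,
  relevantDomains: Domain[],
  maxReviewers = 7,
): ReviewTask[] {
  return relevantDomains
    .slice(0, maxReviewers)
    .map((domain) => ({ mergeRequestId, domain }));
}

const tasks = assignReviewers("mr-123", ALL_DOMAINS);
console.log(tasks.length); // 6 — all listed domains fit under the cap
```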

They stream review events as JSONL, route work by risk tier, protect each model with circuit breakers and fallback chains, and let Workers KV override model choices. They also track incremental re-review state and deduplicate prompt context to control cost and latency.
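A hedged sketch of the circuit-breaker-plus-fallback pattern described above: each model gets its own breaker, and a review request walks the chain until a healthy model answers. The class, thresholds, and function names are illustrative, not Cloudflare's implementation.

```typescript
type ModelCall = () => Promise<string>;

// Per-model circuit breaker: opens after `threshold` consecutive
// failures, so a flaky provider stops receiving traffic.
class CircuitBreaker {
  private failures = 0;
  constructor(private readonly threshold = 3) {}

  get open(): boolean {
    return this.failures >= this.threshold;
  }

  async run(call: ModelCall): Promise<string> {
    if (this.open) throw new Error("circuit open");
    try {
      const result = await call();
      this.failures = 0; // any success resets the breaker
      return result;
    } catch (err) {
      this.failures++;
      throw err;
    }
  }
}

// Fallback chain: try each model in priority order, skipping any
// whose breaker is already open.
async function reviewWithFallback(
  models: { name: string; breaker: CircuitBreaker; call: ModelCall }[],
): Promise<string> {
  for (const m of models) {
    if (m.breaker.open) continue;
    try {
      return await m.breaker.run(m.call);
    } catch {
      // fall through to the next model in the chain
    }
  }
  throw new Error("all models unavailable");
}
```

In a real deployment the chain order itself could be the value read from Workers KV, which is how a per-repo model override would slot into this shape.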

In the first 30 days, Cloudflare ran 131,246 reviews across 48,095 merge requests in 5,169 repos. The team reported a 3m39s median runtime and a $0.98 median cost per run.

