We shipped an upgrade to our test infrastructure this week that might seem small on the surface—adding pytest-asyncio to our Docker test-runner image—but it unlocks something we've needed for a while: proper async test support in our automated quality gates.
What Changed
Our Docker test-runner image now includes pytest-asyncio alongside the existing testing stack: pytest, Node 20, and npm. This wasn't just about adding a package—it required updating our Dockerfile, creating a new CI workflow to publish the image automatically, and revising our quality gates documentation to reflect the new capabilities.
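As a rough sketch (the base image, install method, and version pins here are assumptions, not our actual Dockerfile), the relevant layers look something like this:

```dockerfile
# Sketch of the test-runner image; names and versions are illustrative.
FROM python:3.12-slim

# Node 20 + npm for the JavaScript side of the testing stack
# (NodeSource install shown as one plausible approach).
RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \
    && apt-get install -y --no-install-recommends nodejs \
    && rm -rf /var/lib/apt/lists/*

# Python testing stack, now including pytest-asyncio.
RUN pip install --no-cache-dir pytest pytest-asyncio
```

The only substantive change is that last layer gaining pytest-asyncio; everything else in the image stays as it was.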
The catalyst? Gate 2. Our quality gate pipeline needed to run async tests in the backend services, and we were hitting limitations. Rather than work around it, we fixed the foundation.
Why It Matters
Modern Python backends lean heavily on async patterns—FastAPI, async database clients, background task queues. If your test infrastructure can't handle async code naturally, you end up either skipping important tests or writing awkward sync wrappers that don't reflect how the code actually runs.
With pytest-asyncio in the test-runner, we can write tests that look like the code they're testing. Async functions, await calls, proper fixture management—all of it runs cleanly in our CI pipeline now. Gate 2 can validate async behavior without compromise.
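To make that concrete, here is a hedged illustration (the service function is hypothetical, not from our codebase) of what an async test looks like once pytest-asyncio is available:

```python
import asyncio
import pytest

# Hypothetical async service code, standing in for a FastAPI handler
# or async database client call chain.
async def fetch_user_name(user_id: int) -> str:
    await asyncio.sleep(0)  # simulate awaiting an async I/O call
    return f"user-{user_id}"

# With pytest-asyncio installed, pytest awaits the marked coroutine
# itself; no event-loop boilerplate or sync wrapper is needed.
@pytest.mark.asyncio
async def test_fetch_user_name():
    assert await fetch_user_name(42) == "user-42"
```

The test reads exactly like the production call site: define a coroutine, await the code under test, assert on the result.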
The Implementation
We added a GitHub Actions workflow (.github/workflows/publish-test-runner.yml) that builds and publishes the updated Docker image whenever changes are made to the test-runner Dockerfile. This means the image stays in sync with our evolving test requirements without manual intervention.
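A minimal sketch of such a workflow, assuming a GHCR registry and a `docker/test-runner` directory (both assumptions; the real file may differ):

```yaml
# .github/workflows/publish-test-runner.yml (illustrative sketch)
name: Publish test-runner image

on:
  push:
    branches: [main]
    paths:
      - "docker/test-runner/Dockerfile"

jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          context: docker/test-runner
          push: true
          tags: ghcr.io/${{ github.repository }}/test-runner:latest
```

The `paths` filter is what keeps the image in sync: any commit touching the Dockerfile triggers a rebuild and republish, and nothing else does.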
We also updated our quality gates documentation to cover the new async testing capabilities and adjusted backend service tests to take advantage of the new tooling. It's all tied together—infrastructure, automation, and documentation moving in lockstep.
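One pattern the revised tests can now use is async fixtures: setup that itself awaits (opening a connection, seeding data) stays async end to end. A sketch with hypothetical names, not our actual test code:

```python
import asyncio
import pytest

class FakeConnection:
    """Hypothetical async client, standing in for a real database driver."""
    def __init__(self):
        self.open = False

    async def connect(self) -> "FakeConnection":
        await asyncio.sleep(0)  # simulate async connection setup
        self.open = True
        return self

    async def close(self) -> None:
        self.open = False

# An async generator fixture: with pytest-asyncio, setup runs before the
# test and teardown (after the yield) runs once the test finishes.
@pytest.fixture
async def conn():
    c = await FakeConnection().connect()
    yield c
    await c.close()

@pytest.mark.asyncio
async def test_connection_is_open(conn):
    assert conn.open
```

Before this change, fixtures like `conn` forced an awkward choice between synchronous wrappers and hand-rolled event-loop management; now the lifecycle is handled by the plugin.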
What's Next
This sets the stage for more sophisticated async testing patterns across our backend services. We're looking at expanding test coverage for background jobs, event handlers, and streaming endpoints—all areas where async is essential.
We're also evaluating whether other testing tools (like coverage reporters or load testing frameworks) should be baked into the test-runner image. The goal is a single, reliable testing environment that works identically in CI and local development.
Small improvements to infrastructure compound. This week it's pytest-asyncio. Next week it might be something else. The point is we're building a foundation that doesn't force compromises when writing tests—and that makes the whole system more reliable.