Post-migration BigQuery at Nordstrom: From rising slot waste to 47% lower spend
Kristof Horvath
6 min read

Nordstrom’s BigQuery case study is useful because it shows what happens when a team treats BigQuery cost behavior as an engineering system, not just a billing report. This post breaks down what drove the result and what other teams can replicate.
After migrating to BigQuery, Nordstrom’s engineering and FinOps teams faced a familiar problem: costs rose quickly, but understanding why and reducing spend required too much manual work. As Pete Bruno, FinOps Lead & Platform TPM at Nordstrom, put it: “manual SQL tuning was delivering diminishing returns and consuming significant engineering time.”
That gap is where Rabbit fits. Nordstrom adopted Rabbit to get continuous visibility into project- and query-level drivers of spend, then layered on automation where it mattered most: reservation max-slot control, dynamic job-level pricing, storage billing recommendations, and the same kind of prioritization signals a FinOps team would chase manually—without the team having to own every config change by hand. The goal was not another cost dashboard: it was to change how capacity and pricing lined up with actual billing mechanics.
Once Rabbit was in place and the team turned on the optimizations that matched their workloads, the outcome was immediate and measurable:
- 47% reduction in BigQuery spend from slot optimization alone
- slot waste reduced by 38.7%
- about 400 engineering hours per month reclaimed
Read the case study:
How Nordstrom Cut BigQuery Costs by 47% with Rabbit: “That’s Real Spend That Never Hit Our Bill.”
The sections below walk through the migration context, the root cause of waste, what Rabbit did for Nordstrom, and what other organizations can borrow from the same playbook.
Why costs rise quickly right after a BigQuery migration
Teams usually migrate with reliability as the first priority. That is the right call from an operational perspective, but it often creates immediate cost side effects:
- conservative slot and reservation settings
- limited visibility into query-level cost drivers
- optimization done through ad hoc manual reviews
On paper, these are temporary conditions. In practice, they become the default operating model unless teams build a clear optimization loop and run it consistently, a discipline that itself consumes significant engineering hours that could otherwise go into building product.
Where Nordstrom’s biggest inefficiency came from
The core issue was reservation waste caused by BigQuery’s autoscaling behavior. BigQuery can scale up quickly when demand spikes, but billing still follows a 60-second minimum window for provisioned autoscaled slots. Rabbit refers to this as the Autoscaler Tax.
When workloads are bursty, this creates a repeated pattern: short spikes trigger extra capacity, then billing persists longer than useful consumption. Over many projects and schedules, this compounds into substantial waste.
For Nordstrom, this showed up as a measured slot-waste rate of 57.3% before optimization.
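The arithmetic behind the Autoscaler Tax is easy to sketch. The helper and the burst numbers below are invented for illustration; only the 60-second minimum billing window comes from BigQuery’s autoscaling behavior described above.

```python
# Hypothetical illustration of the "Autoscaler Tax": autoscaled slots are
# billed in 60-second minimum windows, so a short burst keeps paying for
# capacity long after the demand has passed. All numbers are invented.

MIN_BILLING_WINDOW_S = 60  # minimum billed duration per autoscale event


def autoscaler_waste(burst_duration_s: float, extra_slots: int) -> dict:
    """Compare slot-seconds billed vs. actually used for one short burst."""
    billed_s = max(burst_duration_s, MIN_BILLING_WINDOW_S)
    billed = extra_slots * billed_s
    used = extra_slots * burst_duration_s
    return {
        "billed_slot_seconds": billed,
        "used_slot_seconds": used,
        "waste_pct": round(100 * (billed - used) / billed, 1),
    }


# A 10-second spike that triggered 200 extra autoscaled slots:
print(autoscaler_waste(10, 200))
# → billed 12000 slot-seconds, used 2000: ~83.3% of that burst's billing is waste
```

Repeat that pattern across dozens of projects and hundreds of scheduled jobs per day, and the compounding effect described above follows directly.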
Learn more:
BigQuery Reservation Autoscaling: How Does It Really Work?
The manual optimization playbook (before automation)
Even without automation, teams can build a practical baseline workflow:
- Measure waste precisely using reservation and job-level usage timelines, not only monthly bill totals.
- Segment workloads by criticality so production pipelines and elastic workloads are not tuned with the same policy.
- Tune reservation limits using historical behavior (for example, percentile-based demand windows) rather than static “safe” caps.
- Re-check regularly because workload shape changes with product launches, seasonality, and data growth.
This works, but it is expensive in human time. It requires repeated analysis and repeated configuration updates, which is why many teams stall after the first savings pass.
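To make the third step concrete, here is a minimal sketch of percentile-based reservation tuning. It assumes you already have a time series of observed slot demand (e.g. sampled per minute from job metadata); the function name and sample data are hypothetical.

```python
# Hedged sketch: cap max slots at a demand percentile rather than the
# all-time peak, so rare spikes do not dictate the reservation ceiling.
# The data source (per-minute slot demand samples) is assumed.


def suggest_max_slots(demand_samples: list[int], percentile: float = 95) -> int:
    """Return the demand value at the given percentile (nearest-rank style)."""
    ordered = sorted(demand_samples)
    idx = min(len(ordered) - 1, round(percentile / 100 * (len(ordered) - 1)))
    return ordered[idx]


# Demand that mostly sits near 100 slots with rare spikes to 800:
demand = [100] * 95 + [800] * 5
print(suggest_max_slots(demand, percentile=95))  # → 100, vs. a "safe" cap of 800
```

The point of the sketch is the trade-off, not the exact statistic: a static cap sized for the worst spike pays the Autoscaler Tax constantly, while a percentile-based cap accepts brief queuing on the rarest spikes in exchange for a much smaller billed ceiling. This is also exactly the kind of re-checking (step four) that goes stale as workload shape changes.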
What Rabbit automated for Nordstrom
Nordstrom used Rabbit to operationalize the same optimization loop continuously:
- Max Slot Optimizer adjusted reservation max slot settings in near real time to reduce Autoscaler Tax waste without starving critical workloads.
- Dynamic job-level pricing optimization routed queries to the cheaper compute model (on-demand or capacity-based) when applicable, with an additional 10% forecasted savings from continued rollout.
- Storage pricing recommendations identified where logical vs physical storage billing settings could lower cost.
- Project- and query-level cost visibility gave the team clearer prioritization of where engineering effort would produce the highest return.
The key point is that none of these are vanity dashboards. They are operating controls tied directly to billing outcomes – when applied in a continuous and automated manner, they lead to significant savings at scale.
“Rabbit doesn’t just optimize how you use BigQuery, they rethink how BigQuery billing itself can be worked in your favor. The slot optimizer, automated pricing routing per job, and storage billing model recommendations are all examples of capabilities built around a deep understanding of how BigQuery charges you. That kind of creative product thinking is rare. It’s the difference between a cost reporting tool and a platform that actually changes your bill.”
– Bryce Ageno, Principal Software Engineer, Nordstrom
Applying Nordstrom’s playbook on your stack
Whatever the specifics of their infrastructure, data teams on BigQuery can copy the pattern that drove Nordstrom’s savings:
- Treat slot waste as a first-class metric. Most teams monitor query runtime; fewer monitor reservation inefficiency with the same discipline.
- Prioritize by cost driver granularity. Optimize at project and query level, not only “BigQuery total spend”.
- Separate one-time savings from recurring controls. Migration cleanups are useful, but ongoing automation is what protects savings month after month.
- Measure engineering time reclaimed. Cost optimization is not only about dollars; it is also about reducing manual operational load.
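Making slot waste a first-class metric can start very simply: track paid slot-seconds against consumed slot-seconds per interval. The helper below is a hedged sketch; the field names are assumptions, and in BigQuery the underlying figures would come from reservation and job metadata.

```python
# Hedged sketch of a slot-waste metric: the share of paid slot-seconds
# that no job actually consumed in a given interval. Inputs are assumed
# to come from reservation capacity and job usage timelines.


def slot_waste_pct(paid_slot_seconds: float, used_slot_seconds: float) -> float:
    """Percentage of paid slot-seconds left unconsumed."""
    if paid_slot_seconds == 0:
        return 0.0
    return round(100 * (1 - used_slot_seconds / paid_slot_seconds), 1)


# e.g. one hour of a 500-slot reservation where jobs consumed 42.7% of capacity:
print(slot_waste_pct(500 * 3600, 768_600))
# → 57.3, the same pre-optimization waste rate Nordstrom measured
```

Once this number is computed continuously per project, the remaining bullets follow: it tells you where to prioritize, it separates one-time cleanups from recurring drift, and its trend over time is a proxy for whether automation is holding the savings.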
Why one-off optimization is not enough
Bryce Ageno, Principal Software Engineer at Nordstrom, summarized the key distinction clearly:
“Rabbit’s slot optimizer alone has reduced our BigQuery spend by 47% since we turned it on. That’s not a projection or a modeled estimate, that’s real spend that never hit our bill. For a single feature of a platform to move the needle that significantly is something I wouldn’t have believed before seeing it ourselves.”
Forecasts and modeled savings still have a role: they help you decide what to try first and whether a change is worth the migration risk. What actually moves the needle are realized savings and sustained control so that the bill stays down even as workloads change.
Achieving both by hand is possible in theory; in practice it is brittle, slow, and expensive in engineering time – which is why Nordstrom leaned on automation with Rabbit.
Pete Bruno, FinOps Lead & Platform TPM at Nordstrom, described the shift in terms of ROI on effort:
“Rabbit’s deep visibility into BigQuery quickly identifies project and query-level drivers of inefficient spend, significantly increasing the ROI of our optimization efforts.”
For the whole story, read our Nordstrom customer case study!
Want a practical baseline before changing your reservations? Start with your current BigQuery waste profile and model the highest-impact actions first.
Use Rabbit’s BigQuery Savings Calculator to estimate where your biggest avoidable spend is today: