Revenue · Cluster 2 Spoke

Closed-Won Analysis: What Your Best Deals Have in Common

Your best deals share a pattern. This is the framework to find it: the seven signals to extract, a quarterly cadence, and what to feed back into ICP, scoring, and BDR targeting.

By Siddharth Gangal · Founder, Fairview · Updated April 13, 2026 · 11 min read

Closed-won analysis hero: magnifying glass hovering over a grid of deal cards with the winning pattern highlighted in purple

TL;DR

  • Closed-won analysis is the practice of studying recently won deals to find the pattern your best customers share.
  • Seven signals worth extracting: firmographics, contract size, cycle time, source channel, stakeholder role, usage pattern, and expansion behavior.
  • Weight by revenue, not deal count. Ten small wins do not outvote three whale deals when the goal is ICP refinement.
  • Run it quarterly with a minimum of 20 wins in the window. Feed results into CRM scoring, BDR target lists, and marketing criteria.
  • Fairview joins CRM, billing, and usage data so the seven signals are queryable without a custom warehouse build.

Every sales team has a folklore answer to “who is our ideal customer?” It usually lives in the VP Sales’ head, arrived at through three years of close calls, and it is almost always partly wrong. Closed-won analysis is how you replace folklore with a pattern that the CRM can enforce.

Done well, the exercise takes about a day per quarter and changes how the next quarter’s pipeline is built. Done poorly, it produces a slide deck that nobody references again. The difference is almost entirely in the discipline of the seven signals below and the decision to weight by revenue instead of deal count.

This guide walks through the definition, the seven signals, the quarterly cadence, and the three places the output has to land to actually matter. It pairs with the Cluster 2 post on RevOps vs Sales Ops and the pipeline hygiene guide in the hub.

What is closed-won analysis?

Definition

Closed-won analysis: a structured review of recently won deals to extract the firmographic, behavioral, and motion-level signals they share. The output sharpens the ideal customer profile, the CRM scoring model, and the targeting rules used by marketing and BDRs.

The exercise looks simple. Pull every closed-won deal from the last quarter, put them in a spreadsheet, and look for what is common. The complication is which columns to include, how to weight them, and where to draw the line between real pattern and noise. A team that does this with three columns will get a sharper ICP than one that does it with thirty.

It is not the same as win/loss analysis. Win/loss compares wins with losses, usually through buyer interviews, and answers “why did we win or lose this one?” Closed-won analysis looks only at wins and answers “who are we actually good at selling to?” The two are complements, not substitutes.

The seven signals worth extracting

Seven closed-won signals in a grid: firmographics, contract size, sales cycle, source channel, stakeholder role, usage pattern, and expansion
The seven signals a useful closed-won analysis extracts from every won deal.

Stop at seven. Any more columns and the analysis becomes a research project that ships too late to use. Any fewer and the pattern is too coarse to separate your segments.

  1. Firmographics. Industry, sub-vertical, employee count, revenue band, region, and tech stack signals. The core question: what does the best-fit company look like before anyone talks to them?
  2. Contract size (ACV). Segment by deal size band rather than averages. Whale deals, mid-market, and SMB win for different reasons and should not be collapsed into one pattern.
  3. Sales cycle length. Time from first touch to close, in days. Short cycles usually signal a well-matched use case. Long cycles often signal a misfit that got over the line anyway.
  4. Source channel. Where the deal originated: inbound, outbound, partner, event, referral, PLG self-serve. Different sources produce different closed-won shapes.
  5. Stakeholder role. Title and seniority of the economic buyer and primary champion. A deal that closed through a director of RevOps looks different from one closed through a CFO.
  6. Product usage pattern. In the first 30 days post-close, which features got adopted, by how many users, and how often. This is the bridge between sales and expansion.
  7. Expansion behavior. Did the account upgrade, add seats, or add products in the first 90 days? The accounts that expand quickly are the real ideal customers.
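The seven signals translate into one row per won deal. A minimal sketch of that row in Python — the field names are illustrative, not a Fairview or CRM schema:

```python
from dataclasses import dataclass

@dataclass
class ClosedWonDeal:
    """One row per won deal; one field group per signal."""
    # 1. Firmographics
    industry: str
    employee_count: int
    region: str
    # 2. Contract size
    acv_usd: float
    # 3. Sales cycle (first touch to close)
    cycle_days: int
    # 4. Source channel: "inbound", "outbound", "partner", ...
    source: str
    # 5. Stakeholder role of the economic buyer
    buyer_title: str
    # 6. Product usage in the first 30 days post-close
    active_users_30d: int
    # 7. Expansion in the first 90 days
    expanded_90d: bool

deal = ClosedWonDeal("B2B SaaS", 450, "NA", 48_000, 42,
                     "inbound", "Director of RevOps", 27, True)
```

Seven fields (or field groups), one record per win — if your export cannot fill every field, that gap is itself a finding.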

Key insight

The most common mistake is treating every closed-won deal as one equal vote. Weight by revenue. A $240K deal and a $12K deal do not belong in the same bucket.
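The difference between the two weightings is easy to see in code. A toy example with hypothetical deals, comparing deal-count share against revenue share per employee band:

```python
# Illustrative deals: (employee_band, acv_usd). Values are hypothetical.
wins = [
    ("200-1000", 240_000),
    ("200-1000", 180_000),
    ("20-150", 12_000),
    ("20-150", 15_000),
    ("20-150", 9_000),
]

total_revenue = sum(acv for _, acv in wins)
by_count: dict[str, int] = {}
by_revenue: dict[str, float] = {}
for band, acv in wins:
    by_count[band] = by_count.get(band, 0) + 1
    by_revenue[band] = by_revenue.get(band, 0) + acv

count_share = {b: n / len(wins) for b, n in by_count.items()}
revenue_share = {b: r / total_revenue for b, r in by_revenue.items()}
# The 20-150 band wins the deal-count vote (60% of wins), but the
# 200-1000 band carries ~92% of revenue. Weight by revenue_share.
```

Same five deals, two opposite conclusions — which is exactly why the weighting decision has to be made before the analysis starts.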

How to actually run the analysis

A one-day operator exercise, not a consulting project. The workflow:

  1. Pull wins from the last 90 days. Export from HubSpot, Salesforce, or Pipedrive. Minimum sample: 20 deals. Below that, treat the output as qualitative.
  2. Enrich with billing and usage. Join Stripe or the billing system for actual collected revenue, not booked ARR. Join the product analytics tool or the CS platform for early usage.
  3. Segment into three ACV bands. Whale, mid, SMB — or whatever the natural break in your data is. Analyze each band separately.
  4. Score each signal by revenue contribution. If 62% of revenue came from accounts 200–1000 employees, note the band. If 38% of revenue came from partner-sourced deals, that is a target signal for next quarter.
  5. Write the two-line ICP per band. “Mid-market wins: Series B SaaS companies, 300–800 employees, HubSpot as CRM, director-level buyer, 42-day cycle.” Two lines per band beats a ten-page deck every time.
  6. Feed the results into three places. CRM lead scoring rules, BDR outbound target list, and marketing targeting criteria. If the output does not change behavior in those three systems, the analysis did nothing.
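Steps 3 and 4 — banding and revenue-weighted scoring — can be sketched in a few lines of pure Python, assuming the CRM export has already been joined with billing data. Field names and band thresholds are hypothetical:

```python
from collections import defaultdict

def acv_band(acv: float) -> str:
    """Step 3: three bands; thresholds are illustrative, use your own breaks."""
    if acv >= 150_000:
        return "whale"
    if acv >= 30_000:
        return "mid"
    return "smb"

def signal_revenue_share(deals: list[dict], signal: str) -> dict:
    """Step 4: per ACV band, the share of collected revenue
    contributed by each value of `signal`."""
    revenue: dict = defaultdict(lambda: defaultdict(float))
    band_total: dict = defaultdict(float)
    for d in deals:
        band = acv_band(d["collected_revenue"])
        revenue[band][d[signal]] += d["collected_revenue"]
        band_total[band] += d["collected_revenue"]
    return {band: {v: r / band_total[band] for v, r in vals.items()}
            for band, vals in revenue.items()}

deals = [
    {"collected_revenue": 240_000, "source": "partner"},
    {"collected_revenue": 48_000, "source": "inbound"},
    {"collected_revenue": 36_000, "source": "inbound"},
    {"collected_revenue": 12_000, "source": "plg"},
]
shares = signal_revenue_share(deals, "source")
```

Run `signal_revenue_share` once per signal, once per quarter, and step 5 (the two-line ICP) writes itself from the top value in each band.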

What the output should look like

Quarterly operating review cadence showing where closed-won analysis fits in the calendar
Closed-won analysis sits in the quarterly operating review, alongside pipeline and forecast reviews.

A useful closed-won deliverable is shorter than most teams expect. For a typical B2B SaaS with 40 wins per quarter, the final artifact is one page per ACV band:

Signal | Whale ($150K+) | Mid ($30–150K) | SMB (<$30K)
Industry | B2B SaaS, Fintech | B2B SaaS, eCom | eCom, agencies
Employees | 1000–5000 | 200–800 | 20–150
Source | Partner, referral | Inbound, event | PLG, paid search
Cycle time (median) | 94 days | 42 days | 14 days
90-day expansion | 58% | 34% | 12%

That table — with the specifics of your business — is the closed-won analysis. Everything else is commentary. The three distinct ICPs become the three different playbooks marketing, sales, and CS each run.

Quote-ready

You do not have one ideal customer. You have three, and closed-won analysis is how you separate them before the pipeline goal for next quarter gets set.

The three places the result has to land

A closed-won analysis that does not change behavior in one of these three systems was a waste of a day:

  • CRM scoring rules. Lead scoring should weight the firmographics the closed-won data identifies. If 200–800 employees is the mid-market sweet spot, that band should add 30 points. Anything outside it, subtract points.
  • BDR outbound target list. Next quarter’s BDR list should over-index on the patterns that closed-won data validated. If partner-sourced Whales produce 42% of revenue, the partner team gets quota, not just MDF.
  • Marketing targeting criteria. Paid social audiences, paid search campaigns, and ABM lists all inherit the firmographic filters from the analysis. This is where most of the ROI shows up — targeting the wrong companies is the single largest marketing waste at growth stage.
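The CRM scoring update is the most mechanical of the three landings. A sketch of the rule as code — point values mirror the example above and should be tuned to your own quarterly output; this is a hypothetical function, not a HubSpot or Salesforce API:

```python
def firmographic_score(employee_count: int, industry: str) -> int:
    """Lead-score adjustment derived from closed-won firmographics.

    Bands and point values are illustrative; replace them with
    the figures from your own analysis each quarter."""
    score = 0
    if 200 <= employee_count <= 800:
        score += 30   # mid-market sweet spot validated by closed-won data
    else:
        score -= 10   # outside the validated band
    if industry in {"B2B SaaS", "Fintech"}:
        score += 15   # industries that carried the revenue
    return score

firmographic_score(450, "B2B SaaS")   # 45
firmographic_score(50, "Agency")      # -10
```

The point is not the specific numbers but that the rule exists in the CRM at all — next quarter's analysis then tests whether the rule held.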

The three mistakes that kill the exercise

  1. Treating booked ARR as revenue. Some of those deals will churn within 90 days. Use collected revenue from Stripe or the billing system, or at minimum net booked ARR against early churn.
  2. Analyzing deal count instead of revenue-weighted wins. The SMB motion usually wins the deal-count race and loses the revenue race. Both patterns matter, but only if they are kept separate.
  3. Skipping the feedback loop. Running the analysis and never updating the CRM scoring is the most common failure. If nothing in the downstream systems changes, the exercise is theater.

How Fairview runs closed-won analysis automatically

Fairview operating dashboard surfacing closed-won patterns across firmographics, source, cycle time, and expansion
Fairview reconstructs the seven signals across every won deal and surfaces the pattern by ACV band.

Fairview connects to HubSpot, Salesforce, Pipedrive, Stripe, QuickBooks, Xero, Shopify, Google Ads, Meta Ads, and HubSpot Marketing Hub via native OAuth. Once connected, the operating view segments closed-won deals by ACV band, enriches with billing and source data, and surfaces the firmographic and motion signals for each segment without a warehouse project.

When a pattern shifts quarter over quarter, Fairview writes a named next-best action: “Whale deals >$150K shifted from partner-sourced to direct-sourced this quarter. Cycle time extended from 87 to 112 days. Recommendation: rebuild the partner qualification track before Q3 hiring.”

See pricing and tiers for the plan that fits your stack.

  • 7 signals · extracted per deal automatically
  • Quarterly · pattern refresh per ACV band
  • 10 · native integrations live today

Key takeaways

  • Closed-won analysis replaces ICP folklore with pattern evidence.
  • Seven signals: firmographics, ACV, cycle, source, stakeholder, usage, expansion.
  • Weight by revenue, segment by ACV band, minimum 20 wins per window.
  • Output lands in CRM scoring, BDR target lists, and marketing criteria — or it was theater.
  • Run it quarterly. Track what shifts between quarters as carefully as what stays the same.

Get your closed-won pattern on day one.

Connect HubSpot or Salesforce, Stripe, and your usage data. Fairview surfaces the seven-signal closed-won analysis across ACV bands automatically. 14-day trial, no card required.

Book a demo · Start free trial

Frequently asked questions

What is closed-won analysis?

Closed-won analysis is a structured review of recently won deals to extract the firmographic, behavioral, and motion signals they share. It replaces anecdote-driven ICP guesses with a pattern that can be encoded into CRM scoring, marketing targeting, and BDR lists. It is usually run quarterly on a rolling 90-day window of wins.

How is closed-won analysis different from win/loss analysis?

Win/loss analysis compares wins and losses, usually through buyer interviews, and tells you why deals went either way. Closed-won analysis looks only at won deals and tells you who your best customers are. Most growth-stage teams need both: win/loss to sharpen messaging and product, closed-won to sharpen targeting and ICP.

How often should you run closed-won analysis?

Quarterly is the default for most B2B companies with at least 20 wins per quarter. Higher-volume transactional motions can run it monthly. Anything below 20 wins per window produces too much noise for statistical claims, though a qualitative read of the deal list can still be useful at the seed stage.

Which signals should a closed-won analysis extract?

Seven core signals: firmographics (industry, size, region, tech stack), contract size or ACV, sales cycle length, source channel, stakeholder role and seniority, 30-day product usage pattern, and 90-day expansion behavior. Weight by revenue, not deal count, and segment by ACV band so whale wins and SMB wins do not get averaged together.

How many wins do you need for a meaningful analysis?

A minimum of 20 closed-won deals in the analysis window, ideally 40 or more. Below 20, the patterns are not statistically meaningful and the recommendation is to review the deals qualitatively instead of trying to extract percentages. At 40 or more, you can segment into ACV bands and still have enough per band to draw a read.

Where should the results of a closed-won analysis go?

Feed them into three systems: CRM lead scoring rules so inbound leads match the pattern, BDR outbound target lists so outbound effort lands on ICP-shaped accounts, and marketing targeting criteria so paid and ABM audiences reflect the firmographics that closed. An analysis that does not change behavior in those three places was a wasted day.

Tags

closed-won analysis · ICP · revops · pipeline · operating intelligence

Ready to see your data clearly?

Stop reporting on last week.
Start acting on this week.

10 minutes to connect. No SQL. No engineering team. Your first dashboard is built automatically.

No credit card required · Cancel anytime · Setup in under 10 minutes