
Walk into any B2B marketing event, and someone will mention intent data. They’ll talk about surge topics, in-market accounts, and the magic of knowing who’s researching what. They’ll show you a dashboard. It’ll look impressive. Then they’ll mention which provider they use, maybe a layered combination of tools.
The problem is that a large chunk of that data comes from the same place, and your three closest competitors are buying it, too. That’s intent data commoditization — and it’s why your CPLs keep climbing while your conversion rates flatten.
Most third-party intent data falls into one of a few buckets: publisher co-ops (where a network of B2B websites contributes anonymized data on content consumption), software review platforms (G2, TrustRadius), and bid stream data. That last one is the biggest and the least understood.
Bid stream data is collected from programmatic ad exchanges. Every time a webpage loads an ad, metadata about the visitor — including the content they’re viewing — flows through ad exchange bid requests. Some intent providers tap that stream to infer topic interest. The scale is enormous, with billions of signals daily.
The bid stream problem
At that scale, the tradeoffs become clear. The accuracy is lower than co-op or editorial data, the resolution is mostly account-level (IP-resolved to companies), and the privacy footing is shaky. As one provider candidly notes, “B2B intent data providers relying on bidstream stand on very weak ground” under GDPR, in part because real-time bidding mass collection often occurs without meaningful consent.
Some providers are even more direct about their data lineage, telling prospective customers that their intent data delivers breadth because of its “direct access to the bidstream — the source for the most intent signals.”
At least they’re being transparent. The point is structural: when one source supplies the largest share of an industry’s intent, and most platforms either tap that source directly or resell from a handful of co-ops, what you get is a market where everyone sees the same signals at roughly the same time.
The downstream economics are exactly what you’d expect. According to DemandScience’s 2026 State of Performance Marketing Report, 87% of organizations say their marketing investments produce unreliable or inflated intent signals, and only 26% of those signals convert into qualified opportunities. Two-thirds of leaders say their campaign metrics frequently look successful but fail to drive revenue.
If you and three competitors are bidding on the same surging accounts at the same time, all you’re doing is bidding up the price of a meeting.
What signal convergence should actually mean
I want to draw a line between two ideas that get tangled up.
- Intent data commoditization is the problem above. Same data, same accounts, same plays, higher costs.
- Signal convergence is the opposite. It’s the moment when your account-level signals (e.g., firmographic fit, technographic match, hiring patterns, funding) and your contact-level signals (e.g., a real human at that account engaging with your content, your competitors, and your category) intersect. That intersection is where marketing and sales activity converges into meetings.
You can’t get to that intersection by buying the same packaged feeds as everyone else. You have to build a richer signal layer, much of it from sources your competitors aren’t using.
That’s what custom signal capture is. It’s accessible even if you’re a short-staffed marketing team.

Custom signal scraping: A primer
Instead of (or in addition to) buying packaged intent feeds, you systematically capture buying signals from public sources that aren’t already in someone else’s data co-op. Done well, you end up with a proprietary signal that your competitors literally can’t purchase.
The categories worth thinking about:
- Job postings: When a company posts a senior security role, that’s a signal. When they post five of them in a quarter alongside a new VP of Engineering, that’s a much louder signal. Job descriptions also leak technology stack, team structure, and strategic priorities.
- Hiring pattern shifts: Departures from key roles can signal vendor reevaluation. Headcount growth in a function (RevOps, data, AI) can signal investment in your category.
- Website and product page changes: A target account quietly adding a new integration partner page, a pricing page change, or a new product tier — these are leading indicators of strategy that intent feeds rarely capture.
- Podcast and webinar appearances: When a buyer goes on a podcast, they tell the world what they care about right now. The transcript is searchable. So is the language they use.
- Community engagement: Comments on Reddit, posts in industry Slack groups, GitHub activity, and questions in developer forums. This is where buyers actually talk about real problems.
- Funding, M&A, and leadership changes: Public, well-covered, easy to track, and remarkably underused as a coordinated trigger across marketing and sales.
- LinkedIn activity: Public posts, job changes, comments on competitor content, and engagement with category thought leadership. This is a contact-level signal that no third-party intent provider gives you cleanly.
- Deep research at scale: With tools like Claude Code or other coding agents, you can write workflows that read any website, scrape any page, parse any call transcripts, and surface the strategic priorities a buyer has actually committed to publicly.
The combination is what matters. One signal is noise. A funding round + three engineering hires + a new CIO + a podcast appearance where the new CIO mentions the exact problem you solve — that’s a signal to go get a meeting.
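That combining logic can be sketched as a simple weighted score. A minimal Python sketch — the signal names, weights, and threshold below are illustrative assumptions you’d tune to your own ICP, not values from any vendor:

```python
# Sketch of combining individually weak signals into an account-level score.
# Weights and threshold are hypothetical; calibrate against closed-won data.
WEIGHTS = {
    "funding_round": 3,
    "engineering_hires": 2,    # e.g., 3+ relevant job postings in a quarter
    "leadership_change": 2,    # new CIO, new VP of Engineering, etc.
    "podcast_mention": 4,      # buyer publicly names the problem you solve
    "pricing_page_change": 1,
}

MEETING_THRESHOLD = 8  # arbitrary cutoff for "go get a meeting"

def score_account(signals: set[str]) -> int:
    """Sum the weights of the signals observed for one account."""
    return sum(WEIGHTS.get(s, 0) for s in signals)

def is_hot(signals: set[str]) -> bool:
    """True when the combined score crosses the outreach threshold."""
    return score_account(signals) >= MEETING_THRESHOLD
```

The funding + hires + new CIO + podcast combination above scores 3 + 2 + 2 + 4 = 11 and crosses the threshold; no single signal does on its own, which is the point.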

Data comes in all shapes and sizes
The right approach depends on how much time, budget, and operational maturity you have. Here’s how I’d think about it for three different team sizes.
Small teams: Seed to Series A, marketing team of 1-5
You can’t afford a $100,000 Demandbase or 6sense contract, and honestly, you shouldn’t. At this stage, the goal is precision over scale.
Start with a focused list of 100-300 target accounts. Build your own signal layer using:
- A workflow tool like Clay to orchestrate enrichment, scraping, and routing into your CRM.
- UserGems or a manual LinkedIn workflow to catch job-change signals from past champions.
- Free or low-cost website monitoring for target account product and pricing page changes.
- Claude or another LLM, hooked up to your account list, doing weekly research sweeps on funding, hiring, leadership changes, and earnings commentary.
Total spend can land well under $2,000 a month. The output is a richer, more current view of your top 200 accounts than most enterprise teams have on theirs. You catch buyers your competitors can’t see, and you reach them while the signal is still warm.
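The website-monitoring piece of that stack doesn’t even need a paid tool. A minimal stdlib-only Python sketch, assuming a local JSON file as the state store (the file name and record shape are hypothetical): hash a normalized copy of each page and alert when the hash changes between weekly runs.

```python
import hashlib
import json
import re
import urllib.request
from pathlib import Path

STATE_FILE = Path("page_hashes.json")  # hypothetical local state store

def normalize(html: str) -> str:
    """Collapse whitespace so trivial reflows don't trigger false alerts."""
    return re.sub(r"\s+", " ", html).strip()

def page_hash(html: str) -> str:
    """Stable fingerprint of a page's normalized content."""
    return hashlib.sha256(normalize(html).encode("utf-8")).hexdigest()

def check_url(url: str, state: dict) -> bool:
    """Fetch one page; return True if it changed since the last run."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    new_hash = page_hash(html)
    changed = state.get(url) not in (None, new_hash)
    state[url] = new_hash
    return changed

def run(urls: list[str]) -> list[str]:
    """Weekly sweep: return the URLs that changed, persisting new hashes."""
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    changed = [u for u in urls if check_url(u, state)]
    STATE_FILE.write_text(json.dumps(state, indent=2))
    return changed
```

Point `run()` at your target accounts’ pricing and integrations pages on a weekly cron, and route the changed URLs into your CRM or Slack.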
Mid-market teams: Series B to D, marketing team of 6-10
You have more budget, more accounts to cover, and more pressure to show predictable pipeline. A two-tiered approach works here.
For your top tier (the 200-500 accounts you actually want to win), run the small-team playbook above with more rigor. Document signal definitions. Build automated scoring. Connect everything to a dashboard that your sales team checks daily.
For the broader audience (5,000+ accounts), you can layer in one packaged third-party source, but pick one that’s differentiated from what your competitors are likely buying.
G2 Buyer Intent and TrustRadius give you software comparison signals you can’t reproduce. UserGems gives you job-change signals at scale. A category-specific provider (for example, Onfire for technical buyers) often beats a generic feed for less money.
Enterprise teams: Marketing team of 10+
This is where I see the most waste. Big intent contracts get auto-renewed. The data flows into a system nobody really audits. Sales reps complain that “the intent data isn’t useful,” which is sometimes a skills gap and sometimes a real data problem.
Two moves matter at this scale.
- Audit what you’re actually getting from your packaged providers: Run an overlap analysis. If your three providers are giving you 70% the same accounts, consolidate or replace one with something genuinely additive. Custom-scraped signals (hiring, technographic, community, leadership change, and dark social) almost always provide higher uniqueness than another bid-stream-derived feed.
- Invest in a small but dedicated GTM engineering capability: One operator with multiple APIs, Claude Code, and access to your CRM can build proprietary signal pipelines that no vendor offers because they’re tuned to your ICP, your competitors, and your product. At enterprise scale, this is the cheapest pipeline you’ll ever build relative to the third-party media you’re already spending.
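The overlap audit above is a set operation, not a data science project. A minimal Python sketch, assuming you can export each provider’s surging-account list as a set of account identifiers (provider names and accounts below are made up):

```python
def overlap_pct(a: set[str], b: set[str]) -> float:
    """Share of the smaller list that also appears in the other list."""
    if not a or not b:
        return 0.0
    return len(a & b) / min(len(a), len(b))

def audit(providers: dict[str, set[str]]) -> dict[tuple[str, str], float]:
    """Pairwise overlap for every pair of providers, keyed by name pair."""
    names = sorted(providers)
    return {
        (x, y): overlap_pct(providers[x], providers[y])
        for i, x in enumerate(names)
        for y in names[i + 1:]
    }
```

If any pair lands around the 70% overlap mentioned above, one of those feeds is mostly paying for accounts you already see; consolidate it or swap it for something additive.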
Where to start after you finish reading this
If you take one thing from this, let it be this: stop buying signals and start building them.
A practical first step: pick 50 target accounts. Spend two hours setting up a simple weekly scrape across three sources you don’t currently monitor — say, job postings, leadership announcements, and podcast appearances. See how many of those 50 accounts produce a real signal in 30 days.
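Measuring that 30-day result takes a few lines. A minimal Python sketch, assuming you log each observed signal as an (account, date seen, source) record — a spreadsheet export works fine as the input:

```python
from datetime import date, timedelta

def hit_rate(accounts: list[str],
             observations: list[tuple[str, date, str]],  # (account, seen_on, source)
             as_of: date,
             window_days: int = 30) -> float:
    """Fraction of tracked accounts with at least one signal in the window."""
    if not accounts:
        return 0.0
    tracked = set(accounts)
    cutoff = as_of - timedelta(days=window_days)
    hits = {acct for acct, seen_on, _src in observations
            if acct in tracked and seen_on >= cutoff}
    return len(hits) / len(accounts)
```

Run it against your 50-account pilot at the end of the month; the hit rate tells you whether the three sources you picked are worth automating further.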
Buying intent data isn’t going away. You’ll probably still need a packaged provider or two. But the competitive edge in 2026 won’t come from spending more on the same data your competitors already have. It’s going to come from the signals you build yourself, and the ones nobody else thought to look for.
The post The intent data playbook is breaking down appeared first on MarTech.