You pull up your Google Ads dashboard and see 42 conversions. Then you check Meta and count 38. Your CRM says 27 deals closed. And Google Analytics tells yet another story entirely.
If your ad attribution data is not matching across platforms, you are not alone, and you are not going crazy. Discrepancies between ad platforms, analytics tools, and CRMs are one of the most common and most frustrating challenges digital marketers face today.
The root of the problem is straightforward: every platform uses its own attribution model, its own tracking methodology, and its own definition of what counts as a conversion. Add in browser privacy restrictions, iOS tracking limitations, ad blockers, and cross-device user journeys, and it is no surprise the numbers never line up perfectly.
Here is the good news. While you may never get every platform to report the exact same number, you can systematically identify the root causes of your discrepancies, fix the ones within your control, and build a single source of truth that gives you the confidence to make real budget decisions.
This guide walks you through a practical, step-by-step process to diagnose and resolve attribution data mismatches. Whether you are a solo media buyer or part of a larger marketing team managing campaigns across multiple channels, these steps will help you move from confusion to clarity.
By the end, you will have a repeatable framework for auditing your tracking setup, aligning your attribution windows, and ensuring the data feeding your ad platform algorithms is as accurate as possible. Let's get into it.
Before you can fix a discrepancy, you need to understand exactly what each platform is measuring and where in the user journey it is measuring it. Most attribution mismatches are not caused by broken tracking. They are caused by platforms measuring completely different things and being compared as if they were equivalent.
Start by creating a visual inventory of every tool in your stack that reports conversions. This typically includes your ad platforms (Google Ads, Meta Ads, LinkedIn, TikTok), your analytics tool (GA4), your CRM, and any standalone attribution platform you use. List them all out in a spreadsheet.
For each tool, document the following:
What event triggers the conversion: Does it fire on a form submission, a thank-you page load, a purchase confirmation, or a CRM stage change? A tool that fires on form submission will always report more conversions than your CRM, which only logs qualified opportunities.
Where in the user journey the tag fires: A pixel that fires on the landing page click is measuring something fundamentally different from one that fires on the order confirmation page. Document the exact URL or event that triggers each conversion.
What data is being passed: Is the platform receiving revenue values, order IDs, or just a basic conversion ping? Missing parameters mean you cannot reconcile revenue figures later.
Next, check for duplicate tracking. This is one of the most common causes of inflated numbers. A classic example: you have a Google Ads conversion tag firing on your thank-you page, and you also have a GA4 goal set up for the same page that is imported into Google Ads. Now every conversion is being counted twice. Use Google Tag Manager's preview mode or Google Tag Assistant to confirm how many times each tag fires per conversion event.
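If your conversion events carry a unique order or transaction ID, you can sanity-check for double counting with a few lines of code. This is an illustrative sketch, not a prescribed tool; the event shapes and source names are hypothetical.

```python
def dedupe_by_order(events: list[dict]) -> int:
    """Count conversions after collapsing duplicate fires that share an order ID."""
    return len({e["order_id"] for e in events})

# Hypothetical export: the same purchase was recorded by both a Google Ads tag
# and an imported GA4 goal, so the raw event count is inflated.
events = [
    {"order_id": "A-1", "source": "gads_tag"},
    {"order_id": "A-1", "source": "ga4_import"},  # duplicate of the purchase above
    {"order_id": "A-2", "source": "gads_tag"},
]

raw_count = len(events)            # 3 fires recorded
true_count = dedupe_by_order(events)  # 2 actual purchases
```

Comparing the raw fire count to the deduplicated count gives you a quick read on how much of your reported volume is double counting.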
Also look for cross-platform overlap. If a user clicks a Google ad and then a Meta ad before purchasing, both platforms will likely count that purchase as their own conversion. This is not a bug. It is how self-attribution works. Understanding this upfront prevents you from treating it as a tracking error.
The goal of this step is not to fix anything yet. It is to build a clear map of your tracking ecosystem so you know exactly what you are working with. A simple spreadsheet with columns for platform, conversion event, trigger page or action, and data passed is enough to get started.
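If you prefer something more structured than a spreadsheet, the same inventory can live as a small set of records that later reconciliation scripts can read. This is a sketch with hypothetical platforms and field names, not a required schema.

```python
from dataclasses import dataclass, field

@dataclass
class ConversionSource:
    """One row of the tracking inventory: what a platform counts as a conversion."""
    platform: str          # e.g. "Google Ads", "Meta Ads", "CRM"
    trigger_event: str     # e.g. "thank-you page load", "CRM stage change"
    trigger_location: str  # exact URL or event name that fires the conversion
    data_passed: list[str] = field(default_factory=list)  # e.g. ["revenue", "order_id"]

inventory = [
    ConversionSource("Google Ads", "thank-you page load", "/thank-you", ["order_id", "revenue"]),
    ConversionSource("Meta Ads", "Purchase event", "/thank-you", ["order_id", "revenue", "currency"]),
    ConversionSource("CRM", "stage change to Closed Won", "pipeline stage", ["deal_value"]),
]

# Flag sources that do not pass an order ID, since those cannot be reconciled later.
unreconcilable = [s.platform for s in inventory if "order_id" not in s.data_passed]
```

Even this tiny structure already surfaces a gap: any source in `unreconcilable` cannot be matched against revenue records downstream.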
Success indicator: You have a complete inventory showing every tracking tool, the exact event it fires on, and how data flows through your stack. You can explain, in plain language, why each platform's number is different from the others.
Here is where most attribution confusion actually lives. Even if your pixels are firing perfectly, comparing platforms with different attribution windows is like comparing apples to oranges. The numbers will never match, and that is by design.
Let's look at the defaults. Google Ads uses a 30-day click-through attribution window by default. Meta Ads defaults to a 7-day click and 1-day view-through window. This means Google is claiming credit for any conversion that happened within 30 days of a click, while Meta is claiming credit for conversions within 7 days of a click or 1 day of someone simply viewing an ad without clicking.
These are fundamentally different measurement standards. If you are comparing them side by side without adjusting for this, the comparison is meaningless. Understanding attribution windows in advertising is essential to making fair cross-platform comparisons.
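A small worked example makes the effect concrete. The journeys below are hypothetical, but they show how the exact same click log produces different totals under a 30-day versus a 7-day click window:

```python
from datetime import date

# Hypothetical (click_date, conversion_date) pairs for four users.
journeys = [
    (date(2024, 3, 1), date(2024, 3, 2)),   # converts 1 day after the click
    (date(2024, 3, 1), date(2024, 3, 6)),   # 5 days after
    (date(2024, 3, 1), date(2024, 3, 12)),  # 11 days after
    (date(2024, 3, 1), date(2024, 3, 25)),  # 24 days after
]

def conversions_within(window_days: int) -> int:
    """Count conversions credited under a click-through window of the given length."""
    return sum(1 for click, conv in journeys if (conv - click).days <= window_days)

# All 4 journeys land inside a 30-day window; only 2 land inside a 7-day window.
thirty_day = conversions_within(30)
seven_day = conversions_within(7)
```

Same users, same purchases, two different "correct" totals. This is the gap you see when Google's 30-day number sits next to Meta's 7-day number.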
To bring them into closer alignment, adjust your attribution window settings in each platform:
In Google Ads: Conversion windows are set per conversion action rather than per campaign. Go to the Conversions section under Tools, open the conversion action, and change the click-through conversion window to 7 days to match Meta's click window more closely. This will reduce your reported conversion count but make the comparison more apples-to-apples.
In Meta Ads: In Ads Manager, you can adjust the attribution setting at the ad set level or compare windows in the Attribution Setting column. Consider switching to 7-day click only if you want to remove view-through inflation from your comparison.
Speaking of view-through conversions: this is a major source of inflated numbers on Meta specifically. When view-through attribution is enabled, Meta counts a conversion if someone saw your ad and then converted within 1 day, even if they never clicked. For most direct response campaigns, turning off view-through attribution or at least reporting on click-only conversions gives you a more accurate picture of ad-driven results.
Beyond windows, standardize your conversion definitions across platforms. Ask yourself: what is a conversion for your business? Is it a lead form submission? A booked sales call? A closed deal in the CRM? A purchase? Each of these happens at a different stage of the funnel, and if different platforms are measuring different stages, the numbers will always diverge.
Define your primary conversion event and make sure every platform is measuring the same action. Secondary events (like page views or video plays) should be tracked for optimization purposes but not compared as conversions.
Important note: Changing attribution windows does not rewrite historical data. If you adjust your window today, the change applies going forward. Document the exact date you made the change so your team can account for it when reviewing trends over time.
Now that you understand what each platform should be measuring, it is time to verify that your tags are actually doing what you think they are. Pixel and tag errors are surprisingly common, and they silently distort your data for weeks or months before anyone notices.
Start with the debugging tools built into each platform:
Meta Pixel Helper: This Chrome extension shows you which Meta pixels are firing on any page, what events they are sending, and whether there are any errors. Open it on your thank-you page and confirm the correct event fires exactly once with the correct parameters.
Google Tag Assistant: Available as a Chrome extension or through GA4's DebugView, this tool shows you which Google tags are firing on each page and flags common implementation errors. Look for duplicate fires, missing parameters, or tags loading out of sequence.
Browser developer console: For a platform-agnostic check, open Chrome DevTools and watch the Network tab while completing a test conversion. You can filter by the tracking endpoint (e.g., "facebook.com/tr" for Meta or "google-analytics.com" for GA4) and confirm the request fires with the correct payload.
Common errors to look for during your audit:
Pixels on wrong pages: A conversion pixel accidentally placed on a product page instead of the confirmation page will fire every time someone views that product, massively inflating your conversion count.
Redirect chains stripping UTM parameters: If your landing pages redirect before the final page loads, UTM parameters can get dropped in the process. This breaks attribution at the source level and causes conversions to appear as direct traffic in GA4. If you are experiencing this, our guide on UTM parameters not tracking properly covers the most common fixes.
Tags loading before consent: If you operate in regions where cookie consent is required, tags that fire before a user accepts your consent banner are not just a compliance issue. They also create inconsistent data, since some users' conversions are tracked and others are not.
Missing event parameters: If your purchase event is not passing revenue, currency, and a unique order ID, you cannot reconcile your ad platform data against your actual revenue figures. Verify that every required parameter is included in every conversion event.
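Two of the errors above lend themselves to quick scripted spot-checks. The sketch below is illustrative: the required parameter names and the redirect URLs are hypothetical examples, not a fixed schema.

```python
from urllib.parse import urlparse, parse_qs

REQUIRED_PARAMS = {"revenue", "currency", "order_id"}  # adjust to your own event schema

def missing_params(event: dict) -> set[str]:
    """Return required parameters that are absent or empty on a conversion event."""
    return {p for p in REQUIRED_PARAMS if not event.get(p)}

def utms_surviving(redirect_chain: list[str]) -> set[str]:
    """Return the UTM parameters still present on the final URL of a redirect chain."""
    params = parse_qs(urlparse(redirect_chain[-1]).query)
    return {k for k in params if k.startswith("utm_")}

# Hypothetical redirect chain where the intermediate hop drops the query string:
chain = [
    "https://example.com/lp?utm_source=google&utm_medium=cpc",
    "https://example.com/landing",
]
dropped_utms = {"utm_source", "utm_medium"} - utms_surviving(chain)

# A purchase event missing its currency parameter would be flagged:
incomplete = missing_params({"revenue": 99.0, "order_id": "A-1001"})
```

Running checks like these against real test conversions turns "the numbers look off" into a specific, fixable finding.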
To test thoroughly, complete the full user journey yourself. Click an ad in incognito mode, navigate through your site, and complete the conversion action while monitoring tag fires in real time. This catches issues that only appear in the live user flow.
Success indicator: Every tag fires exactly once per conversion event, on the correct page, with all required parameters populated correctly. No duplicate fires. No missing values.
Even with a perfect pixel implementation, a meaningful portion of your conversions will go untracked at the browser level. This is not a setup error. It is a structural reality of the modern web, and it is a major reason why ad attribution data does not match across platforms.
The primary culprits are well-documented. Apple's App Tracking Transparency (ATT), introduced with iOS 14.5, requires apps to request explicit user permission before tracking activity across other companies' apps and websites. Many users opt out, which significantly reduces the conversion data that Meta and other app-based platforms can observe directly. This is also a key reason Facebook Ads stop tracking conversions accurately.
Safari's Intelligent Tracking Prevention (ITP) caps the lifespan of script-set first-party cookies at 7 days. Firefox's Enhanced Tracking Protection (ETP) applies similar restrictions. This means users who click an ad and convert more than 7 days later may not be attributed correctly in browser-based tracking systems.
Ad blockers add another layer of data loss. Users with ad blockers installed often block tracking pixels and analytics scripts entirely, making those conversions invisible to client-side tracking tools.
The result is that both Google and Meta now rely on modeled conversions to fill these gaps. Machine learning estimates the conversions that cannot be directly observed, based on patterns from users who do allow tracking. This means the numbers you see in your ad platform dashboards include a mix of observed and estimated data. It is one reason why platform-reported conversions often exceed what your CRM records as actual closed deals. Understanding first-party data tracking is critical to navigating this landscape.
The most effective solution to this problem is server-side tracking. Instead of relying on a browser-based pixel to fire and transmit data, server-side tracking sends conversion data from your server directly to ad platforms through their APIs. Because the data travels server to server, it bypasses browser restrictions, ad blockers, and cookie limitations entirely.
Google's server-side tagging via Google Tag Manager and Meta's Conversions API are the most widely adopted implementations of this approach. When set up correctly, they recover a significant portion of conversions that client-side pixels miss.
Beyond recovering lost data, server-side tracking allows you to enrich the conversion events you send back to ad platforms. By including hashed customer identifiers, order values, and other first-party data signals, you give Meta and Google's optimization algorithms better information to work with. Better data in means better targeting and optimization out.
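As a rough illustration of what "enriched, hashed identifiers" means in practice, here is how a server-side purchase event might normalize and hash an email before it is sent. The payload shape loosely mirrors the Conversions API convention of sending SHA-256-hashed user data, but treat the field names as illustrative and consult the platform's API reference for the exact schema.

```python
import hashlib
import time

def sha256_normalized(value: str) -> str:
    """Hash a user identifier after trimming and lowercasing, per common API conventions."""
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

def build_purchase_event(email: str, order_id: str, value: float, currency: str) -> dict:
    """Assemble an illustrative server-side purchase event with hashed PII."""
    return {
        "event_name": "Purchase",
        "event_time": int(time.time()),
        "user_data": {"em": sha256_normalized(email)},  # hashed, never raw PII
        "custom_data": {"order_id": order_id, "value": value, "currency": currency},
    }
```

The key point is the normalization step: " User@Example.com " and "user@example.com" must hash to the same value, or the platform cannot match the event back to a known user.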
This is exactly where Cometly's server-side tracking and Conversion Sync capabilities make a real difference. Cometly captures conversions that client-side pixels miss and feeds enriched, conversion-ready events back to Meta, Google, and other ad platforms. Instead of your campaigns optimizing on incomplete data, they are working with the most accurate signal available, which directly improves targeting quality and return on ad spend.
Here is an uncomfortable truth about ad platform reporting: every platform is inherently biased toward claiming credit for as many conversions as possible. Google wants to show that Google drove your results. Meta wants to show that Meta drove your results. When a user clicks both a Google ad and a Meta ad before converting, both platforms count that as their conversion. Neither is lying. They are just each telling their own version of the story.
This self-attribution bias means you cannot rely on any single ad platform's native reporting as your ground truth. If you are making budget decisions based on Meta's reported ROAS or Google's reported conversion volume alone, you are working with an optimistic and incomplete picture. This is precisely why marketing data accuracy matters for ROI.
The solution is a centralized attribution platform that sits above your individual ad channels and tracks the full customer journey independently. Rather than asking each platform "how many conversions did you drive?", a centralized system observes the entire sequence of touchpoints from first ad click to final conversion and assigns credit based on a consistent, unbiased methodology.
This is where multi-touch attribution becomes valuable. Unlike last-click attribution, which gives 100% of the credit to the final touchpoint before conversion, multi-touch models distribute credit across all the touchpoints that contributed to the outcome. For longer sales cycles where a prospect might interact with your brand across multiple channels over days or weeks, this gives you a far more accurate picture of what is actually working.
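To make the contrast with last-click concrete, here is a minimal sketch of a linear model, one of the simplest multi-touch approaches, which splits a conversion's credit evenly across every touchpoint. The channel names are hypothetical.

```python
def linear_credit(touchpoints: list[str]) -> dict[str, float]:
    """Distribute one conversion's credit equally across all touchpoints (linear model)."""
    share = 1 / len(touchpoints)
    credit: dict[str, float] = {}
    for channel in touchpoints:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# A journey that touched Google twice and Meta once.
# Last-click would hand Google Ads 100% of the credit; linear splits it 2/3 vs 1/3.
journey = ["Google Ads", "Meta Ads", "Google Ads"]
credit = linear_credit(journey)
```

Other multi-touch models (time-decay, position-based) change how the shares are weighted, but the principle is the same: no single touchpoint silently absorbs all the credit.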
Connecting your CRM data to your attribution system takes this one step further. Instead of optimizing toward conversion counts, you can optimize toward actual revenue. A lead form submission is not a closed deal. By connecting CRM stage data back to your ad touchpoints, you can see which campaigns are driving qualified pipeline and closed revenue, not just form fills.
Cometly is built specifically for this purpose. It connects your ad platforms, CRM, and website to track the entire customer journey in real time, giving you a single, independent view of which ads are actually driving revenue. Its AI-powered analytics surface high-performing campaigns across every channel and provide recommendations for where to scale and where to cut, based on revenue data rather than platform-reported metrics.
With Cometly as your single source of truth, you are no longer reconciling conflicting dashboards or guessing which platform's numbers to trust. You have one place where all your attribution data lives, analyzed together, so you can make budget decisions with confidence.
Fixing your attribution setup is not a one-time project. Tracking breaks. Platforms update their algorithms. Developers push landing page changes that accidentally remove pixels. Privacy regulations evolve. Without a regular audit process, you will find yourself back in the same position months from now, staring at numbers that do not add up and not knowing when things went wrong.
Build a recurring data reconciliation routine into your team's workflow. A weekly or biweekly check does not need to take long. The goal is to compare conversion counts across your key platforms and flag any discrepancies that exceed your acceptable variance threshold. A variance of 10 to 20 percent between platforms is often normal given attribution window differences and modeled data. A variance of 50 percent or more usually signals a tracking problem worth investigating. For a deeper dive into common causes, see our guide on solving attribution data discrepancies.
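A reconciliation check like this is easy to script. The sketch below mirrors the rules of thumb above; the thresholds are starting points to tune for your own stack, not fixed standards.

```python
def variance_pct(count_a: int, count_b: int) -> float:
    """Percent gap between two platforms' conversion counts, relative to the larger one."""
    larger = max(count_a, count_b)
    return abs(count_a - count_b) / larger * 100 if larger else 0.0

def worth_investigating(count_a: int, count_b: int, threshold: float = 50.0) -> bool:
    """True when the gap exceeds the threshold that usually signals a tracking problem."""
    return variance_pct(count_a, count_b) > threshold

# The numbers from the intro: 42 Google conversions vs 38 Meta conversions
# is under 10% variance, which is well within normal cross-platform drift.
gap = variance_pct(42, 38)
```

Run this weekly against each pair of platforms in your inventory and you get a short, objective list of discrepancies to dig into, instead of a gut feeling.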
Set up monitoring to catch issues early:
Conversion volume alerts: In Google Ads and Meta, you can set up automated alerts that notify you when conversion volume drops below a defined threshold. A sudden drop in reported conversions often means a pixel stopped firing, a tag manager change broke something, or a landing page update removed a tracking script.
GA4 anomaly detection: GA4 includes built-in anomaly detection that surfaces unusual changes in your data. Check the Insights section regularly for automated alerts about unexpected shifts in conversion behavior.
A shared tracking changelog: This is one of the most underrated practices in attribution management. Every time someone makes a change that could affect tracking (new pixels added, tag manager updates, landing page changes, attribution window adjustments, new ad campaigns launched), document it in a shared log with the date, the change made, and who made it. When a discrepancy appears in your data, you can trace it back to a specific change rather than spending hours guessing.
When major platform updates roll out, such as changes to Meta's attribution model, Google's conversion tracking methodology, or browser privacy updates, treat it as a trigger for a proactive audit rather than waiting for your data to break and then investigating. Investing in reliable revenue attribution tracking tools makes this process significantly easier to manage at scale.
The same applies to significant changes in your own marketing stack. Launching a new landing page, switching to a new CRM, or adding a new ad channel all create opportunities for tracking gaps to appear. Audit proactively whenever your stack changes.
Success indicator: You catch tracking issues within days of them occurring, rather than discovering weeks-old data gaps during a monthly reporting cycle. Your team has a shared changelog and a defined process for reconciling data regularly.
Fixing ad attribution data mismatches is not about achieving perfect agreement between every platform. Some level of discrepancy is normal, expected, and built into how these systems work. The goal is to minimize the preventable gaps, understand the unavoidable ones, and build enough confidence in your data to make bold budget decisions.
Here is a quick-reference checklist summarizing the six steps covered in this guide:
Step 1: Map your tracking stack. Document every platform reporting conversions, what event triggers each conversion, where in the user journey it fires, and whether duplicate tracking exists.
Step 2: Align attribution windows and conversion definitions. Standardize windows across platforms where possible, disable view-through attribution if it is inflating Meta numbers, and ensure every platform is measuring the same conversion event.
Step 3: Audit your pixel and tag implementation. Use Meta Pixel Helper, Google Tag Assistant, and browser DevTools to verify every tag fires once, on the correct page, with the correct parameters. Test the full user journey in incognito mode.
Step 4: Address privacy-driven data loss. Implement server-side tracking via Meta's Conversions API and Google's server-side tagging to recover conversions lost to iOS restrictions, ITP, and ad blockers. Feed enriched data back to ad platforms to improve their optimization algorithms.
Step 5: Build a single source of truth. Stop relying on self-reported platform data for budget decisions. Use a centralized attribution platform with multi-touch models and CRM integration to see which ads are actually driving revenue.
Step 6: Create a recurring audit process. Set conversion volume alerts, maintain a tracking changelog, and run regular data reconciliation checks so tracking issues surface within days rather than weeks.
The combination of server-side tracking, a centralized attribution platform, and a consistent audit routine creates a system where you can genuinely trust your numbers. Not because every platform agrees, but because you understand exactly why they differ and you have an independent source of truth that connects ad spend to actual revenue.
If you are ready to stop reconciling conflicting dashboards and start making decisions from a single, accurate view of your marketing performance, get your free demo of Cometly today. See how it connects your ad platforms, CRM, and website to show exactly which ads are driving revenue, with AI-powered recommendations to help you scale what is working with confidence.