
For the last eighteen months, the most consistent answer we have given small-business clients asking “how do we measure AI search?” has been some version of “imperfectly.” You could squint at Search Console for AI-feature impressions, spot-check ChatGPT for branded mentions, parse server logs for AI crawlers, and stitch all of that together into a half-coherent report. What you could not do was point at a single dashboard and say, “this is your AI citation surface, and here is what changed this week.”
That gap just narrowed. On May 15, 2026, Microsoft rolled out an AI citations dashboard inside its free Microsoft Clarity product, according to Search Engine Land's coverage of the launch. Clarity is free, requires no paid analytics seat, and the new dashboard shows — for the first time in a single view — when your URLs are being cited by AI answer engines. This post is the plain-English explainer of what it shows, how to get it running in ten minutes, where it fits next to GA4's new AI Assistant channel, and what it honestly cannot see.
Key Takeaways
- Microsoft Clarity's new AI citations dashboard rolled out on May 15, 2026, inside the free Clarity product — no paid analytics seat required.
- The dashboard surfaces citation counts, cited URLs, the citing AI engine where attributable, and trend over time.
- Installation is a 10-minute task via a script tag, a Google Tag Manager container, or the official Clarity WordPress plugin.
- Clarity measures citations — your URL was referenced by an AI engine; GA4's AI Assistant channel measures referral traffic — a real user clicked through. They are complementary, not substitutes.
- Clarity attributes citations only from AI engines that send identifiable referrers. A meaningful share of ChatGPT and Perplexity traffic in 2026 still does not, so the dashboard underrepresents your real citation footprint.

What does the new Clarity dashboard actually show?
Strip the marketing language and the dashboard is built on a single primitive: a citation event. A citation event is logged when an AI answer engine references a specific URL on your site, attributable through the referrer string, the engine's documented citation feed, or — in Clarity's case — a combination of those signals plus Microsoft's own bot-detection layer. The dashboard rolls those events up into four views, per the Search Engine Land coverage:
- Citation count over a selectable window (day, week, month).
- Cited URLs — which specific pages on your site were referenced, ranked by citation volume.
- Citing AI engine where attributable — for example, citations from Microsoft Copilot, Bing Chat, Perplexity, and a smaller set of others. ChatGPT citations attribute in some configurations and not in others.
- Trend lines showing citation velocity and shifts week over week.
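Conceptually, the primitive underneath all four views is just a small record per citation. The object below is our own illustrative mental model of that event, not Clarity's actual schema, which Microsoft does not publish:

```javascript
// Illustrative model of a citation event -- NOT Clarity's internal
// schema. Every field name here is our own naming for the concepts
// described in the coverage of the launch.
const exampleCitationEvent = {
  url: "https://example.com/blog/lease-structures", // the cited page
  engine: "Copilot", // citing engine, where attributable
  observedAt: "2026-05-15T14:03:00Z", // when the citation was observed
  attribution: "referrer", // referrer | citation feed | bot-detection
};
```

Holding that shape in mind makes the four views obvious: counts are events per window, cited URLs group by `url`, engine breakdowns group by `engine`, and trends bucket `observedAt` by week.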
Just as important is what the dashboard does not show: whether the citation produced a click. A citation can sit inside an AI answer with your name on it and never send a single user to your site. That is normal — AI answers are designed to satisfy intent inline — and it is one of the reasons AI search measurement has stayed messy. The right framing is to treat citations as visibility and clicks as yield. Clarity now covers the first half; GA4's AI Assistant channel covers the second.
There is also a Clarity-specific advantage worth naming. Clarity already records session replays, heatmaps, and behavior data for actual visits. That means when a user does click through from an AI surface, you can replay their session in the same product — see what they did, how long they stayed, whether they scrolled, where they bounced. That is a meaningful upgrade over GA4 alone, where you see the visit count and the conversion but not the session itself.
For a deeper look at why citation visibility is now load-bearing for small-business growth, our Information Gain Audits post covers what makes a page worth citing in the first place — Clarity gives you the measurement; the audit gives you the why.
How do you install Clarity in 10 minutes?
Three install paths cover virtually every small-business site. Pick whichever matches your stack, and the AI citations dashboard appears in your Clarity account within a few hours of receiving its first qualifying traffic.
Path 1: Script-tag install
The simplest path. Sign up at clarity.microsoft.com with a Microsoft account, create a new Clarity project using your site's URL, and copy the project's tracking script from the Setup page. The script is a single <script> tag, well under a kilobyte minified, that goes inside the <head> of every page on your site. On a Next.js site you put it in the root layout; on a hand-rolled site you put it in your global header include; on a Webflow or Squarespace site you paste it into the platform's custom-code field. Once the tag is live and a session has fired, the Clarity project goes from “Waiting for data” to active, typically inside an hour.
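For reference, the tag you copy from the Setup page looks roughly like the version below. Copy the exact snippet from your own project rather than this one, since the tag embeds your project ID (shown here as a placeholder):

```html
<!-- Microsoft Clarity tracking tag: paste inside <head> on every page.
     "XXXXXXXXXX" is a placeholder for your real project ID. -->
<script type="text/javascript">
  (function (c, l, a, r, i, t, y) {
    c[a] = c[a] || function () { (c[a].q = c[a].q || []).push(arguments); };
    t = l.createElement(r); t.async = 1; t.src = "https://www.clarity.ms/tag/" + i;
    y = l.getElementsByTagName(r)[0]; y.parentNode.insertBefore(t, y);
  })(window, document, "clarity", "script", "XXXXXXXXXX");
</script>
```

Note the `async` attribute set inside the loader: the remote script downloads in the background, which is why the tag does not block page render.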
The script is asynchronous and does not block page render. On well-built sites we see no measurable Core Web Vitals impact from Clarity; on heavy sites that already ship a dozen tags, it adds about as much overhead as a typical analytics tag — small, but not zero. Test before and after on real devices if Core Web Vitals are tight.
Path 2: Google Tag Manager container
If you already have GTM running, use it. Microsoft maintains an official Clarity Custom Template in the GTM gallery — search “Microsoft Clarity” inside Tag Manager's Templates tab, import it, then create a tag using the template, paste your Clarity project ID, and fire it on All Pages. Publish the container. This path is the cleanest in production because the tag is governed by GTM's version history; if you ever need to roll back, you roll back the container.
A common gotcha: if your GTM container is already at the page-load performance ceiling — usually because of two or three poorly chosen marketing tags — adding Clarity makes it visibly worse. Audit your container before adding another tag.
Path 3: WordPress plugin install
If you run WordPress, install the official Microsoft Clarity plugin from the WordPress repository. The plugin is maintained by Microsoft, not a third party, which is the version you want. Activate, paste your Clarity project ID, save. The plugin injects the tag into the front-end <head> automatically. Test in an incognito window — you should see Clarity's session begin in your project within five minutes.
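A faster sanity check than waiting for the first session: the official snippet creates a global `window.clarity` queue function before the remote script even finishes loading, so its presence tells you the tag is installed. The `isClarityLoaded` helper below is our own illustrative wrapper, not part of Clarity's API:

```javascript
// The official Clarity snippet defines window.clarity as a queue
// function immediately, so checking for it is a reliable
// "tag is installed" signal. Illustrative helper, not a Clarity API.
function isClarityLoaded(win) {
  return typeof win.clarity === "function";
}

// In the browser console on your site:
//   isClarityLoaded(window)   // true once the tag has run
```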
We recommend the plugin path for clients running WordPress because it survives theme switches and updates better than a manually pasted tag. The script-tag and GTM paths are equally valid; pick the one that matches your stack.

Clarity vs GA4's AI Assistant channel: which one do I need?
You need both, but they answer different questions. The simplest way to explain the difference is that Clarity measures the supply side — your content was referenced by an AI — while GA4's AI Assistant channel measures the demand side — a real human clicked through from an AI surface. Search Engine Land's coverage of the GA4 AI Assistant rollout walks through the GA4 side; our GA4 AI Assistant channel post covers the implementation details for small businesses.
Here is the comparison we now run with clients:
| Dimension | Microsoft Clarity (AI citations) | GA4 AI Assistant channel |
|---|---|---|
| Price | Free | Free |
| Measures | AI engines referencing your URLs (visibility) | Real users clicking from AI surfaces (yield) |
| Strongest signal | Citation counts, cited URLs, citing engine | Sessions, conversions, revenue from AI traffic |
| Weakest spot | Misses AI citations from engines that don't send identifiable referrers | Misses citations that never produce a click |
| Session-level detail | Yes — Clarity records the full session replay | No — only standard GA4 event data |
| Where it lives | clarity.microsoft.com project | analytics.google.com property |
| Install effort | 10 minutes via script, GTM, or plugin | 0 minutes if GA4 already installed; new “AI Assistant” default channel grouping |
Used together, the two tools answer the small-business question the old setup could not: which of our pages are AI engines citing, and which of those citations are actually sending us business? That is the question that drives content prioritization for the next twelve months.
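Neither product exposes that merged view directly, but the join is simple once you export cited-URL counts from Clarity and AI Assistant sessions from GA4. A minimal sketch, with hypothetical data shapes and a "yield" metric that is our own framing rather than anything either dashboard reports:

```javascript
// Illustrative join of a Clarity cited-URL export with a GA4
// AI Assistant channel export, both reduced to { url: count } maps.
// "Yield" is AI-referred sessions per citation -- our own metric.
function citationYield(citations, aiSessions) {
  const rows = [];
  for (const [url, cited] of Object.entries(citations)) {
    const sessions = aiSessions[url] || 0;
    rows.push({ url, cited, sessions, yield: sessions / cited });
  }
  // Highest-yield pages first: the pages that get cited AND clicked.
  return rows.sort((a, b) => b.yield - a.yield);
}
```

Pages with high citations and zero sessions are your visibility-without-yield pages, which is exactly the prioritization signal the paragraph above describes.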
There is a tertiary tool worth naming. Server log file analysis is the underlying truth for which AI crawlers are hitting your site, and our log file analysis for AI crawlers post walks through how to read the logs. Logs catch the AI crawls that never become citations, plus the click-throughs whose referrers Clarity cannot attribute: a broader raw record than Clarity's, in exchange for more setup effort. Most small businesses do not need to add logs to their measurement stack on day one; once the Clarity + GA4 pairing is running, logs are the next sensible upgrade.
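As a taste of what the log side involves, here is a minimal sketch that tallies requests from known AI crawlers across standard access-log lines. The user-agent tokens listed are the published names for those crawlers; the function itself is illustrative, and the token list is a starting point rather than a complete inventory:

```javascript
// Count requests per known AI crawler user-agent token in an array
// of access-log lines (combined log format or similar).
const AI_CRAWLER_TOKENS = ["GPTBot", "PerplexityBot", "ClaudeBot", "bingbot"];

function countAiCrawlerHits(logLines) {
  const counts = Object.fromEntries(AI_CRAWLER_TOKENS.map((t) => [t, 0]));
  for (const line of logLines) {
    for (const token of AI_CRAWLER_TOKENS) {
      if (line.includes(token)) counts[token] += 1;
    }
  }
  return counts;
}
```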

Fort Wayne SMB walkthrough: what an owner-admin actually sees in week one
Here is the concrete version, anchored to the kind of business we work with. Imagine a Fort Wayne professional-services firm — a small commercial real-estate brokerage with seven licensed agents and a marketing site that publishes one blog post a month.
Saturday morning. The owner-admin signs up for Clarity, pastes the tracking script into the site's global header, and pushes the change. By Saturday afternoon the project shows active sessions but the AI citations panel reads “No data yet.”
Sunday and Monday. Sessions are being recorded. Heatmaps populate. The AI citations panel still reads “No data yet.” This is normal — AI citations accumulate slowly for a site this size, and the dashboard needs a meaningful sample before it starts surfacing results.
By the end of week one. The AI citations panel typically shows single-digit weekly citations for a mature but unknown local business — perhaps 3 to 8 citations across the week, mostly from Microsoft Copilot, with one or two from Perplexity. The citing-engine breakdown will be heavy on Copilot because Microsoft's own engines attribute citations cleanly inside Clarity. A new site might show zero in week one and a few in week two.
Week two and onward. A trend line appears. The cited URLs panel starts to differentiate which of your pages are getting cited — usually one or two “anchor” pages do most of the work, and the long tail is shallow. This is the signal that matters for content prioritization. If a page about Allen County commercial lease structures is getting cited and your homepage is not, you know which page to invest in next.
What you will not see in week one: large citation volumes, ChatGPT citations in every row, or a dashboard full of activity. That is not a setup failure; that is what the citation surface honestly looks like for a small-to-mid-size local business with modest content depth. A national publisher will see thousands of weekly citations; a Fort Wayne brokerage will see single digits. Both are correct numbers for their respective sites. The trap is comparing your dashboard to a hypothetical “average citation count” — there is no useful average, because the distribution is dominated by a small number of national sources.
If your dashboard reads zero after three weeks of meaningful traffic, the most common reasons are: (1) your content does not contain the specific factual surfaces AI engines quote from, (2) your structured data is missing or wrong, or (3) your content is genuinely thin. We work through the diagnostic process for each of those in our Intent gap analysis post and the Information Gain Audits post.
What Clarity misses — and how to set expectations
This is the section that protects you from over-relying on the dashboard. The honest framing for Clarity's AI citations data is a meaningful subset of your real citation footprint, with reasonable bias in the data Microsoft is best positioned to measure. Concretely:
- ChatGPT citation attribution is partial. ChatGPT serves citations in multiple modes — web-search citations, in-line links, search-via-Bing citations, and others. Some of these produce identifiable referrers; some do not. The ones that do not are invisible to Clarity. Treat Clarity's ChatGPT count as a floor, not a ceiling.
- Anthropic Claude citations are similarly partial. When Claude cites a source in a chat response, the citation typically does not produce a referrer event Clarity can attribute unless the user clicks through to your site with the referrer header intact.
- Google AI Overviews and AI Mode are not in scope. Clarity's dashboard is focused on the engines Microsoft's own bot-detection and partnership integrations can attribute. AI Overview citations live in Google Search Console's standard performance reports, not in Clarity.
- Multi-step agent flows often drop the referrer. When an autonomous agent uses your content to compose a response — covered in our agentic engine optimization playbook — the final user-facing surface may or may not preserve the citation in a form Clarity can see.
- Bot vs human filtering is conservative. Microsoft is correctly cautious about labeling traffic as a citation if it could be a bot crawl. That means real citations from emerging engines may take weeks to show up while Microsoft's heuristics catch up.
The right way to use the dashboard given those limits is as a trend, not as a total. The week-over-week movement, the relative ranking of cited URLs, and the directional split between citing engines are reliable. The absolute citation count is an under-count. Combining Clarity's data with GA4's AI Assistant channel and — once you outgrow those two — server log analysis is the way to get to a fuller picture. Search Engine Land's coverage of GEO metrics for 2026 covers the broader measurement framework well.

A Fort Wayne local angle: why free measurement matters here
Most Fort Wayne and Northeast Indiana small businesses we work with cannot justify a paid analytics seat beyond GA4. We have audited dozens of NE Indiana marketing stacks, and the most common shape we see is: GA4 (free), Search Console (free), one paid SEO tool (Ahrefs, Semrush, or Moz), and nothing else. The “AI search measurement” line item in that budget is usually zero — not because the work does not matter, but because every paid product we know of in that category starts at a monthly price most owner-operated businesses are unwilling to add.
Clarity changes that math. The AI citations dashboard is free, the tool is genuinely useful for non-AI purposes (session replays and heatmaps are valuable on their own), and the install is short enough that a marketing-savvy owner can do it on a Saturday afternoon. That makes it the first AI-search measurement tool we recommend to every Fort Wayne client, regardless of size.
For a 5-to-25-person business in Auburn, Huntertown, New Haven, or Fort Wayne, the practical sequence is:
- Install Clarity this week. Pair it with GA4's AI Assistant channel (already free, already in your property).
- Let three weeks of data accumulate. Resist the temptation to draw conclusions from week one.
- At the end of week three, pull the top five cited URLs and ask: do these match the pages we want AI engines to be citing? If yes, double down on those pages. If no, the content priority shifts.
- Re-audit quarterly.
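Step three in that sequence is just a set comparison, which you can sketch as a small helper. Both inputs come from you — the cited-URL export from Clarity and your own list of priority pages — and the function is illustrative:

```javascript
// Compare the pages you want cited against the pages Clarity says
// are being cited. Returns the priority pages with no citations yet:
// the content-investment list for the next quarter.
function uncitedPriorityPages(priorityUrls, citedUrls) {
  const cited = new Set(citedUrls);
  return priorityUrls.filter((url) => !cited.has(url));
}
```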
This is the same cadence we walk Button Block clients through. Free tools, real measurement, no big-tool dependency.

What we'd recommend for Button Block clients
If you already have Clarity installed for session replay — many of our clients do — the AI citations dashboard appeared in your project automatically on May 15, 2026. Go look. The dashboard is in the left navigation; click in, set the window to the last 30 days, and check whether your top cited URLs are the ones you would have predicted. If they are not, you have a content priority to revisit.
If you do not have Clarity installed, this is the cleanest week to install it. The dashboard is genuinely new, the install is short, and there is no recurring cost. Pair the install with the AI bot traffic surge post framing — Clarity sees the human-clickable citation surface; that post covers the upstream crawl surface, which is increasingly distinct.
If you would like Button Block to install Clarity, audit your existing analytics stack, or build an AI-search measurement dashboard that integrates Clarity, GA4, Search Console, and server logs, reach out. Our Answer Engine Optimization engagements typically include this measurement setup as part of the foundation.
Want a unified AI-search measurement dashboard?
Button Block can install Clarity, set up GA4's AI Assistant channel, and integrate the two with Search Console and server logs so you have one place to see your AI citation surface and AI referral traffic together. Most small-business installs finish in a week.
Frequently Asked Questions
- Is Microsoft Clarity really free for small businesses?
- Yes. Microsoft Clarity is free with no usage caps that meaningfully constrain small-business sites — session replays, heatmaps, and the new AI citations dashboard are all included. Microsoft's documentation describes Clarity as free without an enterprise upsell, and our experience with the product has matched that.
- Does Clarity capture every AI citation of my content?
- No, and you should not expect it to. Clarity attributes citations from AI engines that send identifiable referrers or that Microsoft has direct integration with — strongest for Microsoft Copilot and Bing Chat, partial for ChatGPT and Perplexity, weakest for closed agent flows that drop the referrer. Treat the dashboard count as a directional floor, not an absolute total.
- How is Clarity different from GA4's AI Assistant channel?
- Clarity measures citations — when your URL was referenced by an AI engine, whether or not a user clicked. GA4's AI Assistant channel measures referral traffic — when a real user clicked through from an AI surface and landed on your site. The two are complementary; use both.
- How long until I see meaningful data in the Clarity AI citations dashboard?
- Most small-business sites need two to three weeks of accumulated data before the trend lines and cited-URL breakdowns become reliable. Week one typically shows single-digit citations for a mature local site and zero for a brand-new site. Resist drawing conclusions from week one; the citation surface is sparse for small businesses, and a small sample is normal.
- Will adding Clarity slow down my site?
- It usually does not. Clarity's tracking script is asynchronous, small, and well-built. On most small-business sites the impact on Core Web Vitals is unmeasurable. On sites already running a heavy tag stack, adding Clarity will worsen what is already a thin margin — audit your tag stack before installation if Core Web Vitals are tight, and consider using Google Tag Manager to manage the script.
- Should I install Clarity, GA4 AI Assistant, and log file analysis all at once?
- For most Fort Wayne small businesses we work with, the answer is no — start with Clarity and GA4 AI Assistant, both free, and add log file analysis only after the first two are running and producing decisions.
- What if my Clarity dashboard shows zero AI citations after a month?
- Three causes are most common: (1) your pages do not contain the specific factual surfaces AI engines quote from — short, sourced, clearly answered questions; (2) your structured data is missing, malformed, or fabricated; or (3) your content is genuinely thin relative to the citation competition for your queries.
Sources & Further Reading
- Search Engine Land: searchengineland.com/microsoft-clarity-citations-dashboard-rolls-out-477663 — Microsoft Clarity citations dashboard rolls out (2026-05-15).
- Microsoft: clarity.microsoft.com — Microsoft Clarity product.
- Microsoft Learn: learn.microsoft.com/en-us/clarity/setup-and-installation/clarity-overview — Microsoft Clarity overview.
- Search Engine Land: searchengineland.com/google-analytics-ai-assistant-477544 — Google Analytics AI Assistant channel (2026-05-14).
- Search Engine Land: searchengineland.com/ai-bot-traffic-surged-publishers-report-473900 — AI bot traffic surged, hitting publishers hardest (2026-04-08).
- Search Engine Land: searchengineland.com/geo-metrics-to-track-476642 — 8 GEO metrics to track in 2026 (2026-05-07).
- Search Engine Land: searchengineland.com/log-file-analysis-ai-crawlers-search-visibility-474428 — Why log file analysis matters for AI crawlers (2026-04-16).
