Troubleshooting & Debugging
Diagnose and fix the most common issues with Tailor experiments and tracking.
Verify Tailor Is Running
The fastest way to check if the Tailor script is installed and active on a page is the healthcheck overlay.
Healthcheck Overlay
Append ?t_healthcheck to any page URL. If the Tailor script is present, a debug overlay will appear in the lower-right corner of the page showing script status, experiment assignment, and targeting details.
If the overlay doesn't appear, the Tailor tag is either not installed on the page, blocked by a Content Security Policy (CSP), or blocked by a consent manager.
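If you're checking many pages, a small helper can append the healthcheck flag without clobbering existing query parameters or the hash. This is a sketch using the standard URL API; only the ?t_healthcheck flag itself comes from Tailor, and the helper name is our own.

```javascript
// Append the ?t_healthcheck flag to a page URL, preserving existing
// query parameters and any hash fragment. Helper name is illustrative.
function withHealthcheck(pageUrl) {
  const url = new URL(pageUrl);
  url.searchParams.set("t_healthcheck", ""); // bare flag; no value needed
  return url.toString();
}

withHealthcheck("https://example.com/pricing?utm_source=ads#plans");
// → "https://example.com/pricing?utm_source=ads&t_healthcheck=#plans"
```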
No Data Showing Up
If your experiment shows zero impressions or conversions, work through this decision tree:
1. Is the variant receiving traffic?
- Check that allocation is greater than 0% (not fully deramped)
- Confirm targeting rules match real traffic (UTMs present after redirects, geo/device correct)
- Check for priority conflicts with other experiments on the same page
2. Is the goal firing?
- Run an event parity test: trigger the conversion action once on control, once on the variant, and confirm the event fires identically
- Check for duplicate events, SPA route change issues, or consent blocking
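The parity test in step 2 can be scripted once you've captured the events each arm fires (for example, from your analytics debugger's export). The sketch below assumes each log is a simple array of event names; your actual export format will differ.

```javascript
// Compare event logs captured on control vs. variant during a parity
// test. Reports events that are missing, duplicated, or variant-only.
// The log format (array of event-name strings) is an assumption.
function compareEventLogs(controlEvents, variantEvents) {
  const count = (events) =>
    events.reduce((m, e) => m.set(e, (m.get(e) || 0) + 1), new Map());
  const control = count(controlEvents);
  const variant = count(variantEvents);
  const issues = [];
  for (const [name, n] of control) {
    const v = variant.get(name) || 0;
    if (v === 0) issues.push(`${name}: missing on variant`);
    else if (v !== n) issues.push(`${name}: fired ${n}x on control, ${v}x on variant`);
  }
  for (const name of variant.keys()) {
    if (!control.has(name)) issues.push(`${name}: only fired on variant`);
  }
  return issues; // empty array = parity holds
}
```

An empty result means the goal fires identically on both arms; any entry points at a duplicate-event or missing-event problem to chase down.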
Wrong Experience Showing
If visitors are seeing the wrong variant, check these causes in order of likelihood:
1. Overlapping targeting rules: multiple experiments match the same visitor. The most specific rule takes priority, but ambiguity can cause unexpected results. Narrow your rules to a single value to test.
2. Priority ordering: if multiple experiments are on the same page, verify your most specific rule outranks the general one.
3. CDN caching stale content: the CDN may be serving an old version. Clear your browser cache and try incognito.
4. UTMs missing or rewritten: redirects, vanity URLs, and privacy tools can strip or rewrite UTM parameters before they reach the page.
5. Device/geo differences: test on the same device type and location as your target audience, or use preview mode to force the variant.
Quick Fix (5 Minutes)
Force UTMs/params in your URL, narrow the targeting rule to a single value, and test in an incognito window. Use ?preview_mode=treatment to verify the variant renders correctly.
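The quick fix above can be turned into a one-off helper that forces the UTMs your targeting rule expects and enables preview mode in a single test URL. Only preview_mode=treatment comes from Tailor; the UTM values are examples.

```javascript
// Build a test URL that forces specific UTM parameters and enables
// Tailor's preview mode. UTM values here are illustrative.
function buildTestUrl(pageUrl, utms) {
  const url = new URL(pageUrl);
  for (const [key, value] of Object.entries(utms)) {
    url.searchParams.set(key, value);
  }
  url.searchParams.set("preview_mode", "treatment");
  return url.toString();
}

buildTestUrl("https://example.com/landing", { utm_source: "google", utm_campaign: "spring" });
// → "https://example.com/landing?utm_source=google&utm_campaign=spring&preview_mode=treatment"
```

Open the resulting URL in an incognito window so a sticky assignment from a previous session can't mask the result.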
Extension Says "No Tag Found"
If the Tailor Chrome extension can't detect the tag on your page:
First, try the healthcheck
Append ?t_healthcheck to the page URL. If the overlay appears, the tag is installed but the extension may need a refresh or update.
If the overlay doesn't appear
The tag is not loading on the page. Check your Google Tag Manager (GTM) configuration, Content Security Policy (CSP), and consent manager settings. The Tailor script may be blocked.
If the overlay appears but the extension fails
Try incognito mode, verify you're logged into the correct Tailor workspace, and check that the extension is up to date. See the Extension Update Guide.
Performance Drop After Launch
If metrics dip after starting a Tailor experiment, don't panic. Here's how to triage:
Fast Rollback
Deramp the experiment immediately (set allocation to 0% for the variant). This sends 100% of traffic back to the control. Confirm metrics stabilize before investigating further.
Possible causes to investigate:
- Broken tracking: tags or events were added, changed, or duplicated during the same period
- Variant breaking UX: the tailored experience has a layout or functionality issue on certain devices
- Traffic mix shift: campaigns, keywords, or audiences changed at the same time the experiment launched
- Consent or ad blocker changes: consent mode updates or ad blocker list changes can affect tracking coverage
CAC Up, CVR Down: What Changed?
When costs rise and conversion drops, work through these causes in order:
1. Tracking integrity: any tag changes, duplicate events, consent issues, or conversion definition changes?
2. Traffic mix shifts: did campaigns, keywords, audiences, geo, or device mix change?
3. Page changes: any deploys, speed regressions, or outages during this window?
4. Offer/message mismatch: does the ad promise something the landing page doesn't deliver?
5. Attribution window changes: did the ads platform update its attribution settings or model?
Numbers Differ Between Platforms
It's normal for Tailor, GA4, and your ads platform to report different numbers. Each tool counts differently (visitor assignment vs. session vs. click), handles consent differently, and uses different attribution windows. Focus on the relative lift between control and treatment within Tailor's own consistent measurement, not absolute numbers across tools.
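Relative lift is the comparison that stays meaningful across tools, because both arms are measured the same way inside Tailor. A minimal calculation, with made-up figures:

```javascript
// Relative lift between control and treatment, computed from one
// tool's numbers so both arms share a single measurement methodology.
// The example figures below are invented.
function relativeLift(controlConversions, controlVisitors, treatmentConversions, treatmentVisitors) {
  const controlCvr = controlConversions / controlVisitors;     // e.g. 50/1000 = 5%
  const treatmentCvr = treatmentConversions / treatmentVisitors; // e.g. 66/1100 = 6%
  return (treatmentCvr - controlCvr) / controlCvr;
}

relativeLift(50, 1000, 66, 1100); // ≈ 0.2, i.e. a 20% relative lift
```

A 20% relative lift in Tailor remains a 20% lift even when GA4 and your ads platform report different absolute conversion counts.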
Seeing Control When Expecting Treatment
If you're seeing the original page when you expect the tailored version:
- Try an incognito window (sticky assignment from a previous session may persist)
- Clear cache and hard refresh (Cmd+Shift+R / Ctrl+Shift+R)
- Check that the experiment is active and ramped (not deramped to 0%)
- Verify targeting rules match your current context (UTMs, device, geo)
- Use ?preview_mode=treatment to force the variant
Flash of Original Content (FOUC)
It's normal to see a brief flash (under 100ms) of the original page before Tailor applies changes. This is usually imperceptible. The flash may be more noticeable on slow networks, heavy pages, or with complex modifications.
If the flash is noticeable to visitors, reach out to support@tailorhq.ai and we can work on optimizations specific to your setup.
Is Tailor Breaking My Tracking?
To confirm Tailor isn't interfering with your existing analytics, run a simple parity test: trigger the same conversion action on both the control and treatment versions. If the events fire identically in your analytics platform, Tailor is not affecting tracking.
If you see differences, check whether Tailor's DOM changes affect the element your analytics tool is listening to (e.g. a changed button ID or class name). Adjust the tailored page to preserve the original tracking selectors.
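One way to audit this is to list the selectors your analytics tool listens to, collect which of them still resolve on the tailored page (for example, by checking each with document.querySelector in the browser console), and diff the two lists. The selector values below are illustrative, not from your setup.

```javascript
// Report tracking selectors that no longer exist on the tailored page.
// trackedSelectors: selectors your analytics tool binds to.
// presentSelectors: the subset found on the variant (e.g. collected in
// the browser console). Both selector lists here are illustrative.
function findLostSelectors(trackedSelectors, presentSelectors) {
  const present = new Set(presentSelectors);
  return trackedSelectors.filter((s) => !present.has(s));
}

findLostSelectors(["#buy-button", ".cta-primary"], ["#buy-button"]);
// → [".cta-primary"] — restore this class on the variant
```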
Still Stuck?
If none of the above resolves your issue, reach out to support@tailorhq.ai with the page URL, experiment name, what you expected, what happened, and the time window when the issue occurred. The more context you include, the faster we can help.
