HUMAN Security measured a 6,900% increase in AI-agent and agentic-browser requests since July 2025, and a 144.7% surge during Black Friday to Cyber Monday 2025 specifically targeting e-commerce sites. If your WooCommerce GA4 is showing rising sessions and flat purchases since October 2025, this is probably why. ChatGPT Atlas and Perplexity Comet are already walking through your funnel — adding to cart, filling forms, abandoning — and every one of those sessions is training your ad platforms on intent that doesn’t exist. The fix isn’t a blocker plugin; it’s server-side filtering at the event layer, before agent activity becomes a conversion signal.
The Numbers Are Already in Your GA4
This is not a forecast. HUMAN Security observed a 6,900% increase in requests from AI Agents and agentic browsers since July 2025 (HUMAN Security, 2026), and separately measured a 144.7% surge in agent traffic targeting e-commerce sites during the 5-day Black Friday to Cyber Monday window compared to the previous 5 days (HUMAN Security, 2025). SUSO Digital’s independent analysis confirmed total AI-agent traffic more than tripled between July and September 2025 alone (SUSO Digital, 2025). The volume is real and it is specifically going to stores that sell things.
The two dominant agentic browsers:
- ChatGPT Atlas — released October 21, 2025, initially macOS only (Washington Post / TIME, 2025). Built on Chromium. Network requests often carry CFNetwork and Darwin user-agent fragments visible at the server layer.
- Perplexity Comet — released July 2025, now available on macOS, Windows, and Android (Washington Post / TIME, 2025). Built on Chromium. Distinctive signature patterns in header ordering and extension-manifest gaps.
Three checks to run in your own data this week:
- GA4 browser and user-agent dimensions: filter for strings containing Atlas, Comet, PerplexityBot, or CFNetwork. The traffic is already there.
- Session behavior: look at add-to-cart sessions from Q4 2025 onward with zero scroll depth, no viewport interaction, and immediate cart drop. That’s agent behavior, not human.
- Referrer reports: traffic from perplexity.ai, chat.openai.com, or atlas.openai.com that navigates through multiple product pages in seconds.
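The first check can be scripted as a minimal user-agent screen. This is a sketch, not a complete detector: the pattern list covers only the strings named above (Atlas, Comet, PerplexityBot, CFNetwork, Darwin), and the sample UA string is an illustrative fragment, not a documented Atlas value.

```javascript
// Minimal user-agent screen for the agentic-browser markers named above.
// The pattern list is illustrative, not exhaustive.
const AGENT_UA_PATTERNS = [
  /Atlas/i,
  /Comet/i,
  /PerplexityBot/i,
  /CFNetwork/i,
  /Darwin/i,
];

function looksLikeAgentUA(userAgent) {
  return AGENT_UA_PATTERNS.some((p) => p.test(userAgent || ""));
}

// Example: a UA fragment in the style of Swift networking on macOS.
console.log(looksLikeAgentUA("ChatGPT CFNetwork/1474 Darwin/23.0.0")); // true
console.log(looksLikeAgentUA("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")); // false
```

The same patterns work as a GA4 dimension filter; running them in code simply makes the check repeatable against server logs.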
What This Breaks for WooCommerce
Agentic browsers don’t just inflate session counts. They corrupt three systems that most WooCommerce stores are actively optimising against right now.
1. Conversion Rate Metrics
Every agent session that doesn’t buy lands in the denominator of your site-wide conversion rate. A store that used to run at 2.5% CVR suddenly looks like 1.8%, and the team starts redesigning the cart page to fix a problem that doesn’t exist. The product and the funnel are fine; the traffic mix changed underneath them.
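The dilution is pure denominator arithmetic. A sketch with made-up volumes consistent with the 2.5% → 1.8% shift above:

```javascript
// Denominator dilution: agent sessions add to sessions but not to orders.
// Volumes are hypothetical, chosen to reproduce the 2.5% -> 1.8% example.
const humanSessions = 10000;
const orders = 250;          // 2.5% human CVR
const agentSessions = 3900;  // non-buying agent sessions mixed in

const trueCVR = orders / humanSessions;                       // 0.025
const reportedCVR = orders / (humanSessions + agentSessions); // ~0.018

console.log((trueCVR * 100).toFixed(1) + "%");     // "2.5%"
console.log((reportedCVR * 100).toFixed(1) + "%"); // "1.8%"
```

Nothing about the store changed; roughly a third more sessions with zero purchase intent is enough to make a healthy funnel look broken.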
2. Meta Advantage+ Training on Fake Intent
When an agent’s add-to-cart fires Meta Pixel or reaches Meta CAPI as a standard InitiateCheckout event, Meta Advantage+ treats it as real intent. For stores already hovering at Meta Advantage+’s 50-event-per-week optimisation threshold, a few dozen agent events per week can meaningfully shift what the algorithm considers a converting profile. The ad platform then optimises toward lookalikes of agent sessions — which is a fast way to spend real money on fake audiences.
3. Google Smart Bidding Optimising Toward Ghost Carts
The same mechanic hits Google. Smart Bidding is only as good as the conversion signal you feed it — and when that signal includes rapid agent-driven cart additions that never convert, the algorithm learns to bid harder on traffic that resembles agents. Stores running Performance Max see this first because the model has the widest signal surface area, but any conversion-based bidding strategy is vulnerable.
Documented agent patterns are specific and recognisable. HUMAN’s research on Perplexity Comet captured a carding pattern: rapid card additions, repeated payment attempts, and fallback to loyalty-point redemption — a traffic signature resembling early-stage credit-card fraud even when the underlying user intent was legitimate automation (HUMAN Security, 2026). That’s not a subtle signal. It’s a direct hit on fraud flags, payment-processor risk scores, and cart analytics simultaneously.
Why Your Bot Filter Doesn’t See Them
Standard bot protection was designed for an earlier era of automation: headless Chrome, PhantomJS, cURL requests, Python-script scrapers. All of those have tell-tale signatures at the browser level. Atlas and Comet don’t. Both are built on the Chromium engine, and with basic fingerprinting techniques, they appear almost identical to a generic Chrome browser (HUMAN Security, 2026).
That means a WordPress security plugin scanning for “is this a bot?” at the PHP layer will see a clean Chrome user agent, a valid cookie store, a credible viewport, and real JavaScript execution. It will pass every check — because the agent is, by design, driving an actual Chromium instance. The automation is above the browser, not below it.
The implication is architectural: client-side tools arrive too late. By the time a WordPress plugin or a GTM tag fires, the agent has already loaded the page, the event has already been registered in GA4, and Meta Pixel has already pinged. Filtering at that layer is cleaning up after the pollution already entered the data.
The Server-Side Detection Signatures That Actually Work
Agentic browsers leave forensic breadcrumbs at the network request layer — visible to a server, invisible to most client-side filters. Three signature families that HUMAN’s research surfaces as reliable for Q1 2026:
- CFNetwork + Darwin user-agent patterns on macOS — common in Atlas sessions because of how OpenAI’s Swift networking layer wraps outbound requests. Visible server-side in the User-Agent header; rarely surfaced in WordPress analytics.
- Chromium extension-manifest gaps — standard Chrome carries an identifiable manifest of installed extensions and browser features. Stripped Chromium builds used by agents are missing expected entries, detectable through specific fetch patterns.
- Text-to-speech voice-list absence — full Chrome exposes system TTS voices via the Web Speech API. Agent Chromium builds often omit this. Scripted detection at the server side catches the asymmetry.
None of these are individually definitive — an advanced operator can spoof any single fingerprint. Used together, they get accurate enough to route agent traffic into a separate bucket with high confidence. The goal is not to block the agent. It’s to keep agent activity out of your conversion-rate math and out of Meta’s and Google’s learning data.
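Combining the signals into one routing decision could look like the sketch below. The weights and threshold are illustrative assumptions, not tuned values; the UA test runs purely server-side, while the manifest-gap and TTS-voice results would have to come from client probes reported back to the server.

```javascript
// Combine three weak signals into one routing decision.
// Weights and threshold are illustrative assumptions, not tuned values.
function agentScore({ userAgent = "", manifestGap = false, ttsVoicesMissing = false }) {
  let score = 0;
  if (/CFNetwork\/[\d.]+ Darwin\//.test(userAgent)) score += 0.5; // Atlas-style macOS networking stack
  if (manifestGap) score += 0.3;       // stripped Chromium build: expected entries absent
  if (ttsVoicesMissing) score += 0.3;  // Web Speech API exposes no system voices
  return score;
}

// Route, don't block: above the threshold, the event goes to the agent bucket.
function routeBucket(signals, threshold = 0.6) {
  return agentScore(signals) >= threshold ? "agent" : "human";
}

console.log(routeBucket({ userAgent: "ChatGPT CFNetwork/1474.1 Darwin/23.0.0", ttsVoicesMissing: true })); // agent
console.log(routeBucket({ userAgent: "Mozilla/5.0 (Windows NT 10.0) Chrome/120.0" }));                     // human
```

Note that no single signal clears the threshold on its own, which matches the point above: any one fingerprint can be spoofed, so the decision rests on the combination.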
Context from the broader security community matters here too: OWASP’s 2026 Top 10 for Agentic Applications identifies goal hijacking, tool misuse, memory poisoning, and unintended cross-domain actions as primary failure modes (OWASP, 2026). Agentic traffic is a new attack surface category, not just an analytics nuisance — and one that plugin-layer defenses were not designed for.
The Architectural Fix
The clean implementation runs detection at the event-collection layer, before events ever become conversion signals. The sequence looks like this:
- A request hits the WooCommerce site — page view, add-to-cart, begin-checkout, whatever.
- The server-side event collector evaluates the request signatures against an agent-detection ruleset.
- If the signatures match agent patterns, the event is logged to a separate BigQuery table for visibility and routed away from Meta CAPI, Google Enhanced Conversions, GA4 Measurement Protocol, and other learning destinations.
- If the signatures look human, the event proceeds through the normal pipeline to all destinations.
The agent activity is still measured — you need to know the volume — but it is not feeding the ad platforms’ optimisation algorithms or the store’s conversion-rate metrics. Human events and agent events are separated at source.
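Reduced to a pure routing function, the sequence above might look like the sketch below. The destination names are placeholders for whatever real Meta CAPI, GA4 Measurement Protocol, and BigQuery clients a store actually wires in.

```javascript
// Dispatch one collected event: agent events go to an audit log only,
// human events fan out to the learning destinations.
// Destination names are hypothetical placeholders, not real client IDs.
const HUMAN_DESTINATIONS = ["meta_capi", "ga4_measurement_protocol", "google_enhanced_conversions"];
const AGENT_DESTINATIONS = ["bigquery_agent_audit_table"];

function dispatchEvent(event, isAgent) {
  const destinations = isAgent ? AGENT_DESTINATIONS : HUMAN_DESTINATIONS;
  return { event: event.name, destinations };
}

console.log(dispatchEvent({ name: "add_to_cart" }, true));  // audit table only
console.log(dispatchEvent({ name: "purchase" }, false));    // all learning destinations
```

The key design choice is that the agent branch still returns a destination: the event is recorded for volume visibility, it just never becomes an optimisation signal.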
Transmute Engine™ is a first-party Node.js server running on the store’s own subdomain (e.g., data.yourstore.com) that receives every WooCommerce event and enforces agent-filtering rules before any outbound call fires. Because the evaluation happens at the server layer — not in the browser, not in a WordPress plugin — the agent-detection signatures (CFNetwork patterns, manifest gaps, TTS-voice checks) are all accessible before the event becomes a Meta CAPI or GA4 call. That’s the architectural advantage a plugin or GTM container cannot cleanly replicate: by the time the browser-side code runs, the agent has already been counted.
Key Takeaways
- HUMAN Security documented a 6,900% surge in AI-agent browser traffic since July 2025, with a 144.7% Black Friday spike specifically on e-commerce sites. The volume is already in your GA4.
- ChatGPT Atlas and Perplexity Comet look like generic Chrome to standard bot filters because they are literally built on Chromium. WordPress plugin-level detection does not catch them.
- Three systems break: site-wide conversion rate, Meta Advantage+ training, and Google Smart Bidding — all trained on mixed human-plus-agent data that client-side filters arrive too late to clean.
- Server-side detection signatures exist (CFNetwork/Darwin UA, Chromium manifest gaps, TTS voice-list absence) but they require the event-collection layer to sit on your own server, not in the browser.
- Blocking is the wrong response. The right response is filtering: log agent events separately for visibility, keep human events flowing to Meta, Google, and GA4 unchanged.
Frequently Asked Questions
How do I know if agentic browsers are already hitting my store?
Filter your GA4 browser and user-agent dimensions for strings containing “Atlas”, “Comet”, or “PerplexityBot”, and check for CFNetwork or Darwin agent signatures on macOS sessions. Both browsers are Chromium-based, so they won’t appear as “Chrome bot” — look for referrer patterns showing rapid multi-page navigation with near-zero dwell time and no scroll depth.
Why is my add-to-cart rate rising while purchases stay flat?
This is the classic agent-activity signature. HUMAN Security documented Perplexity Comet producing rapid card additions, repeated payment attempts, and fallback to loyalty-point redemption — patterns that look like carding fraud even when the underlying user intent is legitimate automation. If your add-to-cart rate started climbing in Q4 2025 while purchase rate stayed flat, agent traffic is a likely contributor.
Should I just block agentic browsers?
Blocking is not recommended. Some of the traffic is doing legitimate research on behalf of a human customer who may buy later. The correct approach is to filter agent activity from your analytics and ad-platform conversion signals — not from the site itself — so your optimisation algorithms train on human intent while agent users still get served pages.
Can agent traffic really affect Meta Advantage+ performance?
Yes. If an agent’s add-to-cart reaches Meta CAPI as a standard event, Meta Advantage+ treats it as signal and trains on the underlying browser session. For stores already near Meta’s 50-event-per-week optimisation threshold, a few dozen agent events per week can materially shift what Advantage+ considers a converting profile.
Won’t my existing bot filter catch Atlas and Comet?
Most don’t. Both browsers are built on the Chromium engine and pass standard fingerprinting checks as generic Chrome. Detection requires deeper signatures — CFNetwork or Darwin user-agent combinations, extension-manifest gaps, text-to-speech voice-list absence — which are typically only feasible at the server-side request layer, not in WordPress plugins that see events after the browser has already reported them.
Run the three GA4 checks above this week before you set next quarter’s ad budget. If agent traffic is already in your funnel, no optimisation decision made on current dashboards is safe. Start at seresa.io/product.
