Is this true? “HUMANS NOW ONLY MAKE UP 38.5% OF INTERNET TRAFFIC. THE OTHER 61.5% IS NON-HUMAN, LIKE BOTS, HACKING TOOLS, ETC.”
Executive summary
The headline that “humans now only make up 38.5% of internet traffic” is not a settled, universally accepted fact. Estimates of non‑human traffic vary widely by researcher and year, and recent industry reports cluster around a roughly even split between human and non‑human traffic rather than the 61.5% non‑human share implied by the claim [1] [2] [3]. Differences in definitions, measurement methods and the interests of reporting organizations explain most of the divergence in published numbers, and no single authoritative global meter exists that could confirm the 38.5% figure as definitive [4].
1. The headline numbers: competing reports and what they actually say
Several widely cited reports produce different snapshots. Imperva’s 2024 “Bad Bot” research reports that almost 50% of internet traffic is non‑human, with bad bots alone accounting for nearly one‑third of traffic in some tallies, which frames a roughly 50/50 split rather than a 38.5% human share [1] [2] [3]. Older or alternative trackers have produced higher non‑human estimates: a 2013 Incapsula figure still often repeated in commentary put non‑human traffic at 61.5% [5], while other vendors and summaries cite figures such as 42% for bot traffic [6]. Media outlets have amplified different takes; Vice, for example, reported a 38% human (≈62% non‑human) figure in 2024, echoing one set of measurements without establishing a consensus [7].
2. Why the numbers diverge: definitions, scope and measurement choices
“Non‑human traffic” is an umbrella term that can cover benign automated crawlers (search engines, site monitors, CDNs), “good bots” that index sites, and malicious or “bad” bots used for scraping, fraud and attacks, and reports differ in whether and how they separate those categories [6] [8]. Measurement approaches vary too: some studies count requests, others measure sessions or bytes transferred, and sample sets differ (global versus particular industries, traffic to a security vendor’s clients, or web versus API traffic), so the resulting ratios are inherently different [4]. As Christopher Butler and others note, there is no central registry of bots; detection relies on heuristics and vendor toolsets that yield differing results [4].
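To make the measurement point concrete, the short Python sketch below computes a non‑human traffic share for the same hypothetical sample three ways: by sessions, by requests and by bytes transferred. The records, field layout and numbers are invented purely for illustration and are not drawn from any of the cited reports.

```python
# Illustrative sketch only: synthetic traffic records, not data from any cited report.
# Each record: (client_type, sessions, requests, bytes_transferred)
sample = [
    ("human",    1_000, 12_000, 9_000_000_000),  # humans: fewer requests, heavier pages
    ("good_bot",   200, 20_000, 1_500_000_000),  # crawlers: many lightweight requests
    ("bad_bot",    150, 18_000,   800_000_000),  # scrapers: many tiny requests
]

def non_human_share(column: int) -> float:
    """Return the non-human percentage of traffic under one metric (tuple column)."""
    total = sum(row[column] for row in sample)
    non_human = sum(row[column] for row in sample if row[0] != "human")
    return 100.0 * non_human / total

for name, column in [("sessions", 1), ("requests", 2), ("bytes", 3)]:
    print(f"non-human share by {name}: {non_human_share(column):.1f}%")

# Output for this synthetic sample: ~25.9% by sessions, 76.0% by requests,
# ~20.4% by bytes -- metric choice alone moves the headline percentage.
```

Real vendor reports face the same choice on far messier data, which is one reason their headline percentages differ even before detection accuracy comes into play.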
3. Who benefits from particular framings — the incentives behind reports
The organizations issuing these reports include security vendors whose business model is bot detection and mitigation, and several journalists and analysts warn that cybersecurity firms may have an incentive to stress the problem to sell services, a factor explicitly flagged in coverage of the Imperva findings [3]. Conversely, other analysts emphasize that “good” automated traffic (search engine crawlers, legitimate monitoring) is part of the total, which softens the alarmist interpretation that most traffic is malicious [6] [8]. Readers should note that press summaries and secondary outlets sometimes conflate “bot” with “malicious bot,” amplifying more dramatic interpretations [2] [3].
4. What can be robustly concluded from available reporting
Multiple recent vendor reports converge on the headline that automated traffic is large and growing, often approaching or roughly equaling human traffic, and that “bad” automated traffic is an economically significant subset with measurable harms such as ad fraud and scraping [1] [2] [3]. However, the specific claim that humans make up exactly 38.5% of internet traffic, with the remaining 61.5% being non‑human, cannot be established as a universal figure for the whole internet from the cited sources, because their methodologies and timeframes differ [1] [5] [4].
5. Verdict and practical takeaway
The assertion that humans now account for only 38.5% of internet traffic is an over‑precise restatement of a contested, method‑dependent set of industry estimates. The best contemporary industry evidence cited here points to non‑human traffic being very large (roughly half of total traffic in recent Imperva reporting) but does not settle on 61.5% as a definitive figure for the whole internet [1] [2] [3]. For policy makers, advertisers and platform operators, the actionable conclusion is clear: automated traffic, both benign and malicious, is a dominant force that must be measured and managed, but its scale should not be reduced to a single fixed percentage without scrutiny of sources and methods [6] [4].