What are the top organizations rating media bias like AllSides or Ad Fontes?
Executive summary
Three widely used organizations that rate media bias are AllSides, Ad Fontes Media (creator of the Media Bias Chart), and Media Bias/Fact Check (MBFC); AllSides lists over 2,400 rated sources and emphasizes crowd and expert panels [1] [2], Ad Fontes publishes a flagship Media Bias Chart updated twice a year and an interactive app [3] [4] [5], and MBFC publishes numerical bias and factual-reporting ratings and extensive writeups [6] [7]. Academic and library guides commonly point readers to these three tools and note both their usefulness and limits when interpreting “bias charts” [8] [9] [10].
1. AllSides — “crowd plus expert” bias ratings with scale and scope
AllSides presents a Media Bias Chart and a database of more than 2,400 rated outlets; it says its process balances input from thousands of everyday Americans across the political spectrum alongside a politically balanced expert panel, and it focuses explicitly on political bias rather than accuracy or editorial standards [2] [1]. AllSides also documents methods such as blind bias surveys and small-group editorial review in public announcements about chart updates [11]. Libraries and research guides cite AllSides as a common resource for identifying political leanings of news sources while reminding users that AllSides does not rate overall factual reliability [8] [12].
2. Ad Fontes Media — the Media Bias Chart and a reproducible methodology
Ad Fontes Media is known for the Media Bias Chart, including a public gallery and an interactive web app; the chart maps outlets on axes of bias (left–right) and reliability/news value, and Ad Fontes says its methodology uses a politically balanced team of analysts and a reproducible review process [3] [4] [5]. Educational materials and libraries highlight Ad Fontes’ claim of a “rigorous, reproducible methodology” and note the chart’s dual dimensions—bias and reliability—which distinguish it from tools that only place outlets on a left–right spectrum [13] [14].
3. Media Bias/Fact Check (MBFC) — numerical bias & factual-reporting scores
Media Bias/Fact Check publishes ratings that score outlets on political bias and on “factual reporting” (for example, MBFC assigns bias labels like “far right” and factual-rating categories such as “mixed”), and it issues detailed writeups about specific organizations and topics [6] [7]. Libraries and guides reference MBFC as a searchable database that complements chart-style tools, noting MBFC’s granular classifications and case-by-case explanations [9].
4. Aggregators and downstream users — how other services combine ratings
Some news-aggregation and context services average or combine these organizations' ratings. Ground News, for instance, states that its bias scores come from an average of independent monitors including AllSides, Ad Fontes Media, and Media Bias/Fact Check, using the combined view to produce its own "bias" metric [15]. This illustrates a broader point: users can triangulate across systems rather than rely on any one chart.
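To make the triangulation idea concrete, here is a minimal sketch of how scores from different raters might be put on a common scale and averaged. The numeric ranges and sample scores below are illustrative assumptions for the exercise, not the actual scales or published ratings of AllSides, Ad Fontes, or MBFC, and aggregators like Ground News do not disclose their exact formula:

```python
# Hypothetical sketch of triangulating bias ratings across systems.
# All scales and sample scores are assumptions for illustration only.

def normalize(score, lo, hi):
    """Map a rating from its native range onto -1.0 (left) .. +1.0 (right)."""
    return 2 * (score - lo) / (hi - lo) - 1

# Each rater uses its own numeric range (values assumed, not real ratings):
# (raw_score, range_low, range_high)
ratings = {
    "RaterA": (-2.0, -6, 6),     # e.g. a -6..+6 scale
    "RaterB": (-9.0, -42, 42),   # e.g. a -42..+42 scale
    "RaterC": (-3.0, -10, 10),   # e.g. a -10..+10 scale
}

# Normalize each score, then take the simple mean as a combined metric.
normalized = {name: normalize(s, lo, hi) for name, (s, lo, hi) in ratings.items()}
combined = sum(normalized.values()) / len(normalized)

print({k: round(v, 3) for k, v in normalized.items()})
print(round(combined, 3))  # a mild lean-left consensus in this toy example
```

Even this toy version shows why methodology matters: the choice of normalization and weighting changes the combined score, which is one reason different aggregators can disagree while drawing on the same underlying raters.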
5. What researchers and journalism-watchers warn about — strengths and blind spots
Poynter and academic guides urge caution: bias charts are useful entry points but not definitive judgments on quality; political bias is only one axis of media evaluation and many charts do not measure editorial standards or accuracy comprehensively [10] [9]. Poynter notes that AllSides focuses on political bias while Ad Fontes also attempts to rate reliability, and both organizations publish their methods for scrutiny [10] [1] [3].
6. Practical takeaway for news consumers — triangulate, know limits
Use AllSides for broad political placement across many outlets, Ad Fontes when you want a two-dimensional view that includes a reliability axis, and MBFC for granular factual-reporting assessments and written analyses; cross-checking across them (or using services that aggregate them) helps reveal where the tools agree or diverge [2] [5] [6] [15]. Libraries and educators recommend treating these charts as starting points for media literacy, not as final verdicts on trustworthiness [9] [8] [10].
Limitations and open questions: available sources do not mention independent academic validation comparing all three systems head-to-head across a large sample; users should therefore be transparent about which metric they cite and mindful that methodology differences produce different outcomes [10] [13].