Is this a legit fact-check site, or does it have a political bias?

Checked on February 8, 2026
Disclaimer: Factually can make mistakes. Please verify important information or breaking news.

Executive summary

Media Bias/Fact Check (MBFC) is a widely used, independently run media-rating site that publishes bias and factuality ratings for thousands of outlets and is cited by libraries, educators, and other organizations. Its methodology, however, mixes objective and subjective elements and has drawn criticism for lacking rigorous, scientific validation, so describing it as an unqualified "neutral fact-check site" overstates the case [1] [2]. Researchers and tools use MBFC data, yet reputable critics and media-literacy organizations warn that such ratings can oversimplify complex editorial behavior and may reflect the choices of their evaluators [2] [3].

1. MBFC’s reach and role in the information ecosystem

MBFC publishes bias and factual-reporting assessments for thousands of outlets and is presented as a go-to resource on many university and library research guides, as well as in tools such as browser extensions that display MBFC ratings while browsing [1] [4] [5]. Academic and literacy projects have used MBFC's ratings as input to larger analyses; for example, researchers have paired MBFC data with other datasets to build metrics tracking the prevalence of questionable sources online [2] [3]. Those citations signal practical utility and broad adoption, not an institutional endorsement of scientific rigor [1] [3].

2. What MBFC claims about neutrality and method

MBFC describes itself as "the most comprehensive media bias resource on the internet," saying it evaluates political bias and factual reporting across thousands of entries using a blend of measures and reviewers, and it discloses that some of its work is assisted by volunteers and funded by donations and advertising [1] [6]. The site asserts that it relies on signatories of the International Fact-Checking Network when selecting the fact-checkers it references, and that it publishes fact checks from around the world [1]. Those claims point to an effort at transparency and some external standards, but they do not fully settle the question of methodological rigor [1].

3. Independent assessments and known criticisms

Media-bias rating projects, including MBFC, have been both used and critiqued by journalism scholars and institutions. A Poynter Institute analysis warned that quick-rating systems can "misfire" and oversimplify a multidimensional problem, and that MBFC's methods are "in no way scientific" despite frequent citation [2]. At the same time, studies have found that MBFC's ratings show high agreement with other credibility datasets, such as NewsGuard and independent fact-check datasets, suggesting convergent validity even though its methods mix objective and subjective elements [2].

4. Where bias risk lives: selection, language and story choice

Scholars and watchdogs stress that an outlet can score well on "factual reporting" while still showing editorial bias through story selection, framing, or emotional language, a distinction MBFC itself documents in its methodology [2]. Other rubric-driven efforts (AllSides, Ad Fontes) show that fact-checkers and media-rating sites can shift left or right on aggregate metrics, with specific ratings varying by sample and method; for instance, AllSides maps fact-check sites and notes differences in focus and story choice that can produce perceived leanings [7] [8] [9].

5. Practical advice for readers and hidden agendas to watch

Libraries and media-literacy guides recommend MBFC as a useful starting point for lateral reading but caution that no single rating should be treated as definitive; users are urged to compare MBFC with other tools such as AllSides and Ad Fontes, and with direct source evaluation [10] [5] [11]. MBFC's funding through donations and advertising, its use of volunteer researchers, and its self-described mix of objective metrics and subjective analysis are legitimate areas to scrutinize for potential bias or inconsistency. They are not evidence of a deliberate partisan agenda, but they are reasons to treat its results as one input among several [1] [2] [6].

6. Bottom line: useful but not a single-source arbiter

MBFC is a legitimate, widely used media-rating resource with transparent labels, broad coverage, and demonstrable utility in research and education. It is not, however, an unassailable, purely scientific arbiter of bias: its mixed methodology and the warnings of credible critics mean its judgments should be checked against other assessments and original reporting before making definitive claims about an outlet's political slant [1] [2] [3].

Want to dive deeper?
How do MBFC’s bias and factuality ratings compare to AllSides and Ad Fontes on specific outlets?
What methodological critiques have independent researchers published about Media Bias/Fact Check’s rating system?
How should librarians and educators teach students to use media-bias sites like MBFC alongside primary-source evaluation?