MBFC is left leaning and cannot be trusted
Executive summary
Media Bias/Fact Check (MBFC) is an independent rating site, founded in 2015, that evaluates political bias and factual reporting for thousands of outlets; its ratings have been used in academic research and have compared favorably with other industry tools. At the same time, its methods combine objective measures with subjective judgment, the site accepts advertising funding, and it faces routine methodological criticisms, all of which counsel using it as one tool among many rather than as an infallible arbiter [1] [2] [3].
1. What MBFC is and how it presents itself
MBFC describes itself as an independent website created to educate the public about media bias and deceptive practices; it is run by a primary editor supported by volunteers and contractors and offers bias and factual-reporting ratings for thousands of media sources [1] [4]. Its publicly posted methodology states that it uses a formula weighting factual reporting, political bias, and traffic/longevity, and that it incorporates objective measures such as traffic data from SimilarWeb while acknowledging that subjective analysis and editorial judgment also shape its ratings [5] [2].
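To make the shape of that weighting concrete, the following Python sketch shows how three factors of that kind can be combined into a single score. The factor names mirror the inputs MBFC says it weights, but the scales, weights, and arithmetic below are hypothetical assumptions chosen only for illustration, not MBFC’s published formula.

```python
# Illustrative sketch of a weighted source-credibility score.
# The factor names follow MBFC's stated inputs (factual reporting,
# political bias, traffic/longevity); the scales and weights here are
# invented for illustration and are NOT MBFC's actual formula.

from dataclasses import dataclass


@dataclass
class SourceMetrics:
    factual_reporting: float   # hypothetical 0-10 scale (10 = most factual)
    bias_extremity: float      # hypothetical 0-10 scale (10 = most extreme bias)
    traffic_longevity: float   # hypothetical 0-10 scale from traffic data and site age


def credibility_score(m: SourceMetrics,
                      w_factual: float = 0.6,
                      w_bias: float = 0.25,
                      w_traffic: float = 0.15) -> float:
    """Combine the three factors into a single 0-10 score.

    Higher factual reporting raises the score, more extreme bias lowers it,
    and traffic/longevity contributes a smaller stabilising weight.
    The weights are assumptions chosen only to show the mechanics.
    """
    return (w_factual * m.factual_reporting
            + w_bias * (10 - m.bias_extremity)
            + w_traffic * m.traffic_longevity)


if __name__ == "__main__":
    example = SourceMetrics(factual_reporting=8.0,
                            bias_extremity=4.0,
                            traffic_longevity=7.0)
    print(f"Illustrative score: {credibility_score(example):.2f}")  # 7.35
```

Any real formula of this kind inherits the subjectivity of its inputs: the weights and the bias and factual-reporting scores are assigned by people, which is exactly why MBFC's own methodology page acknowledges editorial judgment.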
2. Evidence that MBFC is treated as a legitimate research tool
Scholars and institutions have used MBFC ratings in research, and independent studies have found high agreement between MBFC’s ratings and other sources treated as ground truth, such as NewsGuard and an independent fact-check dataset from 2017, suggesting that its outputs correlate strongly with established measures of credibility and misinformation risk [2] [3]. MBFC has also been included in curated databases and research resources (for example, RAND’s list of media-bias tools), which further indicates its practical acceptance in research and media-literacy contexts [6].
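As a concrete illustration of what “high agreement” between two rating sources means operationally, the sketch below computes simple percent agreement and Cohen’s kappa over a toy set of paired labels. The labels are invented and the cited studies may use different statistics; this only shows how such agreement is commonly quantified.

```python
# Toy illustration of measuring agreement between two credibility raters
# (e.g., MBFC-style labels vs. another rating service). The labels below
# are invented; this is not data from any published study.

from collections import Counter

mbfc_labels  = ["high", "high", "mixed", "low", "high", "mixed", "low", "high"]
other_labels = ["high", "high", "mixed", "low", "mixed", "mixed", "low", "high"]


def percent_agreement(a, b):
    """Fraction of items on which the two raters assign the same label."""
    return sum(x == y for x, y in zip(a, b)) / len(a)


def cohens_kappa(a, b):
    """Chance-corrected agreement between two label sequences."""
    n = len(a)
    p_observed = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    labels = set(a) | set(b)
    p_expected = sum((counts_a[l] / n) * (counts_b[l] / n) for l in labels)
    return (p_observed - p_expected) / (1 - p_expected)


print(f"Percent agreement: {percent_agreement(mbfc_labels, other_labels):.2f}")  # 0.88
print(f"Cohen's kappa:     {cohens_kappa(mbfc_labels, other_labels):.2f}")       # 0.81
```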
3. Common critiques that fuel “cannot be trusted” claims
Criticisms fall into predictable categories: methodological simplification, the potential for subjective bias, and the funding model. Media-literacy educators and critics have warned that single-score services can oversimplify a multidimensional problem, and that pop-up ads and the site’s reliance on advertising and donations raise transparency questions even though MBFC claims financial independence [3] [4]. MBFC itself acknowledges that it relies on human evaluators and subjective judgment, which means its classifications are not purely algorithmic or immune to error [5] [2].
4. The accusation that MBFC is left‑leaning: what the record shows
Assertions that MBFC is institutionally left-leaning are not clearly borne out by the available reporting: MBFC publishes a breakdown showing the average bias rating across its database and explains why low-credibility right-leaning sites appear there in higher numbers, attributing that distribution to reader submission patterns and the expansion of conservative media ecosystems rather than to an intentional leftward tilt in its methodology [7]. Multiple external sources and academic guides present MBFC as a broadly nonpartisan tool for lateral reading, even while warning users about the limits of any single rating system [3] [8].
5. How to use MBFC responsibly and where it falls short
MBFC is best treated as a useful, widely cited index that flags bias and credibility issues, not as an unquestionable authority. Its methodology is documented and evolves, it combines traffic metrics with human judgment, and it is transparent about editorial processes and funding sources, all of which enables scrutiny but does not eliminate subjectivity or error [5] [1]. Media-education guides recommend consulting MBFC alongside other tools and lateral-reading techniques rather than relying on its labels alone, which is precisely the pragmatic answer to whether MBFC “can be trusted” [3] [9].