
Is Media Bias/Fact Check factually accurate, or is it slanted in one direction?

Checked on November 17, 2025
Disclaimer: Factually can make mistakes. Please verify important info or breaking news.

Executive summary

Media Bias/Fact Check (MBFC) publishes daily "vetted fact checks" and maintains a database that labels the political slant of media and fact‑checking outlets, using color codes such as Red (Right‑Leaning), Green (Least Biased) and Blue (Left‑Leaning) [1]. MBFC says it "fact‑checks the fact‑checkers," selects only IFCN signatories or sources it has verified as credible, and sometimes offers its own bias rating when it disagrees with a fact‑checker [2] [1].

1. What MBFC says it does — a quick inventory

MBFC presents itself as a curator of fact checks, publishing daily compilations of items it calls "MBFC’s Daily Vetted Fact Checks," and it repeatedly states that it selects fact‑checkers that are either signatories of the International Fact‑Checking Network (IFCN) or have been "verified as credible by MBFC" [1] [3] [4]. The site also declares it reviews each fact check for accuracy and "fact‑checks the fact‑checkers," offering its own bias ratings for outlets when it disagrees with the original fact checker [1] [4].

2. How MBFC labels bias — simple codes, big implications

MBFC uses an explicit color/label system — Red = Right‑Leaning, Green = Least Biased, Blue = Left‑Leaning, Black = Unrated — and applies those labels to both news outlets and fact‑checking sources [1] [3]. That visual shorthand makes it easy for readers to sort outlets quickly, but it also compresses complex editorial practices and methodologies into single tags, a choice that can amplify perceptions of slant even when MBFC also documents factual reliability [1] [2].

3. Claims about independence and vetting — what the site documents

MBFC claims a two‑part standard: preference for IFCN signatories and a separate MBFC verification track for others, plus internal review of each fact check it lists [1] [4]. That description frames MBFC as both gatekeeper and reviewer — a role that provides value (aggregation and cross‑checking) but also concentrates power to decide which fact checks and which biases are highlighted [1] [2].

4. What MBFC itself publishes — editorial products and examples

MBFC’s output includes daily roundup posts titled "MBFC’s Daily Vetted Fact Checks" by date; the snippets show numerous dated entries (for example, 11/13–11/17/2025) and topical fact‑check headlines such as its "Fact vs. Fiction" pieces [3] [5]. MBFC also maintains pages that catalog "Fact Checking Sources" and publishes individual bias & credibility profiles [2] [6].

5. External context in the provided reporting — other fact‑checkers still in play

Other fact‑checking organizations like Reuters and FactCheck.org continue producing independent fact checks; Reuters runs its own fact‑check page and FactCheck.org posts answers on topical claims such as stimulus checks, demonstrating that MBFC is one of several actors in an ecosystem of verification [7] [8]. MBFC positions itself as an aggregator and meta‑reviewer rather than the sole authority [2].

6. Points in favor of MBFC’s approach

MBFC’s transparency about its color codes and its stated preference for IFCN signatories creates a visible policy framework for readers to judge: it openly states selection criteria and that it sometimes rates fact‑checkers’ bias and credibility [1] [2]. That explicitness helps readers interpret why a particular fact check appears on its pages and what MBFC believes about the source’s slant.

7. Limitations, tradeoffs and where caution is warranted

The available sources show that MBFC compresses nuanced editorial behavior into single labels while placing itself in the reviewer role — a setup that risks substituting MBFC's own layer of judgment for others' and could embed MBFC's editorial perspective in its ratings [1] [4]. The provided reporting does not include MBFC's full methodological rubric or any third‑party studies validating its ratings, so judging MBFC's neutrality ultimately depends on readers weighing its published explanations against other independent audits [2].

8. How readers should use MBFC alongside other tools

Treat MBFC as a curated index and a starting point: use its labels to flag potential slant, but cross‑check original fact checks with the primary fact‑checking outlets (e.g., Reuters, FactCheck.org) and read MBFC’s source pages to understand why an outlet received its rating [7] [8] [2]. That layered approach reduces the risk of relying solely on MBFC’s color codes to determine what is "factually accurate."

9. Bottom line

MBFC provides a clear, summarized taxonomy of media and fact‑checking bias and compiles daily vetted lists of fact checks, which is useful for rapid orientation. However, its model centralizes evaluative authority and reduces complex editorial behavior to color labels, so readers should corroborate MBFC's assessments with the original fact checks and with alternative verification resources documented both by MBFC itself and by independent fact‑checkers [1] [2] [7].
