My previous post laid out my objections to the so-called “Obamameter” that the website Politifact uses. Now I want to discuss their signature item: the Truth-O-Meter™.
The challenge with a lot of fact checking is that statements are seldom flatly true or flatly false on their face. Truth is nuanced. There are assumptions, models, caveats, and context. Even if we both agree that a statement is lacking in context, we may reasonably disagree about whether that context is critical to the core meaning of what was said, and about how much context is critical.
So when President Obama spoke on the matter of private sector job growth in the 2012 State of the Union, there were a few caveats on the statement. The statement was absolutely true on its face, but the following questions come up:
- Who is really responsible for how much of it?
- Was President Obama taking credit for it and to what degree, given that it was a State of the Union address?
- Private sector employment is still down from where it started, so the statement should not be construed to mean that we have recovered.
All fair enough to point out, but the challenge comes when the question gets asked: “How do we rate this on a linear scale?” Politifact rated it Half True, despite the fact that by their own analysis (and that of FactCheck.org) the statement itself was absolutely true. So in terms of what he said: True. Whether he was taking undue credit for it, and to what degree that matters, is a difficult question, and not one that fits neatly on a scale running from “Pants on Fire” to “True.” It’s also one where reasonable people can disagree to some extent, or at least where there is room for discussion.
The same was true when they evaluated the Democrats’ claim that the Republican plan would “end Medicare.” The answer is… what do you consider the core of Medicare? How much qualification is required before that statement becomes True, or at least debatable?
Fact checking isn’t about making an absolute judgment of truth or falsity, because such judgments can obscure the actual debate rather than illuminate it. Fact checks should help voters make up their minds, not render an absolute verdict.
But Politifact doesn’t see it that way. It sees the Truth-O-Meter™ as absolutely core to its business, and insists on rating claims on that scale no matter how poorly the nuances of a claim lend themselves to such an evaluation. This means that many of the objections to Politifact have nothing to do with its analysis but everything to do with its final rating (e.g., Rachel Maddow’s recent criticism doesn’t focus on their analysis; it uses their own analysis to critique their rating).
Cleveland.com published an editorial on its use of Politifact that stated:
I’d guess that more than 90 percent of the complaints of bias I’ve gotten about PolitiFact Ohio have been not about the reporting, but about the rating on the Truth-O-Meter.
The problem with the Truth-O-Meter is, fundamentally, that it is a substanceless gimmick. Marking a claim as “Half True” versus “Mostly False” versus “Mostly True” adds nothing to the dialogue, but it does provide ammunition for pundits and for those looking to make a case by distorting the truth. It serves to obscure the actual nuance of the debate, rather than helping people become informed so that they can make informed decisions.
In the essay “Politifact is Bad For You,” the author states:
Politifact is dangerous. Stop reading it. Stop reading the “four Pinocchios” guy too. Stop using some huckster company’s stupid little phrases or codes or number systems when it’s convenient, and read the actual arguments instead.
The argument is not that there shouldn’t be analysis and fact checking, but that the gimmick itself is interfering with honest dialogue, saying explicitly that “since it calls itself Politifact and assigns ratings that you can just glance over, it undeservedly becomes an irresistible cudgel to use against your political opponents.” The problem is not the fact checking; it’s the gimmick.
Demonstrating a remarkable lack of self-reflection, Politifact’s Bill Adair characterized this as:
It’s dangerous to put independently researched information in the hands of the citizenry?
Except that it’s clear from context that the problem isn’t the fact check; it’s the gimmick. Characterizing the essay this way would have earned, at best, a “Half True” on their own ratings tool. Yet it’s clear that they, for whatever reason, treat any attack on the Truth-O-Meter™ as an attack on their fact checking and on independent fact checking in general. They are, in short, reading their own press releases.
So while I firmly support the idea that fact checking shouldn’t be a separate column but an integrated part of the media proper (something The New York Times was recently contemplating), I also value what independent fact checkers can do. But that value lies in their analysis and their legwork, not their talking points, and not their gimmicks.