The final problem is selection bias: choosing statements from conservatives that are more likely to be false and ones from liberals that are more likely to be true. PolitiFact spent a lot of time fact-checking Michele Bachmann, almost always finding her wrong, but they did not evaluate everything she said. One example was her widely ridiculed statement that our fore-bearers fought against slavery, giving John Quincy Adams as an example. Pundits insisted that by "fore-bearers" she meant the Founding Fathers and that she had confused J. Q. Adams with his father. In fact, "fore-bearers" simply means people who came before us, and J. Q. Adams was a lifelong opponent of slavery (among other things, he represented the captives of the Amistad). Given how often Bachmann's statement was quoted, it deserved to be rated.
I was going to give a recent example of this in which the statement being evaluated was that warming has stopped for the last 15 years, but I can't find it on their site. As I remember it, they talked to experts who said that the warming might be going somewhere other than the atmosphere and insisted that 15 years is not significant. They went on to give the claim a Pants On Fire rating. This is strange, given that the statement is true.
The second problem is when they already know the answer. They don't even try to search out competing experts; they ask experts who will support their conclusion. This is known as confirmation bias. It happens any time they evaluate statements on global warming or the Social Security Trust Fund.
Example: they examined the claim that ObamaCare does not mandate a background check for "navigators." They admit that this is true: there is nothing in the law or the related regulations to keep someone convicted of identity theft from getting a job as a navigator. So how do they rate the claim? They give it a Half-True because individual states can mandate background checks.
The first is the rating system they use. Politicians seldom make statements that are totally true or false. They cherry-pick facts or rely on one study when conflicting ones exist. Typically, PolitiFact does a fairly good job of sorting through this. Then the column is given to a panel of editors to rate. This is where things get subjective and biases creep in. It is not uncommon for a liberal who relies on one of many studies to get a rating of "Mostly True" while a conservative gets "Mostly False." Often the editors seize on some minor point to justify their rating, leaving people who read the entire column scratching their heads.
Despite claiming to be neutral, PolitiFact does have left-leaning biases. These break down into three categories.
Last year the fact-checking site PolitiFact drew conservative criticism after declaring that the "Lie of the Year" was "[Mitt Romney] Says Barack Obama 'sold Chrysler to Italians who are going to build Jeeps in China' at the cost of American jobs." As it turned out, Chrysler is building a plant in China that is due to open this year. Even granting that the plant will be making Jeeps for the Chinese, Romney's statement was true. PolitiFact never went back and reevaluated their assessment.

This year the Lie of the Year was President Obama's statement, "If you like your insurance you can keep it. Period." This hasn't made conservatives any happier, since PolitiFact evaluated this statement several times over the last five years and never found it to be false. In their summary they dropped the original evaluation from 2008, which was rated 100% true.