Especially for papers that rely on empirical work with painstakingly assembled datasets, the only way for peer reviewers to do the kind of thorough vetting that many commentators seem to imagine is implied by the words "peer review" would be to . . . well, go back and re-do the whole thing. Obviously, this is not what happens.
(Sorry, that just slipped out.)
This is not to say that the peer review system is worthless. But it's limited. Peer review doesn't prove that a paper is right; it doesn't even prove that the paper is any good (and it may serve as a gatekeeper that shuts out good, correct papers that don't sit well with the field's current establishment for one reason or another). All it proves is that the paper has passed the most basic hurdles required to get published--that it be potentially interesting, and not obviously false. This may commend it to our attention--but not to our instant belief.
Fact-checking and the criticism of one's peers are not among McArdle's favorite things, and so a strawman is duly erected, knocked down, set on fire, and its ashes sown into the dirt.
Welcome back, McArdle!