The claim that the founders meant America to be a Christian nation isn't just bad history; it's a declaration of war by the religious right.