We are for the dark
Decades of research have shown that humans are so-called cognitive misers. When we approach a problem, our natural default is to tap the least tiring cognitive process. Typically this is what psychologists call type 1 thinking, famously described by Nobel Prize–winning psychologist Daniel Kahneman as an automatic, intuitive mode of thought that demands little effort.
This contrasts with type 2 thinking, which is slower, more deliberate, and draws on more cues from the environment. Defaulting to type 1 makes evolutionary sense: if we can solve a problem with less effort, we can bank the spare mental capacity for other tasks. A problem arises, however, when the simple cues available are insufficient or vastly inferior to the more complex cues at hand.
Exactly this kind of conflict can occur when someone chooses to believe a personal opinion over scientific evidence or statistics.