Or, how not to fool ourselves into thinking we have an explanation before we do. Sometimes "I don't know" is the only reasonable answer. If your explanation (e.g., "phlogiston") would have served equally well to explain any other outcome, it's not an explanation. It hasn't added to your knowledge.
The site describes fallacies in assessing probabilities and risks, too. Apparently there is a strong human tendency to overestimate a risk stated in whole numbers rather than percentages, so that a disease sounds more dangerous if it kills 1,000 out of every 100,000 affected than if it kills 2%, even though the second disease is twice as deadly: those thousand bodies weigh on the hindbrain. We also have only a limited inborn talent for distinguishing between the risk and reward of a chancy proposition. The more convinced we are of the benefits of a course of action, the lower our assessment of its risk, even when the two have nothing at all to do with each other. These are new skills in the evolutionary sense, for which we haven't yet developed much in the way of gut-level shortcuts.
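The arithmetic behind that framing effect is worth making explicit. A minimal sketch (the specific numbers are the ones from the passage, not from any cited study):

```python
# Frequency framing vs. percentage framing of the same kind of risk.
# "Kills 1,000 out of every 100,000" is a mortality rate of 1% --
# objectively *half* as deadly as "kills 2%" -- yet the frequency
# version tends to be judged as the scarier disease.

frequency_framed = 1_000 / 100_000  # 0.01, i.e. a 1% mortality rate
percentage_framed = 0.02            # a 2% mortality rate

print(f"Frequency framing:  {frequency_framed:.0%} mortality")
print(f"Percentage framing: {percentage_framed:.0%} mortality")

# The frequency-framed disease is in fact the less dangerous one.
print("Frequency-framed disease deadlier?",
      frequency_framed > percentage_framed)  # False
```

The bias, in other words, runs directly against the numbers: the format that conjures concrete bodies beats the format that states a larger probability.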