The original

Brandolini’s Law, or the “bullshit asymmetry” principle, is a neat summation of something well known but hard to express simply:

the amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.

Alberto Brandolini, Twitter, 11 Jan 2013

I’m going to slightly rephrase this as:

the amount of energy needed to refute bullshit is much greater than needed to produce it.

Not as neat, and it’s not going to supersede Brandolini’s formulation, but it’s a bit clearer for those without a science background.

Brandolini’s Law is implicit in much of the material I’ve written on disinformation and the mess of information, but it summarises the problem, and sets out areas for future research, far better than anything I’ve written.

It is much easier to invent ‘facts’ that appeal to people’s biases, fears and already accepted truths than it is to make a reasonably accurate statement about reality. An explanation of why someone is wrong is often lengthy, and sometimes impossible.

As an example of impossibility, suppose someone asserts that the President of Agleroa engages in the slave trade of children, and uses his power to hide this.

No one can disprove this. Any disproof can simply be cast as another example of his power in action, or as “fake news”. How can I show an absence of children being traded? Producing a disproof requires vast amounts of energy. If, for example, the bullshitter had claimed that the President traded children on a particular date, and I could find no evidence for that, this does not rule out all the other dates on which such trades could have occurred, and it might be argued that I found nothing because I am operating in bad faith, or because the data is hidden beyond my capacity to find it. Even if I succeed in convincing one person that the President is not trading children, then, if there is a group of people devoted to slandering the President of Agleroa who find it profitable to spread the accusation, it will keep surfacing. People may even disbelieve me if I try to show that Agleroa is not a real place.

In a similar case, a real President was repeatedly said to be fighting organised pedophilia. There was no evidence for this, and it was similarly hard to disprove, because we were told he was working in secret. He apparently did not even talk about it, so as not to alarm the pedophiles, and this silence could itself be taken as proof. Those who could be bothered to disprove the claim were probably trying to defend pedophiles, and therefore not trustworthy.

These situations resemble the problem of disproving climate change denial.

If a person assumes that nearly all climate scientists are lying or conspiring to harm them, then there can be no disproof. Anyone who tries to disprove the denial by pointing out ‘facts’ is either part of the conspiracy, or a dupe repeating the scientists’ false information. How do you disprove the assertion that nearly all climate scientists are lying, to a person who holds that proposition to be more probable than its opposite?

This energy cost of maintaining a “true position” makes what I’ve called “information groups” even more important: groups that filter out information rejected by the group, condemn outsiders who disagree, and propagate the misinformation that the group lives by, and identifies with.

Other Formulations

My earliest formulation of a similar position was what I called Gresham’s Law of information: “bad information drives out good”. This is partly because bad information is plentiful [it is easy to manufacture], while people may want to hoard and hide good information to give themselves an advantage, or it gets lost in the ether [entropy]. But this is nowhere near as elegant, or as explanatory, as Brandolini’s Law.

Earlier formulations include this from Jonathan Swift:

Besides, as the vilest Writer has his Readers, so the greatest Liar has his Believers; and it often happens, that if a Lie be believ’d only for an Hour, it has done its Work, and there is no farther occasion for it. Falsehood flies, and the Truth comes limping after it; so that when Men come to be undeceiv’d, it is too late; the Jest is over, and the Tale has had its Effect…

Quote Investigator: “A Lie Can Travel Halfway Around the World While the Truth Is Putting On Its Shoes”

The obvious point here is that human energy use always involves time. Information takes time to discover and test, and it needs to be present at the moment it is needed. Misinformation can have its intended effect, and by the time it is satisfactorily refuted, it is too late. Again we can see this with climate change denial, where the claim has become: it now seems too late to do anything effective about climate change, so let’s not bother.

Slightly later we have George Horne:

Pertness and ignorance may ask a question in three lines, which it will cost learning and ingenuity thirty pages to answer. When this is done, the same question shall be triumphantly asked again the next year, as if nothing had ever been written upon the subject. And as people in general, for one reason or another, like short objections better than long answers, in this mode of disputation (if it can be styled such) the odds must ever be against us; and we must be content with those for our friends who have honesty and erudition, candor and patience, to study both sides of the question.

Horne, “Letters on Infidelity”, Letter VIII, pp. 146–7

Horne points out another problem, one that is even more common in the information age: disinformation never dies. It can be reprised with ease, in a slightly different form if necessary. And in the unlikely event that the person who revives the disinformation wants to find something more accurate, it will take them far longer to locate and read the refutation (assuming the refutation is good in the first place). The short punchy lie is always easier to grasp than the lengthy refutation.


Brandolini’s Law is a succinct and explanatory formulation with great relevance for the modern information society.

There are two big questions it raises:

  1. Given the huge (and probably increasing) amounts of energy it takes to maintain a shared sense of the universe in a large society, and to keep people well informed about reality and responsive to events in it, is it inevitable that such societies will fragment into factions pushing their own truths and ignoring what is happening, until they collapse?
  2. What can we do to lessen the law’s effects, so that we can resurface from being buried under disinformation and misinformation?