Brandolini's Law in the GPT Era

Posted on Jul 28, 2024

I went from not knowing what GPT stood for to using such models daily in just a couple of years, and I am not even an “AI engineer”. After nearly 15 years paying the rent doing software engineering, I am stuck in an abusive love-hate relationship with generative AI - shouldn’t I have known better?

When I test models, I ask them tricky and complex questions, such as “list 10 things about golang preemption that no one knows about”. I tried this one on Claude 3.5 a few weeks ago, and it almost got me, even though I consider myself pretty knowledgeable about niche Go topics. It came up with an obscure mechanism I had never heard of, with a nice chart on the side. It was so well explained and so believable that I ended up digging through the Go source code to check whether it was true. Of course, it wasn’t.

The bullshit asymmetry principle, or Brandolini’s law, was formulated by Alberto Brandolini in 2013, in a tweet that went viral:

The bullshit asimmetry [sic]: the amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.

Alberto Brandolini, 2013

This has never been more true than today, in the GPT era. Producing bullshit now takes a few keystrokes and costs a fraction of a cent. The energy needed to refute it, as models get better and better, is going through the roof, even for experts in the field.

Copilot in my IDE helps me go faster through repetitive lines, but it also distracts me with its constant stream of dumb suggestions. Damn, it does not get things right most of the time - and even when it spits out exactly what I was thinking, I still read it twice to check that it is not bullshit. And yet, when I turn it off, I feel like I am missing something, like I am searching for my words. Has my brain become lazy? Is that what GPT withdrawal feels like?

To help organize my thoughts, I mainly use ChatGPT nowadays. Hate it or love it, it is a powerful tool. Do you remember life in the 2000s? This is fuck*ng alien technology. But it’s not coming after your job any time soon. I took the text you just read, up to the previous sentence, gave it to a custom GPT that has access to all my previous posts, and asked it to predict the next 100 words:

Generative AI tools have transformed our workflows, providing both convenience and new challenges. Despite the impressive capabilities of models like GPT and Copilot, their suggestions require careful scrutiny. This balance is crucial: we must harness AI’s power without becoming overly reliant on it. It’s essential to maintain our critical thinking skills and verify AI-generated content. These tools should augment, not replace, our expertise. As AI continues to evolve, staying informed and vigilant will help us leverage its benefits while mitigating its risks. The future of work involves collaboration between human ingenuity and AI capabilities, enhancing productivity and innovation.

Glorious ChatGPT, 2024

Does that remotely sound like something I would write? No. It is bland and generic. It does much the same with code as with anything else. Still, I use it often, and it helps me a lot to summarize, organize and connect ideas, and to confront them with the vast knowledge it has access to. But it’s still just an alien stuck in a room.
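If you wanted to reproduce something like this outside the ChatGPT UI, a minimal Go sketch against the OpenAI chat completions API might look like the following. To be clear, this is not what I actually ran: a custom GPT does more under the hood, and the model name, the posts.txt/draft.txt files and the OPENAI_API_KEY variable are placeholders.

```go
// Feed previous posts plus the draft so far to the chat completions API
// and ask for the next 100 words. File names and model are illustrative.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type request struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

func main() {
	posts, err := os.ReadFile("posts.txt") // previous posts, concatenated
	if err != nil {
		panic(err)
	}
	draft, err := os.ReadFile("draft.txt") // the article written so far
	if err != nil {
		panic(err)
	}

	body, _ := json.Marshal(request{
		Model: "gpt-4o",
		Messages: []message{
			{Role: "system", Content: "You write in the style of these posts:\n" + string(posts)},
			{Role: "user", Content: "Predict the next 100 words of this draft:\n" + string(draft)},
		},
	})

	req, _ := http.NewRequest("POST", "https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out)) // raw JSON; the prediction sits in choices[0].message.content
}
```

The point stands either way: with or without my posts stuffed into the context, the continuation comes out sounding like everyone and no one.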

I’ve heard that genAI is a done subject: not a research area anymore, it is being productized, and it won’t get much better than this. Que sera, sera.

And then, there’s search. Although quick to draw criticism, this area sounds promising - Google’s and Bing’s generative search, the SearchGPT experiment and the like. We are still very early in the game, but I see huge potential. With a stricter scope and less room for hallucination, I believe the models we will have in just a few years, if not months, will be capable of delivering a great search experience. But it’s not just about generative AI. This new wave is reshuffling the cards and may open new avenues for changing how search engines gather information.

The web keeps growing vaster and more complex as time goes by. Google nowadays crawls only half to 70% of big websites. Rendering pages is more costly than it has ever been, and it is not getting better. Search engines just cannot keep up with the pace of the web, and we need new ways of feeding the beast. It may well be this wave of change that provokes the next big shift in how search engines consume information, not merely another layer of tech on top of old machinery that struggles to keep up.
