Who to ignore (3 bullshit filters)

By Gabriel and Ernest Oppetit

Unforeseen shit happens. 9/11, the 2008 crash, Brexit, Trump, COVID-19… you know the story.

These events surprised us because our information diets are dominated by people who didn’t see them coming. Many of these people are paid for sounding good (or protecting a certain interest), not for being right, so unfortunately we still hear just as much from them.

So it’s on us as readers to improve our information diet to get better commentary and predictions. “Listen to the experts” doesn’t cut it: of course we should listen to them by definition, but finding them by title or pedigree within specific disciplinary boundaries is what got us to where we are… so again, it’s on us readers to find the experts.

Spoiler alert: we haven’t found them. What we have started to do is figure out predictors for catching people who get it wrong a lot and are causing harm to our information diets. “Primum non nocere” – “first, do no harm”, as the Ancients said – then we can look to supplement our diets with better inputs.

So, here’s a set of bullshit-detection filters, through which you can pass any new prediction or pundit. These are heuristics, not hard-and-fast rules. They definitely shouldn’t replace objective evaluations of arguments you hear, regardless of who they might come from!


Bullshit filter #1: Are they free to speak their minds?

“Don’t ask a barber if you need a haircut”

Who/what pays the bills? How can you know if they really believe what they’re saying?

  • If they are in an institution – who funds it? To what extent could they deviate from the “party line” of that institution? For example, how free is the Education Ministry to say that the system is in dire need of reform?
  • If they are in a company – what are the incentives of the company? Who controls it and has a fiduciary duty to shareholders (to maximise enterprise value)? For example, how free would Nike be to say that running in shoes with a much smaller, “low-tech” sole is better for one’s health? (This example happened and is well told in Born to Run)
  • As an individual, what are their incentives? Do they own positions or business interests they might be “talking up”? Is an angel investor in Revolut going to share any negative news about the company, or only the positive news they are incentivised to share?

Apply additional scrutiny and question motives based on these answers.

Conversely: are they going against the “party line” of the company/institution/community they’re in, and taking on risk as a result? (“whistleblowing” being the extreme version of that). This could provide additional signal that they really believe what they’re saying.


Bullshit filter #2: Does their livelihood depend on being right?

“There are two kinds of people: those who like to win, and those who like to win arguments”

Nassim Taleb

What happens to them if they’re wrong? If they made a spectacularly wrong prediction, would they be likely to lose their job? Or is their job tied to other indicators, such as readership or viewership, rather than the “correctness” of their analysis?

The classic example here is “talking heads” – full-time columnists, pundits, interviewers, bloggers – whose careers can thrive independently of the correctness of their predictions. Paul Krugman comes to mind. His extremely wrong predictions about the Internet, and about the impact of Trump’s election on the economy, have not cost him any stature: he maintains his NYT column and high readership. (Note: as Superforecasters will tell you, there is still value in listening to them for the questions and scenarios they raise, but their predictions should be heavily discounted.)

A telling sign of this group is that they often shy away from providing testable predictions, which could be unambiguously disproven later. Keeping things vague provides enough wiggle room to maintain an “I told you so” stance.

Beware of reading too much into predictions from this group: remember they got to where they are more from sounding good than from being right. 

Conversely: listen more carefully to people whose livelihood depends on being right and who have been provably right a lot (this is the well-publicized “skin in the game” factor). The easiest proof is people who have been paid by the market for being right: they took on a position or built a service, and the market decided over sufficient time on the correctness of that position or value of the service. (One interesting side-effect of this group is they tend not to be confined to academic boundaries: they follow the questions wherever they lead because they’re chasing an outcome, not recognition in a given field.)
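To make “provably right a lot” concrete, one option is to count only predictions that were stated precisely enough to be scored after the fact – the way forecasting tournaments do with Brier scores. Here is a minimal sketch of that idea in Python; the function and the two hypothetical track records are invented purely for illustration.

```python
# Minimal sketch: scoring only *testable* predictions with a Brier score.
# Each entry is (stated probability that the event happens, whether it happened).
# Vague, unfalsifiable claims never make it into the list in the first place.

def brier_score(predictions):
    """Mean squared error between stated probabilities and outcomes.
    0 is perfect; always answering 50/50 scores 0.25."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in predictions) / len(predictions)

# Hypothetical track records, purely for illustration
talking_head = [(0.6, True), (0.6, False), (0.55, True), (0.5, False)]  # hedged, near-50/50 calls
skin_in_game = [(0.9, True), (0.2, False), (0.8, True), (0.1, False)]   # confident, resolvable calls

print(round(brier_score(talking_head), 3))  # 0.243 – barely better than always saying 50/50
print(round(brier_score(skin_in_game), 3))  # 0.025 – demonstrably right a lot
```

The exact formula matters less than the precondition: a prediction can only be scored if it was unambiguous when it was made.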

Side-note: it’s likely that some predictions are deliberately wrong. When Elon Musk says “we will have full self-driving robo-taxis in 2 years” or “we will be on Mars in 10”, it seems likely to us that he doesn’t truly believe it, but he is incentivised to say it. The cost is negligible (a few bad headlines, perhaps) but the upside is high: it motivates the troops and helps with branding & recruiting.


Bullshit filter #3: Do they take cyberspace seriously?

“The spread of computers and the internet will put jobs in two categories: people who tell computers what to do, and people who are told by computers what to do.”

Marc Andreessen 

This criterion is a little less conventional, and more of a personal belief… so hear us out. 

We are living in an age in which the balance of power is being overturned by information technology.

In other words: geeks will inherit the Earth. We are moving from the Industrial Age to the Knowledge Age, and from scarcity of Capital to scarcity of Attention. Software and the Internet enable “permissionless innovation”, where the best product wins and distribution costs almost nothing: this is changing the world.

It’s important to listen to people who know that this is happening and take it very seriously. They have a deep appreciation and respect for the engineer’s “craft”, and they are either building for this new world or building networks and information diets to stay close to these developments.

It’s such an important change in society that the opinions and predictions of those who don’t get this, or who are nostalgically “wishing it away”, should be discounted.


Conclusion

We’ve laid out a few heuristics we think are helpful to keep in mind when evaluating arguments and predictions, and to construct a healthy information diet; a toy sketch pulling them together follows the list.

  1. As Munger said: “Show me the incentive and I’ll show you the outcome.” Be wary of views coming from people who are not free to speak their minds on the topic at hand, because of their existing incentives.
  2. Favour views from those who get paid for being right; discount views from those who get paid for sounding right.
  3. The balance of power is being overturned by information technology. Favour people who deeply understand it and work it into their worldview.
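
Putting the three filters together, here is a toy sketch in Python of what that checklist might look like. The field names and the crude “filters passed” mapping are entirely our own invention for illustration, not a validated model.

```python
# Toy sketch: the three bullshit filters as an explicit checklist.
# Entirely illustrative – the point is to force yourself to answer the
# three questions before giving someone's prediction any weight.

from dataclasses import dataclass

@dataclass
class Source:
    name: str
    free_to_speak: bool         # Filter 1: no paymaster or "party line" constraining them on this topic
    paid_for_being_right: bool  # Filter 2: livelihood depends on correctness, not on sounding good
    takes_tech_seriously: bool  # Filter 3: understands the shift driven by software and the Internet

def weight(source: Source) -> str:
    """Crude mapping from the number of filters passed to how much weight to give them."""
    passed = sum([source.free_to_speak,
                  source.paid_for_being_right,
                  source.takes_tech_seriously])
    return {3: "listen carefully", 2: "listen, but verify",
            1: "heavily discount", 0: "ignore"}[passed]

# Hypothetical example, purely for illustration
pundit = Source("generic talking head", free_to_speak=False,
                paid_for_being_right=False, takes_tech_seriously=True)
print(weight(pundit))  # -> "heavily discount"
```

Nobody should literally score pundits with a script, of course; the value is in making the three questions explicit rather than implicit.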

The good news is that lots of writers are going solo as the distribution cost of media has gone to zero. So (1) we have more writers who are their own boss, whose incentives are less likely to get polluted by bullshit filter #1, and (2) we can more easily follow writers (eg Bill Bishop) instead of institutions (eg the NYT), which allows for more granular filtering.

Happy curating!

PS: As a follow-up we’ll post a list of people who we’ve found consistently pass these filters and have been righter than most on a few big predictions/events.

10 comments

  1. Funny, your lack of trust in institutions is alarming given they get paid more for their content than experts on the fringe like Bill Bishop. Granted, the gravitas behind individual brands varies, but is the difference enough for you to ignore/mistrust the NYT on China?

  2. The mental heuristics here are very similar to those I’ve picked up listening to the No Agenda show over the years. (https://www.noagendashow.net). Always questioning the motivations of media publishers and authors is one of the core tools the hosts, Adam Curry and John C Dvorak, use when deconstructing and analyzing the media, and the name No Agenda is fitting, as they don’t have advertisers that they’re beholden to (the show is entirely supported by donations).

  3. I don’t know what to believe!

    Maybe the misconception is that some people consistently predict the future correctly – when in fact nobody can.

    Only four funds have outperformed the FTSE in each of the last 5 years, even though every single fund has been trying its utmost to deliver this aim.

    https://www.trustnet.com/news/7461432/the-four-uk-funds-outperforming-the-ftse-all-share-in-each-of-the-past-five-years

    Likewise, some companies will have outperformed their respective markets (Wal-Mart and NYSE for example) in the long run, but ask the people who run WMT how covid-19 will play out, or how the Brexit transition will go, or who will win the next US election, and my reckoning is that they will be only as correct as the next man.

    Maybe rather than looking for people who are less wrong, we should remind ourselves that everyone is wrong 50% of the time. Maybe more. Even those people who sound sooo legit, and have been right the last 100 times.

    Similarly – as you point out – it’s good to remember that oftentimes those who look like they are trying to predict the future (so might have a 50% success rate) have incentives to predict it wrong (that 50% could fall to 0%!)

    Bleakness.

    H
