By Gabriel and Ernest Oppetit
Unforeseen shit happens. 9/11, the 2008 crash, Brexit, Trump, COVID-19… you know the story.
These events surprised us because our information diets are dominated by people who didn’t see them coming. Many of these people are paid for sounding good (or for protecting a certain interest), not for being right – so unfortunately we still hear just as much from them.
So it’s on us as readers to improve our information diets to get better commentary and predictions. “Listen to the experts” doesn’t cut it. Of course we should listen to them by definition, but finding them by title or pedigree within specific disciplinary boundaries got us to where we are… so again, it’s on us readers to find the experts.
Spoiler alert: we haven’t found them. What we have started to do is identify markers of people who get it wrong a lot, and who are harming our information diets. “Primum non nocere” – “first, do no harm”, as the Ancients said – then we can look to supplement our diets with better inputs.
So, here’s a set of bullshit-detection filters, through which you can pass any new prediction or pundit. These are heuristics, not hard-and-fast rules. They definitely shouldn’t replace objective evaluations of arguments you hear, regardless of who they might come from!
Bullshit filter #1: Are they free to speak their minds?
“Don’t ask a barber if you need a haircut”
Who/what pays the bills? How can you know if they really believe what they’re saying?
- If they are in an institution – who funds it? To what extent could they divert from the “party line” of that institution? For example, how free is the Education Ministry to say that the system is in dire need of reform?
- If they are in a company – what are the incentives of the company? Who controls it and has a fiduciary duty to the shareholders (to maximise enterprise value)? For example, how free would Nike be to say that running in shoes with a much smaller and “low-tech” sole is better for one’s health? (This actually happened, and is well told in Born to Run.)
- As an individual, what are their incentives? Do they hold positions or business interests they might be “talking up”? Is an angel investor in Revolut going to share any negative news about the company, or only the positive news they’re incentivised to share?
Apply additional scrutiny and question motives based on these answers.
Conversely: are they going against the “party line” of the company/institution/community they’re in, and taking on risk as a result? (“Whistleblowing” being the extreme version of that.) This can be additional signal that they really believe what they’re saying.
Bullshit filter #2: Does their livelihood depend on being right?
“There are two kinds of people: those who like to win, and those who like to win arguments” – Nassim Taleb
What happens to them if they’re wrong? If they predicted something incredibly wrongly, would they be likely to lose their job? Or is their job tied to other indicators, such as readership or viewership, rather than the “correctness” of their analysis?
The classic example here is “talking heads” – full-time columnists, pundits, interviewers, bloggers – whose careers can thrive independently of the correctness of their predictions. Paul Krugman comes to mind. His extremely wrong predictions, about the Internet or about the impact of Trump’s election on the economy, have not caused him to lose any stature: he maintains his NYT column and high readership. (Note: as Superforecasters will tell you, there is still value in listening to them for the questions and scenarios they raise, but their predictions should be heavily discounted.)
A telling sign of this group is that they often shy away from providing testable predictions, which could be unambiguously disproven later. Keeping things vague provides enough wiggle-room to keep an “I told you so” stance.
Beware of reading too much into predictions from this group: remember they got to where they are more from sounding good than from being right.
Conversely: listen more carefully to people whose livelihood depends on being right and who have been provably right a lot (this is the well-publicized “skin in the game” factor). The easiest proof is people who have been paid by the market for being right: they took on a position or built a service, and the market decided over sufficient time on the correctness of that position or value of the service. (One interesting side-effect of this group is they tend not to be confined to academic boundaries: they follow the questions wherever they lead because they’re chasing an outcome, not recognition in a given field.)
Side-note: it’s likely that some predictions are deliberately wrong. When Elon Musk says “we will have full self-driving robo-taxis in 2 years” or “we will be on Mars in 10”, it seems likely to us that he doesn’t truly believe it, but he’s incentivised to say it. The cost is negligible (a few bad headlines perhaps) but the upside is high: it motivates the troops and helps with branding & recruiting.
Bullshit filter #3: Do they take cyberspace seriously?
“The spread of computers and the internet will put jobs in two categories: people who tell computers what to do, and people who are told by computers what to do.” – Marc Andreessen
This criterion is a little less conventional, and more of a personal belief… so hear us out.
We are living in an age in which the balance of power is being overturned by information technology.
- Some kids (early teens) are making 100s of thousands of dollars programming games on Roblox
- 100s of people are letting an open-sourced AI (built by a team of 12 people in California) hack into their cars’ control systems to drive them around
- We’re on the cusp of making photorealistic digital humans, an important building block of the Metaverse
In other words: geeks will inherit the Earth. We are moving from the Industrial Age to the Knowledge Age, and from scarcity of capital to scarcity of attention. Software and the Internet enable “permissionless innovation”, where the best product wins and distribution costs are very low: this is changing the world.
It’s important to listen to people who know that this is happening and take it very seriously. They have a deep appreciation and respect for the engineer’s “craft”, and they are either building for this new world or building networks and information diets to stay close to these developments.
It’s such an important change in society that the opinion and predictions of those who don’t get this or are “wishing it away” nostalgically should be discounted.
We’ve laid out a few heuristics we think are helpful to keep in mind when evaluating arguments and predictions, and to construct a healthy information diet.
- As Munger said: “Show me the incentive and I’ll show you the outcome.” Be wary of views coming from people who are not free to speak their minds on the topic at hand, because of their existing incentives.
- Favour views from those who get paid for being right, discount views from those who get paid for sounding right.
- The balance of power is being overturned by information technology. Favour people who deeply understand it and work it into their world view.
The good news is that lots of writers are going solo as the distribution cost of media has gone to 0. So (1) we have more writers who are their own boss, whose incentives are less likely to get polluted by bullshit filter #1, and (2) we can more easily follow writers (eg Bill Bishop) instead of institutions (eg the NYT), which allows for more granular filtering.
PS: As a follow-up we’ll post a list of people who we’ve found consistently pass these filters and who have been righter than most on a few big predictions / events.