Scott Alexander is Smarter Than Me. Should I Steal His Beliefs?
Arguments are overrated; you can find truth without them
It’s no secret I’m a Scott Alexander glazer. Scott once said that he was an embarrassing fanboy of Eliezer Yudkowsky, and that it may be his fate to have embarrassing fanboys of his own one day. Well, the bell tolls. I find Scott consistently interesting and intelligent, and he connects topics to one another in a way I’ve never seen from anyone else. He seems more dedicated to truth than anyone else I know.
So, he’s smarter than me, a better thinker than me, and he’s spent a lot of time on many different topics trying to find the truth. As the smartest guy I’ve found, should I steal all of his beliefs indiscriminately? Legitimately. If a person only cares about having the most correct beliefs, which feels like a reasonable goal, is finding the smartest person you know and stealing his beliefs a good idea?
Down the rabbit hole
Epistemology and epistemic practices are the study and methods of finding true things[1]. For the vast majority of issues — in politics, religion, psychology, and philosophy — where Scott’s gotten his beliefs through fantastic epistemic practices and I’ve gotten mine from random sources and friends and biases I can’t even remember, does it make sense to copy everything?
Well, it seems obvious that there are some issues I shouldn’t copy from him — if there’s an issue where I’m an expert and he is not, then I probably shouldn’t steal his beliefs. But for the rest, I can treat the very act of him having a belief as very strong Bayesian[2] evidence in favor, and update hard towards his position, because I know he’s spent a lot of time thinking about any given topic trying to reach the truth, and I know I’m more biased!
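For the record, here’s what that update looks like in the odds form of Bayes’ rule — with toy numbers I’m making up purely for illustration, not anything measured about Scott:

```latex
\underbrace{\frac{P(X \mid \text{Scott believes } X)}{P(\lnot X \mid \text{Scott believes } X)}}_{\text{posterior odds}}
= \underbrace{\frac{P(X)}{P(\lnot X)}}_{\text{prior odds}}
\times
\underbrace{\frac{P(\text{Scott believes } X \mid X)}{P(\text{Scott believes } X \mid \lnot X)}}_{\text{likelihood ratio}}
```

If I start at 1:1 odds and guess that Scott ends up believing claims like X 90% of the time when they’re true but only 10% of the time when they’re false, the likelihood ratio is 9, and I jump to 9:1 — about 90% confident — before hearing a single argument.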
Alright, to back off from the Scott glazing: am I too preoccupied with having my own, interesting opinions, and should I just change ALL my beliefs to what the smart people I respect think? Fully adopting someone else’s opinions without understanding them can be a dangerous game. Without understanding why someone believes X, you don’t actually know which way new evidence should swing you, or by how much. But I find it very easy to believe that if 99% of people just fully adopted Scott’s opinions, they would be much more correct. And if rationality and epistemology aren’t about being correct, what even are they about? Those people could still try to learn and understand the arguments, and once their expertise eclipsed Scott’s they could weight their own views more heavily — but the point of epistemology is to be correct.
But of course, most people don’t think as highly of Scott as I do. Plenty of people are mirrors for the beliefs of a political figure, or even more commonly, their political side. If I could convince them to copy Scott’s epistemic gusto, they’d be more correct — but they would need to think they should trust me, which brings a whole host of issues with it. I’m using my own shaky epistemic tools to decide that Scott is the guy I should copy; if I can already tell he’s that much better a choice than, say, Donald Trump, most of the work of stealing his beliefs is already done.
Eliezer has an old, relevant chestnut in this post about how authority and argument are two very different types of evidence. It’s only if you don’t understand an argument, or don’t attempt to, that authority becomes relevant. If you fully grasp the argument, and judge it to be sound, then the speaker who said it becomes only a footnote. But I feel that I’m able to understand most of Scott’s posts, which is how I decided he was such a good candidate for this thievery to begin with! It seems like if there were a hypothetical thinker X, with even better epistemic practices than Scott, who didn’t care enough to dumb his arguments down to something heathens like me could understand, I’d be out of luck.
Wait a second, Scott’s not a computer scientist or political scientist, he’s a writer
Now’s about the time where I point out that as great as Scott is, he is first and foremost a writer, not an expert in any of the fields he talks about (except psychiatry). His best skill is weaving words into something interesting, which makes his points stick out in my brain like a sore thumb. So I’m basically doing what the person considering becoming a mirror to Trump is doing: finding the most charismatic person I know and stealing all his beliefs. It’s just that my definition of charismatic is poisoned by wanting to feel like I’m learning.
If someone said “hey, your posts are insightful, I’m going to blindly steal all your beliefs” I would implore them to steal Scott’s instead! But Scott himself is certainly not the smartest person that SCOTT knows, as he says here in a post where he admits he always struggled with math:
Every so often an overly kind commenter here praises my intelligence … But at my level, I spend my time feeling intellectually inadequate compared to Scott Aaronson.
Scott Aaronson describes feeling ‘in awe’ of Terence Tao and frequently struggling to understand him.
Terence Tao – well, I don’t know if he’s religious, but maybe he feels intellectually inadequate compared to God.
So wait a second — I’m at the bottom of this long totem pole of smartness. I’ve barely read anything from Aaronson or Tao! Okay, we’re on this truth hunt; let’s cut out the middleman and just steal Terence Tao’s beliefs. But Tao is private about most of his personal beliefs. As an example, YIMBYs are people who want to build more houses and have less housing regulation, and their naysayers are the dastardly NIMBYs — a niche political topic! What do I do when I want an opinion on YIMBYism versus NIMBYism, and I go up the totem chain, and Tao has spent his time on math stuff, and doesn’t even have anything public about his thoughts on a niche topic?
Screw it, Tao isn’t a political scientist; why would I trust him there? Why don’t I just find a smart person in every field and steal their beliefs? I’ll find a smart political scientist, a smart philosopher, and a smart AI expert. But it becomes immediately obvious that I’m not smart enough in fields I don’t understand to determine who’s actually smart and who’s bullshitting. Let’s say I don’t understand any political science — how could I tell the difference between Curtis Yarvin and, uh, I guess literally any other Substack political commentator? If I don’t even have a basic grasp of the material, I’m screwed. But actually, experts in a topic do tend to be right when they have a consensus — say, climate change existing versus not — so trusting the experts is a great heuristic. And if an issue is contentious, maybe trusting the majority is best? Once again, if I’m looking for a truth shortcut, the side with 60% seems better than the side with 40%!
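There’s even real math behind that instinct. Here’s a quick toy simulation — my own sketch, not anything from Scott — of the classic Condorcet jury theorem: if each expert independently has a better-than-chance shot at being right, a majority of many experts is right far more often than any one of them.

```python
import random

# Toy Condorcet-jury simulation: n independent "experts" each get the
# answer right with probability p. How often is the majority vote right?
def majority_accuracy(n_experts: int = 101, p_correct: float = 0.6,
                      trials: int = 10_000) -> float:
    wins = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < p_correct
                            for _ in range(n_experts))
        if correct_votes > n_experts / 2:
            wins += 1
    return wins / trials

print(majority_accuracy(n_experts=1))    # ~0.60 — one expert alone
print(majority_accuracy(n_experts=101))  # ~0.98 — majority of a big panel
```

The catch, of course, is the independence assumption: real experts read the same papers and each other, so a 60/40 split among them is weaker evidence than this toy model suggests.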
So throw out the idea of finding one specific guy who’s really smart in a topic; let’s steal the expert consensus on everything. Except how am I supposed to know who the experts are? If everyone working in a field counts, you’re gonna get a lot of stupid people. Now we’re stuck between one person being too few and a consensus of everyone giving college-dropout interns in the field a say. But maybe it all just averages out?
Maybe I need a consensus of the top 5 experts in a field, chosen by experts in the field. But that’s stupid; the people doing the choosing are just gonna choose their friends, and now I need experts to pick the experts.
Screw it, the experts can’t be trusted; just get a consensus of the people. Hah! Are you stupid? That’s way worse. Steve Kirsch has 250,000 Substack subscribers! Everybody is stupid except me! Maybe I’m just uniquely suited to finding truth in the universe because I got a 1480 on the SAT.
Wait a second, a 1480 literally isn’t even that high. Smart people? Let’s poll smart people! What’s the consensus of everybody who’s got a 1600 on the SAT on every topic? They can’t go wrong. Where can I find this? Do I need to create a database and find out what the smartest people think to find truth?
This is all going wrong. All I want is a fast track to truth without having to understand the arguments for every single topic. Wait, maybe epistemology and Bayes’ Rule are the problem. Who decided that the truth needed to be backed by statistics? Maybe the universe operates outside numbers. Religion! The majority of the US is religious! I’m not, but if we’re trusting the masses, what does God think? Err, what’s the consensus of people who talk to God about what God’s beliefs are? Does God think I should be a YIMBY? Maybe I operate in a privileged place in the universe, and epistemology doesn’t even work for me because of something something anthropic reasoning. What’s the best way to get God to speak to me about whether I should support Israel or Palestine? How can I find only the true prophets?
No! The prophets have no clear miracles! Why am I assuming other people are even real, or that my beliefs are real? What’s a belief anyway? If my conscious experience right now isn’t experiencing the belief — it’s just somewhere else in my brain — am I experiencing it? What if solipsism is true? Maybe this is a religious test in a simulation or something, to see if I can discover NIMBYism off of pure faith without any evidence. Can God create another God so powerful he cannot control him? Can I control my own brain? How can I be a YIMBY if the fabric of time travelling from the past into the future cannot even be independently verified? What if I’m a Boltzmann brain and statistics and reality and truth and time are gonna break down? I’ll give it 3… 2… 1…
RAHHHHHHH
This is all ridiculous, right?
Okay, back up. I hope you realize this is all insane. It’s really, really hard to get a fast track to truth without understanding the arguments. A cursory understanding isn’t as good as a deep one, but it’s better than nothing. Scott Alexander is valuable because he explains the arguments so well. And, of course, trusting the experts is usually very good advice, and a good heuristic to have. If the experts all disagree with you, you’re probably wrong. On contentious issues, weigh the sides, the number of people for or against, and the arguments, and the world will mostly make sense. And because many people have an incentive, and the ability, to position their false beliefs as the expert opinion, arguments and statistics remain necessary. As a closer, I’m reminded of one of my favorite parables, about smallpox:
Louis XV died of smallpox in 1774. He had all the power, money, and resources in the world, yet he met his fate all the same. Was he truly doomed? Was it an inevitable, unavoidable tragedy?
Nay; three months before his death, a lowly dairy farmer across the Channel in England braced his family as smallpox ravaged his town. Luckily, it was folk wisdom that cowpox, a relatively mild affliction, made you immune to the much more devilish smallpox. He took his family to his cows, rubbed cowpox pus into their arms, and they were saved from that terrible fate for the rest of their lives.
Louis XV’s fate was sealed not by a lack of resources but by a lack of knowledge; he would have had no way to distinguish true knowledge from the snake oil salesmen and faith healers that surrounded him. The only path is true understanding.
[1] Alright, you got me: epistemology is the study of knowledge, interested in the difference between true beliefs and justified beliefs, how they differ from opinion, etc., etc. You’re damn right that I will abuse it in this article to mean “good at finding true things”, because it’s a fancy way to say something that resembles that. Also, the word is fun to say. Ep-eh-stem-ick.
[2] Bayes bayes bayes bayes bayes. If you say it 5 times it almost doesn’t sound like a word. The word is actually doing no work in that sentence and can be ignored. Also, I’m kinda abusing Bayes here — under the rule itself you can’t be under- or overconfident given your priors — but tuning your methods to find truth is fine! But yes, fine, it’s “one of the most important equations of all time”. Yes, it’s “a mathematical way to explain how to accurately update beliefs so you can find truth”. I know, I know, I know. Scott’s bio literally is just Bayes’ rule, so if I didn’t include it, given how important it is, I’d basically be disrespecting Scott.
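So, to make amends, here’s the textbook form of the rule itself:

```latex
P(H \mid E) \;=\; \frac{P(E \mid H)\, P(H)}{P(E)}
```

In words: your belief in hypothesis H after seeing evidence E is your prior belief in H, scaled by how much better H predicts E than the alternatives do.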