Did you read this post from Daniel Muñoz - https://open.substack.com/pub/bigifftrue/p/stop-blocking-everyone-you-disagree?utm_source=share&utm_medium=android&r=1ackrk? It deals with a similar question of where we should source our information and beliefs.
An interesting point it brought out is that the best informed person is not necessarily the most informative. If we just take on the opinions of the best informed, our opinions will be entirely uninformative for those who already have access to their opinions.
It feels kind of paradoxical that we can be smarter collectively by being willing to risk being more wrong individually. But this is kind of the heart of science, evolution, and all progress: come up with new (likely wrong) ideas and put them to the test.
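To make that concrete, here's a toy simulation of the aggregation point (all the noise levels are made up, and this is just a sketch, not anything from the post): a single well-informed expert beats any one independent guesser, but a crowd of independent guessers beats the expert, while a crowd that simply copies the expert adds nothing beyond the expert's own error.

```python
import random

random.seed(0)
TRUTH = 0.7      # the quantity everyone is trying to estimate (hypothetical)
N_CROWD = 1000   # number of independent guessers (hypothetical)
TRIALS = 2000

expert_mse = 0.0
crowd_mse = 0.0
for _ in range(TRIALS):
    # The best-informed person has the least noisy estimate...
    expert = TRUTH + random.gauss(0, 0.05)
    # ...and anyone who defers to them just repeats it, adding no information.
    expert_mse += (expert - TRUTH) ** 2
    # Independent guessers are individually much noisier,
    # but their errors are uncorrelated and average out.
    crowd_avg = sum(TRUTH + random.gauss(0, 0.30) for _ in range(N_CROWD)) / N_CROWD
    crowd_mse += (crowd_avg - TRUTH) ** 2

print(f"expert (and anyone copying them), MSE: {expert_mse / TRIALS:.5f}")  # ~0.0025
print(f"average of independent guesses,   MSE: {crowd_mse / TRIALS:.5f}")   # ~0.00009
```

Each guesser is far more likely to be wrong than the expert, but collectively they carry more information, which is exactly the paradox above.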
Yep I did read it! Good post good post, the masses indeed lead us along the search for truth.
There’s a good LessWrong post about this too: how you can waste 20 years of your life on a totally wrong idea and it will still contribute to society. But it’s a bit cruel to be encouraged to waste 20 years of your life on a false idea.
https://www.lesswrong.com/posts/PGfJdgemDJSwWBZSX/science-isn-t-strict-enough
I wrote a pretty similar article recently about this general subject (how to determine truth and to what extent one should put their faith in "the experts"): https://glasshalftrue.substack.com/p/science-is-a-liar-sometimes
I agree that actually, fully understanding something is the best way to reach the truth, rather than deferring to experts. But at the same time, practically speaking you will not have the wherewithal to personally vet 99.99% of your knowledge, so there will always inevitably be a few false beliefs that sneak in; the best you can do is have epistemic humility and be willing to continuously re-evaluate what you think you know and who you trust.
Love this!
Persuasiveness is a powerful trait, and intelligent, persuasive people are a little scary because it's so easy to fall into a bit of a trance from their rhetorical skill and their smooth explication of the best arguments [for their side]. *Especially* for subjects out of your own wheelhouse.
Hell, I find myself too easily convinced by smart people making good arguments in fields that aren't my own and it takes a lot of effortful, System 2 engagement to stop myself from simply adopting the beliefs so clearly explicated for me.
What I love about Scott and many others in the SSC/ACX commentariat (hehe) is the combination of:
(1) attempting to steelman other positions [and expressing that value directly],
(2) explicit willingness to change / acknowledge error [and expressing that value directly], and
(3) cultivating a community of "check my work please" and in-depth arguments in the comment section [and expressing the value of poking holes in each other's thoughts with civility].
It's similar to why I'm a scientist: science in practice is not always perfectly self-correcting, and peer-review / conferences aren't a pure quest for truth above all by any means, but the enterprise is at least formed around those values. (Am I a science glazer now? Is this glazey enough? Am I doing it right??)
I recently wrote an article about not deferring to people smarter than you.
https://n328kf.substack.com/p/conservatives-against-intellectuals
The subtitle is "I'll never be as smart as Scott Alexander."
Regarding your claim that once you understand an argument, authority is a footnote: I think this neglects the possibility that you might think you understand an argument while not actually getting it. Pretty much nobody who misunderstands an argument, or thinks they have rebutted it when they haven’t, believes that they don’t understand the argument. For example, back in the day, I used to be confident that I had seen the problems with the argument from fine-tuning, and therefore that the actual values of physical constants weren’t a problem I needed to worry about. I still think it doesn’t work as an argument for God, but now that I have a better grasp of anthropics, I think I was clearly in error when I thought I had seen why the argument was mistaken. Basically, if you and a super smart person disagree about whether an argument is correct, you should, all else equal, assume that the smarter person is correct.
I do think Scott is a poor example, because while I’m pretty confident he is smarter than me (and possibly you), his actual performance in his prediction contests makes me think you’d be better off consulting the opinion of superforecasters. Of course, as a practical matter, I notice that I feel comfortable disregarding their opinions where I think the reasoning is clearly wrong, for example on existential risk from artificial intelligence. Similarly, I think many-worlds is correct, even though the community of physicists, who have probably forgotten more on the topic than I remember, is sharply divided on this subject. This does indicate that subconsciously my brain still thinks it’s knowledgeable enough to judge these topics to some extent, but I notice I thought the same back in my anti-string-theory phase, and in retrospect I was clearly not thinking clearly in that period and definitely wasn’t as knowledgeable as I needed to be to form confident opinions. I think in practice you need to give substantial weight to authority when reasoning about a topic, but I don’t think I’m making a mistake when I decline to give authority infinite weight and let sufficient evidence from my own thinking overrule it. Of course, this could be a mistake, and I notice most people have overly confident opinions on topics where they have done minimal research and thinking, even though they are aware that experts disagree. Think of topics like the minimum wage, where people will actively get angry at the suggestion that it’s not definitely a good idea, even though professional economists disagree on the topic. Of course, I notice I myself have several topics on which I think expert disagreement is irrelevant, like the non-existence of libertarian free will or my belief that electrons aren’t conscious.
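One way to make "substantial but finite weight" precise is to blend your own estimate with the expert consensus in log-odds space, weighted by how much you trust each source. A minimal sketch, with every number hypothetical:

```python
import math

def prob_to_logodds(p: float) -> float:
    return math.log(p / (1 - p))

def logodds_to_prob(x: float) -> float:
    return 1 / (1 + math.exp(-x))

def combine(own: float, expert: float, own_weight: float, expert_weight: float) -> float:
    """Weighted average in log-odds space. Finite weights mean authority
    pulls hard on the blend, but enough evidence of your own can still
    move it; an infinite expert_weight would be pure deference."""
    return (own_weight * own + expert_weight * expert) / (own_weight + expert_weight)

# Hypothetical numbers: I put a claim at 80%, the expert consensus puts it
# at 30%, and I trust the experts three times as much as my own reasoning.
blended = combine(prob_to_logodds(0.80), prob_to_logodds(0.30),
                  own_weight=1.0, expert_weight=3.0)
print(f"blended belief: {logodds_to_prob(blended):.2f}")  # -> about 0.43
```

With finite weights, a large enough gap between your own evidence and the consensus can still move the blend substantially; complete deference is just the limit where the expert weight goes to infinity.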
I absolutely agree that you can’t judge who is an expert without already having a good understanding of the topic. As Scott noted, if you are a young-earth creationist, you think the relevant experts are fundamentalist preachers, and listening to the experts can’t possibly save you then. In fact, very often you’ll judge whether you can trust the experts by looking at whether you think they come to correct conclusions, in which case you can’t then turn around and use their conclusions to form your own. At the very least, even if you don’t want to judge them by their conclusions, you have to judge them by something like the quality of their arguments, in which case you’re already filtering for people who agree with you about which arguments are valid.
In general, the actual solution most people in our position appear to settle on is: generally trust the experts, but feel free to disagree with them if you have thought about the topic sufficiently. Of course, I notice that many of these same people are much more absolute in their trust of the stock market, even though it effectively works by aggregating the knowledge of society, which these selfsame people feel free to second-guess in other situations. Given that these people end up effectively disagreeing with the stock market on various things like artificial intelligence, perhaps this is evidence they should trust the stock market less. Although I would note here that, even if it leads to less accurate individual beliefs, some willingness to ignore authority is desirable, because the disagreement and discussion that follow will generally make society more accurate in its beliefs, even if individual members are pushed further from the truth. Though I do think that, outside of expert communities, on the margin we have too much willingness to ignore authority. Also, to be fair, while the arguments for complete adherence to authority can be philosophically strong, most people can’t actually follow such a principle in practice.