I think part of the situation here is that you are responding to a bit of a troll. Rohan seems very clever at posting things that are subtle bait for engagement (and some, like part 2, are clearly not subtle at all).
It’s possible you noticed this while writing your post, but I suspect he is not arguing in good faith; he is arguing for the sake of arguing - not that this is wrong or anything.
But I feel… a phrase… rising out of my bones…. Don’t feed the trolls! I’ve wanted to comment on posts like his but delete my comments every time, usually once I realize the author isn’t actually interested in becoming more informed so much as in creating a public spectacle.
Yep, I agree that Rohan is probably a bit of a troll. He’s also pretty funny, especially in some of his other posts. But I think trolls who have more influence than you do are fine targets for rebuttal (he got a ton of likes!).
I’m fine with putting on a public spectacle to defend my side, though. It’d be fun if Rohan responded, but this post is like most social media bouts: an attempt to sway readers on the fence to my side. The tone of my article here is definitely pretty snarky, and it isn’t exactly an academic paper, but I will defend the practice of making good-faith arguments to the death with, uh, hopefully mostly good-faith arguments.
Haha, fair enough! I see the appeal for sure and good stuff can come from it. Enjoy the fray, friend! 🍻
I think this sort of demonstrates how differently people tick. Because I thought it was a good article! I don’t think one needs a comprehensive ideological/philosophical framework to say that another one is bad. In fact, I somewhat object to implementing such ideologies and frameworks (yay wuwei and all that). In this regard, pointing out that (prominent) rationalists are kinda weirdos and conduct themselves in ways that may run counter to what they say is an argument against them of sorts.
Take, for instance, thought experiments. I’m not going to argue they aren’t important for public policy makers - especially the mathematical ones. But for most people, including rationalists, they aren’t particularly useful. We shouldn’t be attached to our conclusions there, because doing so could get in the way of our truly treating each other “Good.”
I admit that I shall defend even the most abstract hypotheticals till the day I die. I feel they’re a useful tool for deciding what you prioritize in situations, holding different variables constant, and I actually made a Substack account specifically to defend abstract hypotheticals underneath the Scott Alexander post he mentions (lol). I became an effective altruist from really thinking about them, so I gotta be the #1 defender.
I want to defend rationalists from the sort of “point and laugh” attacks because there are common stories of meeting people at rationalist meetups who say “I’m not a rationalist” seemingly mostly to distance themselves from the negative associations with the label. I think their ideas are mostly good! I’m an idea guy! That means something! Maybe if their PR keeps going down I’ll have to switch to some label that means the same thing without changing my beliefs, but I like the name, too.
I see it sort of as Chigurh’s Maxim: “If the rule you followed brought you to this, of what use was that rule?” He’s the bad guy and I know he was really talking about something else, but I’ve thought of it that way quite a bit.
Because, I mean, if it doesn’t work, why follow it? I don’t think rationalists are quite as unsuccessful as he claims (some do quite well), but he has a point: if the ideology’s exponents aren’t that great, why bother with it?
There is a sort of antirational argument that following your instincts is more useful in some cases. Of course, we know people who have done much worse by following their instincts. I think it all depends on how good your reason and your instincts are, and that’s a case-by-case call everyone has to make on their own.
He has some good points: rationalists do focus too much on thinking and not enough on doing. I posted a weak defense of rationalism on the free article; I may post a weak defense of his article here. I’m kind of halfway between the two of you.
I might actually drop my critiques of rationalism a little later if I have time. I don’t think it’s all good or all bad.
I’m excited to read them!
Hey, I'm just wondering, why doesn't this post show up on your home page or archive?
Does it not? It should; I probably messed with some setting somewhere.
Triple checked just to make sure I'm not going crazy. On Safari on my iPhone, and also on my mobile Substack app, I'm not seeing it.
The funniest part is that rationalists even understand the problem. Scott once wrote about the Gray Tribe, people with a tendency to over-analyze everything, and he chose gray because they tend to be unsexy and uncool, both literally and metaphorically.
The original post was more than a little mean-spirited, but it did have some good points. Critiquing rationalists for being fat might be cruel, but given the costs of being unfit, and the degree to which those costs are borne by society as a whole, basic physical fitness should be a fairly nonnegotiable starting point for a supposedly rational individual. It also helped me to coalesce my discomfort with Rationality and the related Effective Altruism.
First, there’s a tendency to apply the tenets of Rationality unequally, essentially rationalizing the position you want to be true: basic heuristics apply if they support your position, and not otherwise. It’s the same taint of hypocrisy that tars most Objectivists, or the tendency of many right-leaning influencers to glorify Stoicism and misquote Marcus Aurelius. A few posts from Scott Alexander come to mind, but perhaps the most relevant (because I have firsthand knowledge of several of the issues) is his post on prison and crime. Scott Alexander has done this a few times (again, I only notice when the topic crosses my work), but the analyst he presented as one of three viewpoints, David Roodman, was so intellectually dishonest I could hardly believe it. The phrase I use is “skull-fucking the data.” I apologize if that’s a little cruder than you prefer, but when you add a $50,000-per-year cost as the social penalty for keeping a FELON in PRISON in order to make the numbers match your preferred outcome, I do not and will not value your rationality.
Second, and related, is the staggering arrogance that comes when someone has convinced themselves they are above the rest of the rabble. Ozy Brennan posted about his time on a murder trial and commented that he considered breaking his very clear instructions not to do his own research into the crime, believing those instructions were clearly meant for less intelligent people who would do less effective research. Given that his idea of research appeared to be googling his favorite journalist, and given his level of surprise over how stupid the average homicide really is (my jurisdiction was one victim short of a mass shooting because someone got sprayed with a water hose), that arrogance is thoroughly misplaced.
Third…the tendency for navel-gazing, for searching for the best theoretical scenario instead of the best possible outcome, puts people like me off. One of the sayings in my profession is “sometimes you need to eat a shit sandwich.” I can’t sit there pontificating forever while avoiding making a decision.
I particularly enjoyed the bit where he couldn't work out how progress studies is any different from history.
I agree that there’s got to be some trolling going on. It’s difficult to tell how much is genuine ignorance, though.
Trying to find the actual replies section…guess I’ll post the note.
So there are a couple of separate questions here, which don’t necessarily have the same answer.
Does rationalism work as theory, i.e. as a method for finding truth?
I’m probably not sophisticated enough philosophically (or, quite likely, intelligent enough) to assess this. I believe philosophers have poked holes; I am not really qualified to judge. Utilitarianism, for instance, has been debated pro and con by lots of philosophers. A lot of the AI and shrimp welfare stuff I am simply too ignorant or uncaring to comment on; I have no idea whether AI is really going to destroy the world, and frankly don’t care about shrimp or animals, because I’m a bad and selfish person. I’ll punt on this one.
Does rationalism work as practice, i.e. does it do what it sets out to do?
Among the most prominent practitioners, Yudkowsky’s not that impressive, Scott Alexander was a good psychiatrist and a great writer, and Mowshowitz was a successful trader who now has 4 kids - so evidently it’s not a slam dunk, but it’s not incompatible with success either. And they have certainly put some interesting ideas out there.
But his critique does have a point—rationalists do tend to deemphasize stuff like exercise and have problems dating on average. This seeps into…
Should I become a rationalist?
There is a real sense in which, when it comes to prophets, “by their fruits shall ye know them”. I think the results are again mixed here…
I think if you take it with a grain of salt, it’s a nice way to find right-of-center nerds (a group I fall into) to hang out with. If you don’t like right-of-center people or nerds, I don’t think it’s all that useful. Given that you’ve said you’re a healthy weight and have a girlfriend, I suspect you have found the correct level of engagement with this stuff.
That said, he’s right: not everyone wants to be an asexual econ blogger, and there have been people like the Zizians who have gone really off the rails. I think you need some degree of skepticism and detachment from rationalism for optimal results. (Of course, many other ideologies are this way too.)
The honest-to-God truth is that most people are better off working out and not thinking too deeply about any of this. But if you are the sort of overthinky person who enjoys it, it’s probably not a bad way to meet other people like you, as long as you don’t sell all your stuff because the AI is coming. It’s indoor fun for intellectuals, as a right-wing author once wrote. And as such it’s better than Marxism or intersectionality (unless you’re trying to get tenure) or fascism, but Ghostwind’s point that maybe one needs more than indoor fun is definitely also worth considering.
I agree with nearly everything here; my only pushback is on Yud’s level of success. From basically being a nobody with no degree, he has, by words and persuasion alone, gotten to a point where literally the most powerful people in the world have heard his message; probably a third of all AI researchers are AI-risk-pilled thanks to Yud, at least half have heard a lot of his arguments, etc.
Like, in terms of successfully getting the message out about an existential risk he believes is important, I think he has literally done as well as anyone can.
He has gotten the message out, but I think AI progress has now moved on in spite of him. Rationalists will say that he “pioneered the field of AI safety,” but I think they also kind of pioneered modern AI progress in general - I wonder how many non-safety-minded people were turned on to the field of AI by the likes of Bostrom and Yudkowsky.
Have you read Death with Dignity? (https://www.lesswrong.com/posts/j9Q8bRmwCgXRYAgcJ/miri-announces-new-death-with-dignity-strategy). I think that was the one essay that changed me from cautiously dismissing Yudkowsky to being generally skeptical of like, all of his claims.
Yeah, that post’s a mess - vibe-wise, prose-wise, and communication-wise. Still, it’s about what you’d expect from somebody with a >90% p(doom) in the era of “AI 2027,” right?
I’m of the mind that you can’t really separate the good and the bad of a person’s impact though - even if he’s gone off the rails, and is in some mordant “despair but take those 1-in-a-million coin flips anyways!” sort of mental space and not producing good work anymore, he still had a big impact in terms of being one of the founding Rat lights and in terms of being one of the most well known AI safety pioneers. To your point, maybe he was net negative, because he persuaded enough capabilities people to go in, too - I don’t think any of us can ever know, but from what was knowable and predictable from his standpoint, he did his best to get the word out, and did a good job of it.
Just like Musk - sure, he’s obviously out of his mind now and just flailing aimlessly and destroying things left and right while yes-men cheer him on, but he still made space flight 40x cheaper, and still made electric cars cool and shifted electric vehicle research and adoption to a higher / faster curve, and whatever else he’s done. There’s good and bad in everybody’s legacy.
This is an excellent point. If you view his life’s purpose as raising awareness of AI risk, he has been far more successful than one would expect from a degreeless autodidact.
Thanks for this! That article was really annoying me haha.
Autistic levels of missing the point.
The Rationalists have been behind the curve on recognizing the overt fascism of Trumpism because many of their main players were traumatized by “wokes.” I’m not saying that wokes don’t deserve some of their reputation, but surely you can concede that sometimes your movement has exactly the reputation it deserves.
Ghostwind leans right, further than most rationalists, so that’s not his point.
The leftist argument that rationalism is insufficiently antifascist is another one - one that even Scott Alexander took up in a (paywalled) post. FWIW, I think you are correct, though some may have actually preferred it.
I’d counter that the toxicity of your movement toward young men is its own problem and was largely responsible for this (though the rationalists will slap another five layers of argument on top) - but that’s another story.
People are responsible for their own bad decisions. Blaming leftists for hurting the fee-fees of adult young men is deeply pathetic.
I’m not saying, mind you, that the left’s toxicity toward young men isn’t a problem and doesn’t need work, but don’t you dare say it’s “largely responsible” for the fascism. The damned are damned by their own hand.