49 Comments

I think there's a third option for dealing with the person in the audience: ignore her.


It is difficult to ignore a distraction in a public setting where attention is abundant, scattered, and variable. For one thing, collective 'ignoring' is more difficult to coordinate effectively.


I would even say you could politely make the point that, while gratuitous offence should generally be avoided, the higher ideals of truth and freedom of speech should not be sacrificed in the name of avoiding offence. None of this ought to be abrasive. I would doubt the good faith of any individual who found such an argument itself offensive; I would tend to interpret it as a learned behaviour for winning arguments without actually having to put forward any reasoning.


Um, does anyone find shrimp welfare "intellectually engrossing"? I feel very confident that almost all EAs in animal welfare aren't nerd-sniped into helping shrimp (or other farmed animals), but rather pursue it because it looks incredibly important/effective. I think intellectual curiosity does bias people to prioritise AI risk and anthropics on non-veridical grounds, but also, in practice, AI risk does happen to be very real and urgent. (Anthropics could be important too, but I would guess it's discussed somewhat more than it needs to be on the margin.)


That's my impression as well: 90% of the people I've met in EA think it's an important issue to investigate and work on, but they don't find it that interesting after the introduction to the issue.


The quote from Anna Karenina is a good analogy for the current state of the Effective Altruism movement... A reminder that in the pursuit of the common good, something higher must guide us, not just intellect.

I have always thought of EA as kind of a "cult", and the people who join the cult are smart but rudderless, aimless, looking to fill a god-sized hole in their heart... searching for something to grab on to... I don't know the EA community well enough to say this for sure, but I am guessing they are mostly atheists, right? Or maybe I'm wrong.


The vast majority are non-religious, yes. https://rethinkpriorities.org/publications/eas2019-community-demographics-characteristics#:~:text=Religious%20Affiliation,2018%20and%202017%20(80%25)%20.

Though even as a Christian, I think that Caplan is wrong to dismiss their cause areas. There's a Christian effective altruism group, and I'm sure they would agree.

Animal torture really does matter, and even if their claims of existential crisis are wrong, those things could still lead to mass casualty catastrophes, and there doesn't seem to be any eschatological reason for ignoring catastrophes.


Why does something higher than intellect have to guide us?

What things are 'higher' than intellect and why are they higher?

And yeah, your point about EAs using it as a source of meaning is probably correct. But isn't that a good thing? People getting their meaning from religion does not really improve the world, whereas people getting their meaning from helping others probably will. The latter is vastly better than the former


Intrinsically religious people have better lives across practically all metrics. So even if Christianity is false (I don't think it is), it improves the world by a lot.


What metrics are you referring to? I only know of happiness and amount donated. I haven't seen it, but I'd be willing to bet the religious have more friends, so I'll grant that.

But isn't it a big stretch to then say "it improves the world by a lot" ? Especially given religious wars and conflicts?

Also, I think an interesting question is whether the world would be even better off if we abandoned religion and replaced it with a meaning framework similar in spirit to effective altruism. It seems likely to me that all the benefits of religion come from the sense of meaning, duty and community it provides. If we can devise a non-religious social movement that gives meaning, fosters community and encourages improving the world as a duty, then that seems like a positive step. The current EA movement is far from perfect (neither are any religions in practice), but its ideals are in that direction. What do you think?


This "higher" is our instincts and intuitions, right? But the problem is, our intuition works quite well in small, comprehensible groups (better than rational and calculated rules anyway), but it works very badly in overwhelmingly large groups of humans (+ other animals?) or in situations too different from our evolutionary history (e.g. AI doom). That's the whole reason for a movement like EA.


The common good is an unqualifiedly noble objective.

If an individual lacks "heart," he probably must simply live with his condition, as if he were born short, or stupid, or blind: he must work around it as best he can.


> When a normal utilitarian concludes that mass murder would maximize social utility, he checks his work! He goes over his calculations with a fine-tooth comb, hoping to discover a way to implement beneficial policy changes without horrific atrocities.

I think this is overly optimistic. I've found cognitive biases abound in rationalist circles, and their powers to convince themselves are truly impressive.

That's not to say they're equivalent to Leninists, but there's a sometimes ironic extra-dangerousness about rationalists who are genteel and thoughtful mental contortionists. You don't suspect them. And they don't suspect themselves as much as they should.


To quote Scott Alexander, who insists EA centers on a simple question:

"Are you donating 10% of your income to the poorest people in the world? Why not?"

Think that 10% is the wrong number, and you should be helping people closer to home? Fine, then donate some amount of your time, money, something, to poor people in your home country, in some kind of systematic considered way ... . If you’re not doing this, your beef with effective altruism isn’t “the culture around (EA) ...”, your beef is whatever’s preventing you from doing that. You may additionally have an interesting intellectual point about the culture around (EA), much as you might have an interesting intellectual point about which Bible translations you’d prefer if you were Christian. But don’t mistake it for a real crux. - end of quote - https://astralcodexten.substack.com/p/effective-altruism-as-a-tower-of


It is not necessarily meant as a "critique". Maybe a defense. Maybe a motte-and-bailey? This time Bryan 'spent his time talking about EA', and he pokes friendly fun at ideas about Doom (by AI) and "shrimp welfare", and doubts their sincerity ("the common good and the immortality of the soul no closer to heart than those about a game of chess"). I quote ACX to illustrate what EA is principally about (the motte): "Help, in a less-dumb way." ("Open borders, education, having children, poverty, housing": all important aspects, often discussed.) And Scott is a lot of things, but not least as sincere as a human can be, so I feel Bryan's impression may well be spot-on with some EA guys, less with others. That said: Scott (Civ 4), the Zvi (Magic) and putanumonit are very serious about games, too. ;) Wish I could attend Bryan's role-play event!


The Tolstoy passage didn't even strike me as amounting to much of a critique. I was expecting something more, like Dickens on "telescopic philanthropy" or Stirner on replacing God with humanity or "the good".


The idea that "effective altruists" and by extension "rationalists" need to be sold on anything as a group is disturbing. It's like the crowd in Life of Brian shouting in unison that they will think for themselves. A group that took these labels seriously would not look like such a homogeneous blob. Instead it's clearly hero worship of a few important people in the movement who set the agenda.


When EAs converge on some group ideas, there are two competing hypotheses:

1) EA is a cult

2) Those ideas are right

I doubt it's all 1); more likely it's a decent mixture of both. "Cult" is a strong word, bringing to mind Scientology or Heaven's Gate; it's clear that EA is not like that.


That's tantamount to saying there's no way to converge on any meaningful goals for how to actually effectively help sentient beings, or that EA people are antisocial. The entire premise of EA is that there in fact exist more and less efficient ways to use resources to improve wellbeing or reduce suffering. If all they did was go lone wolf and refuse to accept any ideas from anyone else, it would either imply there never was any way to differentiate between ways to reach these goals, or that they were much more antisocial than most people.

That is, of course some ideas will eventually surface and stay there. If not, there wouldn't be much point in even trying.

By the way, I'm not saying it isn't possible your comment describes the contemporary EA community.


I just believe that if those groups are going to use these concepts as their names, they should highly value pluralism, but everything I've seen from them has been the same memetic ideas repeated with little deviation. I have nothing against groups forming for political/social aims, but they shouldn't take pluralistic concepts as names for their harshly collectivist groups if they don't want me to complain about them!


Getting rid of immigration restrictions is such an underrated EA cause area. I’ve spoken personally to many EAs who support open borders, but it somehow hasn’t turned into a big cause area. Maybe it’s too blatantly political for EA’s tastes


I have an issue with EA stressing whether an issue is neglected (at least historically). If an issue isn't neglected enough, it's not worth talking about, which sometimes leads to weird results.


I considered working on it myself, but several relatives just hate the idea, and I emotionally burned out on the whole thing and almost burned bridges.

I don't know how tractable it is, and politics-wise, building and energy reform seems more effective and tractable.


The thing is that the people who most strongly oppose open borders also oppose the welfare state, so I struggle to take this argument in good faith.


No one likes having less money, and just doing spreadsheets about QALYs isn't that fun, so you need to offer potential donors/boosters something to encourage them to participate.

This is just the EA version of the fancy banquets, the money spent on 5Ks for the cause, the gimmicks that make it seem like you've got a more direct connection to the person in need, and all the other trappings around more mainstream charities.

Sure, it's an unorthodox version of it, and they can't really admit that's what they are doing, but how else do you make it work? If you tried having the fancy banquet, it would interfere with the other draw: getting to feel smug about how you are donating more smartly.


Hmm, there's probably something to that actually, at least for me.

(I donate 10% of my income to effective EA charities, and have done so for about 6 years. It's been a mixture between animal rights, global health, and X-risk; currently it all goes to global health. I'm not a high-impact person, but I have nudged 5-10 people's careers and donations somewhat in an EA direction.)

I found out about EA 7 years ago, I think? I was very passionate about it for a while, and then I kind of knew everything on an intellectual level and burned out + got bored of the subject.

X-risk + AI made it interesting and novel again, and helped me continue to donate and investigate things.

I don't know if my pledge would have stopped without it, but it was definitely strengthened by it. Currently my altruistic interest swings between global health, X-risk and animals every 6 months.


I've probably misunderstood you, but it seems you're arguing that EAs need to *feel* the need to do good, as well as *figure out rationally* that they ought to do good.

But I'd argue that the fact that EAs discourage going with their heart and prioritise formal analysis is a *feature*, not a bug.

From a consequentialist POV, who cares how someone decides that helping the poor is something they should do?

Also, why do you find shrimp welfare bizarre? Sure, it's not something most people consider, but you think through your opinions. Do you have a critique in mind, or do you just find it strange?


BC seems to go with revealed preference and appearances in morality, and revealed preference shows that people don't care about animals, especially not shrimp, because we factory-farm billions of them every year.

I and some EAs argue that whether you personally care about something is separate from whether it is good or not, and also that our revealed preferences just show what any one individual prefers personally in life; revealed preference says nothing about whether something is moral or not.

My medium-confidence suspicion is that if factory farming were made 100% irrelevant by new tech, and if people went vegan, then BC would switch his opinion and go with "I don't think animals matter, but people show that they do".


Some years ago at Open Borders: The Case, I suggested a kind of first-approximation meta-ethics consisting of (a) universal altruism with (b) division of labor. In principle, value everyone's welfare equally and act accordingly. But in practice, information and incentive problems with long-distance charity are huge, so it's legit to keep most of your generosity close to home. It often won't take the form of formal charity but of service to our friends and families. It starts to look kind of like how ordinary people live already.

I think effective altruists should seriously consider whether most ordinary, decent people are more effective altruists than most of the self-styled "Effective Altruists." Maybe they do; I'm not super plugged into that crowd's convos. (Too busy serving my family!) But I'm sympathetic in theory.

https://openborders.info/blog/a-meta-ethics-to-keep-in-your-back-pocket/


EA-aligned person here:

I found point 1 not very convincing, as you are going with "it appears morally important and emotionally compelling to me" as the principle for judging whether something morally matters or not.

A key part of my worldview (and, I suspect, of many other EA people's) is that while our gut moral instincts are useful and reliable for private interactions, they are not made for large-scale or systemic interactions.

I find it important to be intellectually open and investigate new frontiers.

I, and I suspect many EAs, think that we don't know everything we could when it comes to morality, whether in deontological or utilitarian terms.

You seem to have the view that what matters morally is obvious in the majority of cases, and that appearances are the most important moral knowledge. I think appearances are useful, but they serve as a guidepost, not the final say.

Also, you put extremely low weight on hypotheticals and general worries that haven't happened yet, and also on animals and animal suffering.

I think most people put way too much weight on hypotheticals, but I think they shouldn't be dismissed every time.

Thus, your point on EAs does not resonate with me, and I suspect with many people there.

Additionally, going by Jonathan Haidt's six moral foundations, your moral profile is likely very different from most EAs'.

The moral foundations, as a recap:

Care/harm

Fairness/cheating

Loyalty/betrayal

Authority/subversion

Sanctity/degradation

Freedom/control

My intuition is that you score low on:

Care/harm, loyalty, authority, sanctity

And extremely high on:

Fairness/cheating, freedom.

I can't find any statistics on the average EA, but I would assume they score like a mixture between libertarian and progressive. So...

High: care

Medium: freedom, loyalty, fairness, authority

Low: sanctity

Personally, I score:

Extremely high on care and liberty

Low on everything else.

To convince me, or EAs generally, that we should be less intellectual and more guided by our moral instincts, you likely need to:

1. Show that intellectually considering things is outscored morally by gut consideration, in similar population groups and contexts.

2. Show that animals don't matter. Saying that revealed preferences and voluntary actions show that people don't care about animals is not proof in my eyes, as I am skeptical of our gut instincts and see us as a conformist species.

3. Show that what morally matters is the same as what is emotionally compelling.

4. Clarify why revealed preference shows morality. (I have read your "proof by hypocritical behaviour" and similar posts and haven't found them compelling.)

5. Explain why past "obvious moral failings" were obvious moral failings to people even in that time, and explain why people committed them anyway. And also why moral feelings change over time; SDB (social desirability bias) is not an answer for long-term trends, in my opinion.

The fact that the apparent majority of people during slavery times felt that slavery was fine and moral serves to me as very strong proof that self-interest makes revealed preference around morality extremely dubious.

These are my thoughts, at least.


I know people who are sometimes struck by strong, empathetic charity reflexes on clear, humanly comprehensible topics. And I know people, like maybe Konstantin's brother, who don't have very strong empathic reflexes, and who therefore wouldn't spend anything on charity had they not come to it through slow, rational thought. If they also reach the shrimp during these deep contemplations, it is a small cost (I hope shrimp are not highly conscious; otherwise the shrimp advocates are absolutely right). Honestly, I think the rational ways of charity are better overall than the emotional ways (which is the whole point of EA). Instinctive charity works well in small groups, rational charity in large groups, etc.


What is the moral code of altruism? The basic principle of altruism is that man has no right to exist for his own sake, that service to others is the only justification of his existence, and that self-sacrifice is his highest moral duty, virtue and value.

Do not confuse altruism with kindness, good will or respect for the rights of others. These are not primaries, but consequences, which, in fact, altruism makes impossible. The irreducible primary of altruism, the basic absolute, is self-sacrifice—which means: self-immolation, self-abnegation, self-denial, self-destruction—which means: the self as a standard of evil, the selfless as a standard of the good.

Do not hide behind such superficialities as whether you should or should not give a dime to a beggar. That is not the issue. The issue is whether you do or do not have the right to exist without giving him that dime. The issue is whether you must keep buying your life, dime by dime, from any beggar who might choose to approach you. The issue is whether the need of others is the first mortgage on your life and the moral purpose of your existence. The issue is whether man is to be regarded as a sacrificial animal. Any man of self-esteem will answer: “No.” Altruism says: “Yes.”

-Ayn Rand

The readiness to sacrifice one's personal work and, if necessary, even one's life for others shows its most highly developed form in the Aryan race. The greatness of the Aryan is not based on his intellectual powers; but rather on his willingness to devote all his faculties to the service of his community.

-Adolf Hitler

1920s and 1930s German culture, as Nazism was gaining power, was filled with calls for sacrifice from Christians, Marxists, intellectuals, and, most passionately, Nazis. Each group attacked the alleged selfishness of the other groups, exactly as Catholics, Protestants and other Christian groups attacked each other throughout Christian history. Hitler loudly, passionately, constantly, hysterically called for Germans to sacrifice themselves to nation, race, state and himself. German military policy was based more on sacrifice and recklessness than victory and bravery; thus the refusal to retreat from Stalingrad after victory became impossible. The Marxist Soviet Union had the same policy. Your disagreement with Hitler is merely over the extent of the sacrifice. Hitler's superficial disagreement with Christianity was merely a difference over the sacrifice of the strong versus the sacrifice of the meek. Both Hitler and Christianity explicitly denounce selfishness. Morality is a guide to life, not the sacrifice of life. There is no rational justification of sacrifice; thus the respect for subjectivism and mysticism as defenses of sacrifice. And the most important sacrifice for all advocates is the sacrifice of the mind. Intellectual pride is the biggest Christian sin.

America's moral uniqueness, hated by Left and Right, is its politics of life, liberty and the pursuit of happiness. Many Christians at that time correctly denounced this selfishness. Today's Christians are returning to that view. Or will you absurdly claim that crucifixion is the base of individual rights?


Am putting this on a coffee mug:

EA is a minor flaw compared to the Leninist practice of negligently doing massive evil in the vague hope of realizing a “greater good.”


Sorry Bryan, you're way too late to the conversation. Jay Rollins said it all before.

https://www.wonderlandrules.com/p/effective-altruism-is-a-cult
