"Sometimes morality conflicts with other values and it needn’t always win." This seems nuts to me. What is ethics if it's not how-you-choose-between-values or how-you-make-choices? You seem to be saying something like "well, I think I'll make a moral decision now, but maybe I'll make an aesthetic decision later, and a hedonistic decision after that, and so forth... Each has its place." But then what do you call the process or system by which you decide which space each decision-making process has?
I think everyone has a decision system like this; we just don't usually say it out loud.
Even classic deontologists need to weigh different options against each other, and sometimes the classically moral deontological choice loses out because it's too energy-intensive.
So we cut people moral slack. Utilitarianism tends to be much more ambitious than deontology: deontology assumes that general rules should be followed, while utilitarianism (and, to an extent, effective altruism) sets the goalposts for potential goodness much higher, so that's where the moral slack stands out.
It's wrong to give money to the opera house but right to buy tickets to the Philadelphia Eagles game. I'm having trouble detecting a principle here.
"A six-figure earner does some good by volunteering at a soup kitchen"
A six-figure earner does a lot of good working. That's the reason he/she is paid a six-figure salary.
It is the non-earner living on welfare who should be looking for some good to do ... for a change.
The question there is very different from the question that EA's or utilitarians try to answer.
I think the question you're talking about here is one of general prestige and accomplishment, rather than a specific instance of "what is the best thing to do here?"
I was not talking at all about prestige and accomplishment. My point is that the best EA that six-figure earners can do, from a utilitarian perspective, is working some more. There are very few altruistic activities (if any) in which they can add more value to society.
On the other hand, any EA that non-earners living on welfare could do will be great from a utilitarian perspective, since their contribution to society is already negative.
This being the case, I find it extremely weird that the author uses the "six-figure earner working in a soup kitchen" example. Working in a soup kitchen is a waste of valuable time for a six-figure earner, but it is a great contribution for a non-earner.
Good reply; sorry for not responding sooner. I'm tired right now, so I can't respond at length.
On the "soup kitchen" example being a great contribution for a non-earner: yeah, I agree, since they are already unemployed/on welfare. I'm on welfare in Sweden and try to regularly help out that way. I don't have the energy for a full EA-style job, so besides minor advocacy I just try to help out wherever I can in the community. I have a history of messing up big responsibilities, so that works better for me.
You do the things that benefit others most. Period.
Doing things that benefit others less is a worse idea, no matter why.
>> My point is that the best EA that six figure earners can do from an utilitarian perspective is working some more.
> No, because that isn’t altruism; it’s enlightened self-interest.
Did you read your own comment? When did utilitarianism become a synonym for altruism?
Furthermore, why is it more altruistic to do a small amount of good while incurring a penalty than to do a larger amount of good while benefiting from your own actions?
Nice write-up, Chris. I do note, however, that you didn't address the opening issue from the quote. The person turning meal prices into people he is letting die isn't focused on quality but on quantity: how much he is obligated to give versus enjoying himself. That seems to be an entirely open question, not solved by maximizing the quality of what you do give.
Hi Chris, I think there's an unfortunate ambiguity in ethical theory between this "narrow" conception of morality (as just one kind of value amongst many) vs the broader notion of *how one ought to live*, all things considered.
The latter question is clearly more important. So if you do think of utilitarianism as only a correct account of *narrowly* moral normativity, how much this matters will depend significantly on the further question of how much weight the narrowly-moral receives in the correct account of all-things-considered practical normativity.
What's your theory of what we ought to do in the broader sense?
Hi Richard - Agreed on the unfortunate ambiguity. Although I can’t say that I have a worked-out theory, I suspect that the narrowly-moral should get more weight than most of us give it. (I think this becomes pretty clear when we reflect on the opportunity cost of the non-narrowly-moral stuff we indulge in—do we *really* think our expensive new entertainment system should get more weight all things considered than the life that money could save?)
Agreed that the impartial good warrants much more attention than almost any of us actually give it. Though note that this in turn limits how much mileage you can get out of the "morality isn't everything" line.
On quality versus quantity of giving, a useful analogy from some EA book or other (sorry, I forgot which) is to investing your money for your own profit: you don’t have to invest all your money, but the money you do invest should be in the best investment vehicles possible.
On utilitarianism competing with other values, it’s hard not to interpret “watching the Eagles game” as giving yourself hedons instead of giving potentially more hedons to others. The worry is that the utilitarian can (does?) convert literally everything you might do into a hedonic calculation like this, and describing some activities as promoting “non-moral” values just obfuscates this.
Maybe living a balanced and harmonious life in accord with the virtues of humor, justice, charity, art, science, and love *is* morality, and maximizing good for others is just part of that hard-to-achieve balance... The framing that "morality must be balanced against other things" strikes me as an abuse of language.
> Postscript: Am I really a utilitarian?
I agree with David Gross - you can't really be a utilitarian with this perspective. You're trying to preserve the label "utilitarian" by redefining "morality" to be "whatever utilitarianism says is good". But that is an illegitimate way to argue that utilitarianism is moral. Morality, as understood by everyone else who speaks English, is the system that tells you which actions are better than which other actions. If you believe that it's better to occasionally buy sports tickets than to donate all your money to the Maximum Impact Fund, _and_ you believe that utilitarianism says the opposite, then you are saying that utilitarianism is not a correct system of morality, that the moral prescriptions of utilitarianism are wrong.
Morality cannot conflict with other values - morality defines what your values are. A value that conflicts with your morality is a value that you don't have.
See also Scott Alexander's "Infinite Debt": https://slatestarcodex.com/2014/05/10/infinite-debt/
Knowing myself, I have come to favor giving to people in need in my community, NYC. I want to be able to see and monitor the effect of my giving, and I will give more if this criterion is satisfied. That breaks the rules of effective altruism, I know. Although, in combating food insecurity, one can get a 3:1 or better bang for your buck by supporting food pantries, which typically get a lot of food donated or at cost.
I had a related thought when reading this post. If I'm allowed to decide how much I give, but when I give I should optimize it, is that really so different from giving to a "suboptimal" charity that makes me feel better (than the optimal one)? I'm, in a sense, just changing the relative payout to myself and others. Giving locally would be an example, either because you can see the results, or because making your community better improves your own life as well as others.
I agree. Disagree with Chris on this:
“Donating $1,000 from your altruism budget to an opera house instead of to malaria relief is wrong. As in the lifeguard case, the cost to you of either form of help is the same, so you have no excuse not to allocate your help to where it’s needed most.”
The cost to the giver is not exactly the same if you consider non-monetary value. Because giving to the opera might make you feel better than giving to malaria relief. So the net cost to you is actually different.
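The arithmetic behind this point can be sketched as a toy calculation (the dollar values assigned to "feeling better" are made-up assumptions, purely for illustration):

```python
# Toy sketch: if "cost" includes the non-monetary warm glow the giver
# gets back, two $1,000 donations need not cost the giver the same amount.
# All dollar figures below are hypothetical.

def net_cost(donation: float, personal_benefit: float) -> float:
    """Out-of-pocket amount minus the non-monetary value the giver receives."""
    return donation - personal_benefit

opera = net_cost(1000, 300)    # suppose the opera gift feels worth $300 to you
malaria = net_cost(1000, 50)   # suppose the malaria gift feels worth $50 to you
print(opera, malaria)          # prints: 700 950
```

Same $1,000 out of pocket, but on this accounting the opera donation is the cheaper act of "help," which is exactly why the "cost is the same" premise is doing more work than it appears to.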
As an EA person myself, I think it's fine as long as people are clear about what their goals with charities are. I just want to encourage effectiveness, and if people are interested in effectiveness but haven't considered it, I'm happy to explain and advocate for it.
There's a ton of moral uncertainty in the world, so I don't have any right to coerce or force people. So I go with the principle of respect and the marketplace of ideas.
Morality is a philosophical metasystem for solving problems: specifically, the problem of what things we want for society, the world, and humanity. And some people think that what we should want or aspire to is completely different from what other people think, without anyone really realizing it. In other words: people have different values and goals.
From the sound of your statement, the problem you want morality to solve is how individuals should act socially in a predictable way, and with that goal, cutting people moral slack or giving them wiggle room is absurd, as it diminishes the potential of the goal itself.
The goal of most utilitarians and some EAs is to create more impact/utility. So as long as something creates more utility, that's good. If people sacrificed more for socially maximal utility, that would be even better; but it's like a company earning money: while a 4% net profit is very good and 3.8% is clearly less good, 3.8% is still good.