May 14 · edited May 14 · Liked by Richard Pettigrew

I found the part about Chris and Kiran interesting. One of the central theses of common-sense ethics is that you should place higher weight on the welfare of people you have certain relationships with, and in proportion to the closeness of those relationships.

A major unanswered question for common-sense ethics is the *dynamic* network-formation question of how we should decide whether to form relationships, given that doing so will change the weights in our moral utility function. So, for example, if the act-of-will in the story (i.e. "committing") would require Chris to place greater weight on Kiran's welfare (or at least give him a reason to do so), how does that affect whether or not he should perform the act-of-will in question?

author

This is fascinating! Do you have reading recommendations on this? Chang does talk a little about the moral implications. Her general line is that we can create these sorts of will-given reasons only when the world-given reasons don't give us reason not to; or, perhaps better, you can create them whenever you like, but they can't override or even be weighed against world-given reasons, and so they only break ties when the world-given reasons don't say anything one way or the other. That doesn't answer the question you're posing, though.

May 17 · edited May 17

[Admittedly, I only read the first half of your article.]

I don't have references about ethics, but in dynamic decision theory, there is the idea of Strotz representation, where the agent is sophisticated in understanding that they are dynamically inconsistent and plans accordingly (see refs below).
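To illustrate the kind of sophistication meant here, a minimal sketch (my own, not from the Strotz paper) of a present-biased agent in the quasi-hyperbolic (beta-delta) setting: from today's vantage point the agent plans to wait for a larger, later reward, but its future self defects to the smaller, sooner one. A sophisticated agent in the Strotz sense predicts this reversal and plans around it. All the reward numbers and parameters below are invented for illustration.

```python
# Quasi-hyperbolic discounting: present bias BETA, long-run factor DELTA.
# These parameter values are illustrative assumptions.
BETA, DELTA = 0.5, 1.0

def value(reward, delay):
    """Discounted value, from the current self's view, of a reward
    received `delay` periods from now."""
    return reward if delay == 0 else BETA * (DELTA ** delay) * reward

# Option A: small reward at t=1; Option B: large reward at t=2.
small, large = 10, 15

# From t=0, waiting for B looks best (0.5*15 = 7.5 beats 0.5*10 = 5)...
plan_at_0 = max(("A", value(small, 1)), ("B", value(large, 2)),
                key=lambda x: x[1])

# ...but at t=1 the future self compares A now (10) with B in one
# period (7.5) and defects to A.
choice_at_1 = max(("A", value(small, 0)), ("B", value(large, 1)),
                  key=lambda x: x[1])

# A naive agent commits to the t=0 plan and is later surprised; a
# sophisticated (Strotz-style) agent predicts the t=1 reversal and
# evaluates today's options given what the future self will actually do.
print(plan_at_0[0], choice_at_1[0])  # plan says "B", future self picks "A"
```

The professor example below has the same shape: the current self declines the meeting precisely because it predicts what its future, more-obligated self would then do.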

In real life, I think this kind of sophistication is quite routine in some contexts. For example, a professor may be reluctant to allow a short meeting with a new graduate student because that would start to establish a "closer" connection between them, which could lead to expectations and obligations they don't want to have. Thus the prof tries to keep the student at arm's length by responding only by email or simply ignoring them, even though a single meeting wouldn't have taken long.

Some people would distinguish between (1) "obligations" (often due to "reasonable expectations") and (2) simply putting higher weight on people's welfare. I'm not sure I buy that distinction, but maybe that's worth thinking about.

Strotz (1955–56), "Myopia and Inconsistency in Dynamic Utility Maximization": https://econweb.ucsd.edu/~jandreon/Econ264/papers/Strotz%20RES%201956.pdf

Gul and Pesendorfer (2005), "The Revealed Preference Theory of Changing Tastes," give an axiom called "No Compromise" that is equivalent to a Strotz representation.

https://www.jstor.org/stable/3700658

Finally, I'm skeptical when people say something is "just a tie-breaker" or is lexically inferior. Perfect ties virtually never happen, so I think the second-letter clause would virtually never come into play.


I haven't read the whole thing, but the kidney example (which resonates with previous decisions of mine, though not kidney-related) seems to me to be backwards. Chris has been deferring a decision about the relationship, and Kiran's need for a kidney forces him to make the choice, one way or the other.

author

I think that's one way to read it, and I suspect I could have been more careful about telling the story in a way that didn't naturally give that impression. But I think we can imagine cases in which Chris has long known about Kiran's need--perhaps they were acquaintances before they recently started dating. It's other factors that push Chris to the commitment, but once he's made it, Kiran's need for a kidney becomes a reason for Chris to donate.


Whichever way you run it, the key point is that the choices aren't independent. Supposing (for example) that Chris is a consequentialist, it would make sense to rank both of the matched outcomes (Commit, Donate) and (Don't Commit, Don't Donate) above the mixed outcomes (Don't Commit, Donate) and (Commit, Don't Donate). Formally, this is identical to believing (or not) two propositions p and q that are not independent.
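A minimal sketch of that ranking, with utility numbers invented purely for illustration: because the two choices interact, the consequentialist ranks joint outcomes rather than scoring each choice separately.

```python
# Hypothetical utilities over joint (commit, donate) outcomes.
# The point is the ordering, not the particular numbers: both
# "matched" outcomes beat both mixed ones.
utility = {
    ("commit", "donate"):        3,
    ("no commit", "no donate"):  2,
    ("no commit", "donate"):     1,
    ("commit", "no donate"):     0,
}

# Rank outcomes best-to-worst by utility.
ranking = sorted(utility, key=utility.get, reverse=True)
print(ranking)
```

Note that no assignment of separate, additive scores to "commit" and "donate" alone can reproduce this ordering, which is exactly the sense in which the choices fail to be independent.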
