n+1 > n. Let n be the number of dollars the person owns. If it is true that "a poor person is more likely to base his self-worth on how many dollars he owns than a rich person is", then it stands that going from n to n+1 dollars is a larger increase in self-worth for the 'arbitrary' poor person than the same step from n to n+1 is for the 'arbitrary' rich person.
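A minimal sketch of that claim. The log utility function here is my own assumption purely for illustration (any concave function would do); the point is only that the same one-dollar step produces a bigger gain at small n than at large n:

```python
import math

def marginal_gain(n, utility=math.log):
    """Gain in 'self-worth' from one extra dollar under a
    hypothetical concave utility function (log is an assumption,
    not something the comment specifies)."""
    return utility(n + 1) - utility(n)

poor = marginal_gain(100)      # one more dollar on top of $100
rich = marginal_gain(100_000)  # one more dollar on top of $100,000
assert poor > rich             # the poor person's gain is larger
```

Under these assumed numbers, `poor` ≈ 0.00995 while `rich` ≈ 0.00001, so the marginal dollar matters far more to the poor person.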
The poor person and the rich person have different values of n, so their marginal rates are different. That says nothing about how much of their self-worth is a function of their total amount of money. This is sort of akin to how freshman calculus students confuse a function being small with it having a small derivative.
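To make the calculus analogy concrete, here is a function of my own choosing (not from the comment) whose value is tiny everywhere yet whose slope is enormous, showing that a function's size and its derivative's size are independent:

```python
import math

def f(x):
    # |f(x)| never exceeds 0.001, yet |f'(x)| reaches 1000,
    # since f'(x) = 0.001 * 1_000_000 * cos(1_000_000 * x).
    return 0.001 * math.sin(1_000_000 * x)

def numeric_derivative(g, x, h=1e-9):
    """Forward-difference estimate of g'(x)."""
    return (g(x + h) - g(x)) / h

print(abs(f(0.0)))                # tiny value (0.0 at x = 0)
print(numeric_derivative(f, 0.0)) # slope near 1000
```

So knowing the derivative (the marginal rate) tells you nothing by itself about the value (how much total self-worth rides on money), and vice versa.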
See my other comment to you with example numbers. We can debate how plausible those numbers are, but they demonstrate the principle at hand, and that is sufficient for my position here. (Again: I don't care one way or the other whether the interpretation is right; it need only be shown to be a valid interpretation of the question.)