One of the things I searched for in EA and didn’t find, but think should exist: an algorithm, or algorithms, for deciding how much to donate, treated as a personal negotiation.
There is Scott Alexander’s post about 10% as a Schelling point and a way to placate anxiety, and there is the Giving What You Can calculation. But neither engages with personal values.
I want an algorithm that is about introspection—about not crushing your altruistic and utilitarian parts, but not your other parts either; about finding what number is the right number for me, by my own Utility Function.
And I just… didn’t find those discussions.
In dath ilan, where people expect to be able to name a price for more or less everything, and have trained extensively to give the same answer to the questions ‘how much would you pay to get this extra,’ ‘how much additional payment would you forgo to get this extra,’ ‘how much would you pay to avoid losing this,’ and ‘how much additional payment would you demand if you were losing this,’ there are answers.
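The four-question exercise above can be sketched as a simple consistency check. This is a hypothetical sketch, not anything from dath ilan canon: the framing labels, the tolerance threshold, and the decision to average are all my own assumptions about how such an exercise might be operationalized.

```python
from statistics import mean

def elicit_price(answers, tolerance=0.25):
    """Check whether four framings of the same valuation agree.

    `answers` maps each framing to the number you introspected for it:
      'pay_to_gain'    - how much you would pay to get this extra
      'forgo_to_gain'  - how much payment you would forgo to get it
      'pay_to_keep'    - how much you would pay to avoid losing it
      'demand_to_lose' - how much payment you would demand to lose it

    If the spread is small relative to the mean, treat the mean as
    your price; otherwise report the disagreement so you can keep
    negotiating with yourself.
    """
    values = list(answers.values())
    m = mean(values)
    spread = (max(values) - min(values)) / m if m else 0.0
    if spread <= tolerance:
        return {"consistent": True, "price": m}
    return {"consistent": False, "spread": spread, "answers": answers}

# Example: four framings that nearly agree
result = elicit_price({
    "pay_to_gain": 95,
    "forgo_to_gain": 100,
    "pay_to_keep": 105,
    "demand_to_lose": 110,
})
```

A large spread between the “gain” and “lose” framings is the classic endowment-effect signature, and is exactly the kind of inconsistency the training is meant to iron out.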
What is the EA analog? How much would I be willing to pay if my parents would never learn about it? If I could press a button and pay 1% more in taxes that would go to top GiveWell charities, with all the second-order effects stripped away except the money, what number would I choose? What if negative numbers were allowed? What about the creation of a city with rules of its own, one that collects taxes for EA causes—how much would I accept then?
Where are the “how to figure out how much money you want to donate, in a Lawful way” exercises?
Or maybe they’re missing because far too many people prefer to have their thinking, logical part win the internal battle against their other, more egotistical parts?
Where are all the posts about “how to find out what you really care about, in a Lawful way”? The closest I’ve found are Internal Double Crux and the multi-agent models of mind in all their versions. But where are my numbers?