In order to make a rationalist extremely aggravated, you can tell them that you don’t think that belief structures should be internally logically consistent.
There are ways to argue for that, too. Both the aggravated rationalist and your friend hold inconsistent belief systems; as you say, the difference is just that the aggravated rationalist would like that to change, while your friend is fine with it.
The point is this: you can value keeping the “is”-state, and not want to trade the as-is for some optimized but partly different, differently weighted set of preferences.
If I were asked, “You wanna take this pill and become Super-Woomba, with a newly modified and now consistent preference system?”, I’d be inclined to decline. Such self-modifications, while in some cases desirable, should not change me beyond recognition; otherwise I, as I exist now, would for all practical purposes stop existing.