Yeah, I've read the Metaethics Sequence twice so far, but I'm still not really convinced by it. That doesn't mean I know of better metaethical theories than Eliezer's; I'm just confused and very uncertain, so I would like to hear Konkvistador's arguments.
I think that is where I first came upon the random walk challenge to allegedly "observed" moral progress. I do think I upgraded the argument even in that basic post; please tell me if you disagree.
Also, I think Eliezer was basically working to rescue the notion of moral progress, because that is what he sees as "adding back up to normality". I disagree: I think normality is the futility of preserving your values or their coherently extrapolated successors. Finding a way to make something like "moral progress" real, or even to preserve currently held values, would be a massive project comparable in difficulty and perhaps even importance to developing FAI (which is one potential solution to this problem). I find it telling that he doesn't seem to directly touch on the subject afterwards.
I would like to use this opportunity to remind you that you owe us a post about this :-)
ETA: Sorry, I should have read the great-grandparent first. Anyway, I'm eagerly awaiting your post!
Have you seen this post by Eliezer?
I’m not really convinced by it either.