Sorry, I should have been clearer. Let’s suppose that a copy of you spent however long it takes to write an honest textbook with the solution to alignment (let’s call it N Yudkowsky-years), and an evil copy of you spent N Yudkowsky-years writing a deceptive textbook trying to make you believe in a false solution to alignment, and you’re given one but not told which. How long would it take you to reach 90% confidence about which you’d been given? (You’re free to get a team together to run a bunch of experiments and implementations, I’m just asking that you measure the total work in units of years-of-work-done-by-people-as-competent-as-Yudkowsky. And I should specify some safety threshold too—like, in the process of reaching 90% confidence, incurring less than 10% chance of running an experiment which kills you.)
Depends what the evil clones are trying to do.
Get me to adopt a solution wrong in a particular direction, like a design that hands the universe over to them? I can maybe figure out the first time through who’s out to get me, if it’s 200 Yudkowsky-years. If it’s 200,000 Yudkowsky-years I think I’m just screwed.
Get me to make any lethal mistake at all? I don’t think I can get to 90% confidence period, or at least, not without spending an amount of Yudkowsky-time equivalent to the untrustworthy source.