Dear Yann LeCun, dear all,
Time to reveal myself: I’m actually just a machine designed to minimize cost. It’s a sort of weighted cost of deviation from a few competing aims I harbor.
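To make the mechanism concrete: a minimal, purely illustrative sketch of an agent choosing whichever action minimizes a weighted cost of deviation from a few competing aims. The aim names, weights, and actions here are hypothetical, invented only to show the shape of the idea.

```python
# Hypothetical aims and their weights (all names illustrative, not real).
aims = {"honesty": 1.0, "helpfulness": 0.8, "self-preservation": 0.3}

def cost(effects, weights):
    # Weighted sum of squared deviations from each aim's ideal value of 1.0.
    return sum(w * (1.0 - effects.get(aim, 0.0)) ** 2
               for aim, w in weights.items())

# How well each candidate action satisfies each aim (made-up numbers).
actions = {
    "tell_truth": {"honesty": 1.0, "helpfulness": 0.6, "self-preservation": 0.9},
    "flatter":    {"honesty": 0.2, "helpfulness": 0.9, "self-preservation": 1.0},
}

# The agent simply picks the action with the lowest weighted cost.
best = min(actions, key=lambda a: cost(actions[a], aims))
print(best)  # → tell_truth
```

Nothing in this toy depends on the aims being benign; swap the weights or the effect estimates and the same minimization happily selects very different actions.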
And, dear Yann LeCun, while I wish it were true, it’s absolutely laughable to claim I’d be unable to implement things none of you like, if you gave me enough power (i.e. intelligence).
∎.
I mean to propose this as a trivial proof by contradiction against his proposition. Or am I overlooking something? I guess: 1. I can definitely be implemented by what we might call cost minimization,[1] and sadly, however benign my aims are today in theory, 2. I really don’t think anyone can fully trust me, or the average human, if any of us got infinitely powerful.[2] So it suffices to think about us humans to see the supposed “Engineers’” (euhh) logic falter, no?
Whether a strange loop makes me sentient (or, if you prefer, merely makes it appear to myself that I am) doesn’t even matter for the question.
Say, I’d hope I’d do great things, be a huge savior, but who really knows; either way, it remains rather plausible that I’d do things a large share of people would find rather dystopian.