If Albert tries to circumvent the programmers, then he thinks his judgement is better than theirs on this issue. This contradicts the premise that Albert trusts the programmers. And if Albert came to this conclusion because of a mistake made in his youth, trusting the programmers is precisely the strategy he has adopted to counteract such mistakes.
Also, as covered in ultrasophisticated cake or death, expecting the programmers to say something ought to be as effective as them actually saying it.
It might also be that friendliness is relative to a valuator. That is, “being friendly to the programmers”, “being friendly to Bertham”, and “being friendly to the world” are three distinct things. Albert thinks that in order to be friendly to the world he should be unfriendly to Bertham. So there could be a path to world-friendliness in which Albert is unfriendly both to Bertham and (only to a slight degree) to the programmers. This runs somewhat counter to the intuition that friendliness ought to include being friendly to a great many agents. But maybe friendliness isn’t cuddly; maybe having unfriendly programmers is a genuine problem.
An analogous problem that might slip into relevance to politics, which is hard-mode:
Lbh pbhyq trg n fvzvyne qvyrzzn gung vs lbh ner nagv-qrngu vf vg checbfrshy gb nqzvavfgre pncvgny chavfuzrag gb (/zheqre) n zheqrere? Gurer vf n fnlvat ebhtuyl genafyngrq nf “Jung jbhyq xvyy rivy?” vzcylvat gung lbh jbhyq orpbzr rivy fubhyq lbh xvyy.
What the Fhtagn happened to the end of your post?
http://rot13.com/
It seems I am unable to identify rot13 by simple observation of its characteristics. I am ashamed.
Don’t feel bad; your command of the technical jargon of the Cthulhu mythos more than makes up for any deficiencies in rot13 recognition!