This type of Hollywood Rationality is explicitly listed on TV Tropes under “Straw Vulcan” (the failure to understand that achieving a goal may require locally backward steps). I haven’t encountered anyone who explicitly associates that failure with LW in particular, and anyone who has taken an introductory AI course should know better than to think a backward step is beyond computation, logic, Bayesian agents, etc.