I’m not sure what “no rationality” would mean. Evolutionarily relevant kinds of rationality can still be expected, like a preference for sexually fertile mates and a fear of spiders, snakes, and heights; and if we’re still talking about something at all similar to Homo sapiens, language and cultural learning as well, which themselves require some amount of rationality to use.
I wonder if you might be imagining rationality in essentialist terms, as a single attribute you could universally switch off, but in reality there is no such off switch that is compatible with having decision-making agents.