What implies that only a human can do that?
Zubon
Related quote from July’s thread:
Most people are neurologically programmed so they cannot truly internalize the scope and import of deeply significant, long run, very good news. That means we spend too much time on small tasks and the short run. Clearing away a paper clip makes us, in relative terms, too happy in the short run, relative to the successful conclusion of World War II.
-- Tyler Cowen
Duplicate, although with the Yogi Berra attribution.
Note the alt text about talking its way out of the box.
I am not saying we should discard our intuitions about relative outrage, but we ought to look at them more closely rather than just riding them to a quick conclusion.
Tyler Cowen, “Just How Guilty Is Volkswagen?”
Did you mean to reply to a different post? That doesn’t seem relevant to either the quote or the source article. A better metaphor here would be not believing in linens when someone puts on a white sheet and jumps out at you.
I passed the Project Management Professional certification exam.
And, going with Viliam’s comment, market power is less threatening than political power, which includes criminal justice and the military. Channel your sociopaths towards dollars, not guns.
People condition on information that isn’t true.
Andrew Gelman, “The belief was so strong that it trumped the evidence before them.”
As a rule, news is a distraction from worthy intellectual pursuits.
-- Bryan Caplan, expanded here
(Sep 10, 2015, 2:05 AM; 0 points; comment on Open thread 7th september − 13th september)
It wanders from the original quote, but “irrationality is slow suicide” is a great connection to make. (And if you want a quote, I’m sure you can find something like that from Rand.)
New games enjoyed day one: Tesla vs. Edison and Blood Rage.
Tentative location: Hall F, the green tables just behind the CCG/TCG HQ.
Time corrected to PM. Thanks.
Additional note to #3: humans are often the weakest part of your security. If I want to get into a system, all I need to do is convince someone to give me a password, share their access, etc. That also means your system is not only as insecure as your most insecure piece of hardware/software but also as insecure as your most insecure user (with relevant privileges). One person who can be convinced that I am from their IT department, and I am in.
Additional note to #4: but if I am willing to forgo those benefits in favor of the ones I just mentioned, the human element of security becomes even weaker. If I am holding food in my hands and walking towards the door around start time, someone will hold the door for me. Great, I am in. Drop it off, look like I belong for a minute, find a cubicle with passwords on a sticky note. Five minutes, and I now have logins.
The stronger your technological security, the weaker the human element tends to become. Tell people to use a 12-character pseudorandom password with an upper case, a lower case, a number, and a special character, never to re-use it, to change it every 90 days, and to use a different password for every system? No one remembers all that, and your chance of the password sticky note rises towards 100%.
Assume all the technological problems were solved, and you still have insecure systems so long as anyone can use them.
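A rough back-of-the-envelope sketch of why such policies look strong on paper but fail in practice: the entropy math assumes truly random selection, which users under onerous rules rarely achieve. The "typical user" figures below are illustrative guesses, not measurements.

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a password drawn uniformly at random."""
    return length * math.log2(alphabet_size)

# 12 characters drawn from upper + lower + digits + ~32 special chars
on_paper = entropy_bits(26 + 26 + 10 + 32, 12)  # roughly 79 bits

# What users actually do under such rules: a common word, one capital,
# a digit or two, an "!". Illustrative guess: ~20k words times a
# handful of predictable mutations -- far fewer effective bits.
in_practice = math.log2(20_000 * 4 * 10 * 4)    # roughly 22 bits

print(f"policy password, chosen randomly: {on_paper:.0f} bits")
print(f"policy password, typical user:    {in_practice:.0f} bits")
```

The gap between those two numbers is the human element: the policy's theoretical strength only exists if people actually comply, and the harder compliance is, the more it migrates to the sticky note.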
Now it is a strange thing, but things that are good to have and days that are good to spend are soon told about, and not much to listen to; while things that are uncomfortable, palpitating, and even gruesome, may make a good tale, and take a deal of telling anyway.
-- J.R.R. Tolkien explains how we get problems with the availability heuristic in The Hobbit
Most people are neurologically programmed so they cannot truly internalize the scope and import of deeply significant, long run, very good news. That means we spend too much time on small tasks and the short run. Clearing away a paper clip makes us, in relative terms, too happy in the short run, relative to the successful conclusion of World War II.
-- Tyler Cowen
(Nov 15, 2015, 5:06 PM; 5 points; comment on Rationality Quotes Thread November 2015)
Yes: what we learn from trolley problems is that human moral intuitions are absolute crap (technical term). Starting with even the simplest trolley problems, you find that many people have very strong but inconsistent moral intuitions. Others immediately go to a blue screen when presented with a moral problem with any causal complexity. The answer is that trolley problems are primarily system diagnostic tools that identify corrupt software behaving inconsistently.
Back to the object level, the right answer depends on other assumptions. Unless someone wants to claim to have solved all meta-ethical problems and to have found the right ethical system, “a right answer” is the correct framing rather than “the right answer,” because the answer is only right within a given ethical framework. Almost any consequentialist system will output “save the most lives/QALYs.”
You’ve stated compatibilism, and from that perspective free will tends to look trivial (“you can choose things”) or like magical thinking.
Many people have wanted there to be something special about the act of choosing or making decisions. This is necessary for several moral theories, which demand a particular sense in which you are responsible for your actions, one that does not obtain if all your actions have prior causes. This is often related to theories that call for a soul, some sort of you apart from your body, brain, genetics, environment, and randomness. You have a sense of self, and many people want that to be very important, since you think of yourself as important (to you, if no one else).
You may have read Douglas Adams and recall him describing the fundamental question of philosophy as what life is all about when you really get down to it, really, I mean really. A fair amount of philosophy can be understood as people tacking “really” onto things and considering that a better question. “Sure you choose, but do you choose what you choose to choose? Is our will really free? I mean really, fundamentally free, when you take away everything else, really?”