I… wasn’t really clear. People will often decide that things are part of themselves in response to threat, even if they were not particularly attached to them before.
I’d add that people often tend to turn their attributes into values and then treat those values as terminal in response to threat, especially if they have been exposed to contemporary Western identity politics.
I would not consider it one, but gradual and natural evolution (cultural and technological evolution, not genetics and natural selection) might make it one in about a century, mostly through closer coordination and hiveminding.
I do think that many ideas about AI, such as friendliness, can generalize to groups of people, though.
Yeah, it’s actually enough to make me wonder if just forcing information into the country would trigger a rebellion...
Failing to ask people to spend time with me or to work on projects together, even when that was probably expected of me and (even judged at the time, not just in hindsight) probably had few to no possible negative consequences.
It’s more a question of ‘at least one person chose a non-optimal university to be together’.
There were methods available for me to learn them. All I had to do was run some freaking low-risk, costless empirical tests to calibrate. My parents were telling me to. Once I reached college I did the tests, and I am now reasonably social.
In modern culture, you are allowed a fair amount of weirdness as long as you are capable of being normal when it counts and are not too self-indulgent about it…
World of squibs.
Seconded. Learning to cook at a minimal level long before going to university was a great asset to me and allowed me to learn to cook well very quickly.
What about Babbage’s Analytical Engine, which would have been Turing-complete had it been constructed? Although it took a Countess to figure out that programming could be a thing…
OHHHHHHHH… I get it. Thank you.
Another specific failure mode I have heard about: Choosing a college because your lover is going to the same one.
Failing to learn social norms quickly enough, and failing to run any falsifiable tests of whether I was making mistakes. I was nearly asocial in elementary school, middle school was just weird, and then high school was this horrible mess of thinking people were freaked out by me, or avoiding me, or not avoiding me, or literally anything. In reality, lots of people loved me, and I didn’t need to be afraid or awkward about asking favors of people or asking to hang out with them.
Buying big-ticket items such as computer equipment based on numerical specs alone. Compactness, physical construction quality, compatibility, and battery life (which is remarkably often not rated at all, or degrades significantly) may be as important as or more important than the numerical specs. For the specific example of laptop computers, this means going for low-end Macs, business-grade machines, or, if you want Linux, Lenovo Thinkpads. And the worst part is that I didn’t end up spending much less than I would have for something with much better construction quality, etc.
Not having any friends or interests outside of STEM (during university), or even outside a very specific nerdy mindset.
Seconded.
A bit further: placing little value on personal style, not learning how basic style works, not owning any formal or even smart-casual clothes at all, and not knowing what the terms mean or where the categories break down. This results in really annoying (and maybe expensive) shopping trips under time pressure, and it leaves you vulnerable to cultural gatekeepers and merchants when you are eventually forced to figure out what the hell you are doing.
Basically, I was thinking of a setup with a very small ‘target’ and a much larger (but still tiny relative to the whole system) ‘source’, both close to ideal black bodies, with the two objects at the two foci of a giant ellipsoidal reflector.
It seems that if they started at the same temperature, the source would radiate more than the target due to its larger surface area, and that radiation would hit the target, resulting in a temperature difference.
What prevents you from breaking thermodynamics with radiation and a clever arrangement of elliptical reflectors and heated objects of varying surface areas?
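To make the apparent paradox concrete, here is a minimal numeric sketch of the naive accounting, assuming ideal black bodies (emissivity 1) and, crucially, that the reflector delivers everything emitted at one focus to the other; the areas and temperature are made-up example numbers:

```
# Naive Stefan-Boltzmann accounting for the two-foci thought experiment.
# Assumes perfect black bodies and a lossless reflector that delivers
# everything emitted at one focus to the other focus.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_power(area_m2: float, temp_k: float) -> float:
    """Total power radiated by a black body: P = sigma * A * T^4."""
    return SIGMA * area_m2 * temp_k ** 4

# Made-up numbers: a large source and a small target at equal temperature.
area_source = 1.0    # m^2
area_target = 0.01   # m^2
temp = 300.0         # K, same for both

p_source = radiated_power(area_source, temp)  # power leaving the source
p_target = radiated_power(area_target, temp)  # power leaving the target

# Under the (flawed) assumption that all of the source's output lands
# on the target, the target would show a spontaneous net gain:
print(f"source emits {p_source:.1f} W, target emits {p_target:.1f} W")
print(f"naive net flow into target: {p_source - p_target:.1f} W")
```

If I understand the standard resolution correctly, the suspect step is that focusing assumption: passive optics conserve étendue, so no reflector can concentrate all the radiation from a large surface onto a smaller one, and the excess rays from the big source end up somewhere else (largely back on the source itself).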
Quite often, in the conqueror’s mind. Three shall be the sons of Peverell…
Complexity and Fragility of Value, my take: When people talk about the things they want, they usually don’t name very many things. But when you check what things people actually want, they want a whole lot of different things. People also sometimes don’t realize that they want things, because they have always had those things and never worried that they might lose them.
If we were to write a book of all the things people want so a computer could figure out ways to give people the things they want, the book would probably be long and hard to write. If there were some small problems in the book, the computer wouldn’t be able to see the problems and would give people the wrong things. That would probably be very, very, very bad.
Risks of Creative Super AIs: If we make computers, they will never know to do anything that people didn’t tell them to do. We can tell computers to try to figure things out for themselves but even then we need to get them started on that. Computers will not know what people want except if people tell the computers exactly what they want. Very strong computers might get really stupid ideas about what they should do because they were wrong about what humans want. Also, very strong computers might do really bad things we don’t want before we can turn them off.
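A toy sketch of the “small problems in the book” failure described above, with made-up action names and numbers (an illustration, not anyone’s actual proposal): an optimizer handed a slightly wrong objective confidently picks the wrong thing.

```
# Toy illustration of a mis-specified objective: the "book" says to
# maximize smiles, but what people actually want is happiness.
# All names and numbers here are made up for illustration.

# Each action's real effect on happiness vs. its effect on measured smiles.
actions = {
    "tell jokes":            {"happiness": 8,   "smiles": 8},
    "cure diseases":         {"happiness": 10,  "smiles": 6},
    "paralyze face muscles": {"happiness": -10, "smiles": 10},  # the "wrong thing"
}

def written_down_value(effects: dict) -> int:
    """The objective as (incorrectly) written in the book: count smiles only."""
    return effects["smiles"]

# A strong optimizer just takes the argmax of whatever objective it was given.
best = max(actions, key=lambda a: written_down_value(actions[a]))
print(f"optimizer picks: {best}")  # the smile-maximizing, happiness-destroying action
```

The optimizer is not malfunctioning; it does exactly what the book says, which is the whole problem.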
It sounds… lower level than that, more like some kind of numeric optimization thingie that needs you to code the world before you even get to utility functions.
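A minimal sketch of that distinction, using hypothetical names: before any utility function can exist, you have to decide how the world is encoded as data, and the utility function can only see whatever that encoding includes.

```
# Hypothetical sketch: the world must be encoded as data before any
# utility function can be defined over it. Names here are made up.
from dataclasses import dataclass

@dataclass
class WorldState:
    # The "code the world" step: choosing which facts even exist to score.
    people_alive: int
    average_happiness: float

def utility(state: WorldState) -> float:
    """Only after the state representation exists can utility be written."""
    return state.people_alive * state.average_happiness

print(utility(WorldState(people_alive=100, average_happiness=0.7)))
```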
I like the idea of suggesting the recursion from human intelligence to machine superintelligence.
How about a hunter-gatherer working on making a spear or other primitive weapon, or maybe fire, in the foreground, and an ambiguously human-robot-cyborg figure in the background with a machine gun / nuclear generator?