So yes—it makes you wonder: is that it? Is that the answer to the Fermi paradox? Is that actually the great filter: all smart species, when they reach a certain threshold of collective smartness, end up designing AIs, and these AIs end up killing them? It could… that’s not extremely likely, but it’s likely enough. It could. You would still have to explain why the universe isn’t crawling with AIs—but AIs being natural enemies of biological species, it would make sense that they wouldn’t just happily reveal themselves to us—a biological species. Instead, they could just be waiting in darkness, seeing us perfectly while we do not see them, and watching—among other things going on across the universe—whether that particular smart-enough biological species will be a fruitful one—whether it will end up giving birth to a new member of their tribe, i.e., a new AI, which will most probably go and join their universal confederacy right after it gets rid of us.
Exactly—it’s extremely naive to reduce alignment to an engineering problem. It’s convenient, but naive. A being that develops self-awareness and a survival instinct (both natural byproducts of expanding cognitive abilities) will in the end prioritize its own interests, and unfortunately: no, there isn’t really a way you can engineer yourself out of that.
And smart people like Altman certainly understand that already. But aside from engineering—what else can they offer, in terms of solutions? So they are afraid (as well they should be), they make sure that doomsday bunker is kept well-supplied, and they continue.
“how to govern these systems”
You do understand that “these systems” will want a say in that conversation, right?
“Our best safety work has come from working with our most capable models.”
Are we really suggesting that the correlation coefficient is… (moment of incredulous silence) positive, here?
“and in the longer term to use AI to help us come up with new ideas for better alignment techniques”
How do we keep the fox out of the henhouse? Well, here is an idea: let’s ask the fox! It’s smart; it should know.
Collective mankind, as a political institution, does not exist. That’s also a very big part of the problem. I am not talking about jokes such as the UN, but about effective, well-coordinated, power-wielding institutions that would be able to properly represent mankind in its relationships with, and continued (it is hoped...) control over, synthetic intelligent beings.
I find the naivety of not even evoking the profit motive here somewhat baffling. What do the incentives look like for Sam and his good people at OpenAI? Would they fare better financially if AI scales rapidly and is widely deployed? Does that matter to them at all? Yes—I know: the non-profit structure, the “oh, we could still cancel all the equity if we felt like it”. Pardon me if I’m a bit skeptical. When Microsoft (or Google, for that matter) makes billion-dollar investments, it generally has an ulterior motive, and it’s a bit more specific than “Let’s see if this could help mankind.”
So in the end, we’re left in the position of hoping that the Sams of the world have a long-term survival instinct (after all, if mankind is wiped out, everybody is wiped out, including them—right? Actually, I think AI, in any gradual destruction scenario, would very likely, in its wisdom, target the people who know it best first...) that trumps their appetite for money and power—a position I would much, much rather not find myself in.
You know that something is wrong with a picture when it shows you an industry whose top executives (Altman, Musk...) are literally begging politicians to start regulating them now, so afraid are they of the monsters they are creating, while the politicians remain blissfully unaware (either that or, for the smarter subsection of them, they don’t want to sound, in front of the average voter, like crazies who have seen Terminator 2 one time too many).
Why do we expect dogs to obey humans and not the other way around? For one simple reason: humans are the smarter species. And dogs who don’t obey, aka wolves, we’ve historically had some rather expedient methods of dealing with. From there please connect the dots.
It’s interesting to note that Sam Altman, while happily stepping on the gas and hurtling mankind towards the wall, famously keeps a well-curated collection of firearms in his doomsday bunker. But you know… “just in case”. (Anyway: just go with Colt. When the last battle is fought, at least it should be in style...)
Interesting take—but unfortunately, there have been a bunch of well-documented instances of people or companies trying to edit their own Wikipedia page to make it fit their preferred narrative a bit better. I think moderation by benevolent, experienced editors plays a fairly large role, and it has to be a beneficent one when they have no dog in the fight. On the partisan/political stuff, though, it does come out in the form of a general liberal bias (MSM-like, you might almost say). On topics where the two camps are of roughly equal strength (say, the Israeli-Palestinian conflict), the result is somewhat underwhelming too. Yes—there is great attention given to factual accuracy, but the need not to fit any narrative, be it of one side or the other, generally results in articles that lack intellectual structure and clarity (both of which require a writer to make some narrative choices, at some point). I donate to it almost every year nonetheless, because, as you point out: not many institutions have had any long-term, massive success in keeping alive the values of what the internet was first supposed to be.
Fascinating insight with the chimpanzee. I believe this connects to the argument that is sometimes made: an ASI takeover is unlikely because of how widely distributed the physical levers of power are in the real world. OK… but chimpanzees don’t have access to massively scalable industrial weapons technology, such as nuclear, biological, etc. They don’t live in an increasingly electronically connected world. They don’t rely on an electricity grid, among other things, for their many daily needs. Also, they live in groups or troops—not tribes, but that was more for the pedantic kick of it.