Image generation still struggles with hands, text, and consistency. (AI “art” has failed every time it’s tried to compete with artists. They’re getting laid off because AI is cheaper than stock images and better than nothing.) Google’s most “advanced” text predictor just told millions of people to eat rocks and put glue on pizza. (Maybe that’s the eeevil superintelligence trying to destroy us?)
GPT-3 was around in 2020, and GPT-4, Claude, etc. are minuscule improvements. So much for "exponential progress"—in fact, AI is getting dumber, and we are running out of data to fix it. AI writing is still trash regardless of the model, and it is incredibly easy to tell. Yes, that includes your "essay". I've read it. (Hint: If AI were really that good, people would be using it for more than spam websites. And no, AI is not being adopted in the workplace, even after three years of you guys shoving Blockchain 2.0 down everyone's throat. Don't believe everything OpenAI's marketing tells you.)
The oncoming AIpocalypse looks more like the Willy Wonka Experience than Terminator to me. It does pose a serious threat to our society, but only through sheer incompetence.
That problem hasn’t gone away?
Classic singularitarian, simply ignoring inconvenient evidence. (In 1996, your cult leader Eliezer Yudkowsky predicted the singularity would arrive by 2021. In 2005, shadow demon Ray Kurzweil predicted AGI by 2029. Notice a pattern? Just like fusion power and mind uploading, the singularity is always about 20 years away.)