[Question] What career advice do you give to software engineers?
I am a proponent of the idea that we have already achieved rudimentary AGIs with modern LLMs (as much of a hot take as this is), and even though the path to superintelligence will be difficult and will probably require a few more technical breakthroughs to make more effective use of available data, I don't think it will take us longer than a decade, or 15 years at most.
When I discuss this idea with some of my CS friends and co-workers, namely that AI will inevitably replace most software engineering jobs (picture a supercharged GitHub Copilot that can build entire websites and back-end services on command), most of them ask the obvious follow-up question: so what can I do about it? Sure, I can help with AI safety/alignment progress, but my job is going to disappear no matter what I do, and probably sooner than many other, more 'physically' demanding ones.
I am always left stumped by this. I simply don't know what to tell them, especially undergraduates who are still full of hope and totally didn't sign up for this dumpster fire. Should I tell them to just keep doing their thing and see what happens? Let fate take its course and hope for the best? That all sounds too happy-go-lucky for my taste.
I'd like to hear what you guys think about this: what do you answer when asked such questions?