This has been an option for decades; a fully capable LLM does not meaningfully lower the threshold for it. It's already too easy.
This has been an option since the 1950s. Any national medical system is capable of doing this; Project Coast could be reproduced by nearly any nation-state.
I'm not saying it isn't a problem; I'm just saying that LLMs don't make it worse.
I have yet to find a commercial LLM that I can't make tell me how to build a working improvised explosive (I can grade the LLMs' performance because I've worked with the USG on the issue and don't need an LLM to make evil).
I think tacit knowledge is severely underrated in discussions of AGI and ASI.
In HPMOR, there is a scene near the end of the book where our hero wins the day by using magic that is roughly equivalent to flying a drone along an extremely complicated path, involving lots of loops, through places he cannot directly observe.
Our hero has never once in the book practiced doing this.
In theory, if I possess a drone and have a flight path the drone is capable of flying, I can pick up the controller for the first time and make it happen.
In practice, I will fail spectacularly. A lot of writing in this space assumes that with sufficient ‘thinking power’, success on the first attempt is assured.