How much would AI developers be willing to sacrifice? They may be sufficiently concerned to act on this risk, as explained, but motivated and well-funded organizations (or governments) would have little difficulty attempting to influence, persuade, or convert a fraction of AI developers to think otherwise.
I wonder whether global climate change could serve as an analogy here, highlighting what some climate scientists are willing to publish due to funding or other incentives beyond scientific inquiry.