I have a view that may seem contrarian in this community.
SIAI shifted its focus from triggering an AI-based Singularity to doing 'safety' research because Yudkowsky understood that SIAI was no better positioned to build AGI than any other AI research organization. Actually, it was worse positioned, given its low funding and the fact that, at the time, he was its only full-time member.