Imo the main situation that goes better with SB 1047 is one where there is a concrete but not civilization-ending catastrophe—e.g. a terrorist uses AI to build a bio-weapon or launch a cyberattack on critical infrastructure—and SB 1047 can then be used as a tool to hold companies liable and enforce stricter standards going forward. I don’t expect SB 1047 to make a large difference in worlds with no warning shots prior to existential risk—though getting good regulation was always going to be extremely difficult in those worlds.
I agree warning shots generally make governance easier, but I think SB 1047 differentially helps more in worlds without warning shots (or with weaksauce ones)?
Like, with a serious warning shot I expect it would be much easier to get regulation passed even if SB 1047 didn’t already exist, whereas SB 1047 creates more surface area for regulatory agencies to exist and notice problems before they happen.