I work on the events team at CEA, and I’m currently (lightly) exploring supporting a global AGI safety conference in 2024. It probably won’t be CEA-branded, or even EA-branded; I’m just keen to make it happen because we run a lot of conferences and it seems like we’d be able to handle the ops fairly well.
If you’re interested in helping or giving feedback, feel free to reach out to me at ollie@eaglobal.org :)
I think having something like an AI Safety Global would be very high impact for several reasons.
Redirecting people who are only interested in AI Safety from EAG/EAGx to the conference they actually want to go to. This would be better for them and for EAG/EAGx. I think AIS has a place at EAG, but it’s inefficient that lots of people go there basically only to talk to other people interested in AIS. That’s not a great experience either for them or for the people who are there to talk about all the other EA cause areas.
Creating any amount of additional common knowledge in the AI Safety sphere. AI Safety is becoming big and diverse enough that different people are using different words in different ways, and relying on different unspoken assumptions. It’s hard to make progress on top of the established consensus when there is no established consensus. I definitely don’t think (and don’t want) all AIS researchers will start agreeing on everything. But just some common knowledge of what other researchers are doing would help a lot. I think a yearly conference where each major research group gives an official presentation of what they are doing and their latest results would help a lot.
Networking.
I don’t think such a conference should double as a peer-reviewed publication venue, the way many ML and CS conferences do. But I’m not very attached to this opinion.
I think making it not CEA-branded is the right choice. I think it’s healthier for AIS to be its own thing, not a subcommunity of EA, even though there will always be overlap in community membership.
What’s your probability that you’ll make this happen?
I’m asking because if you don’t do this, I will try to convince someone else to do it. I’m not the right person to organise this myself. I’m good at smaller, less formal events; my style would not fit with what I think this conference should be. I think the EAG team would do a good job at this, but if you don’t do it, someone else should. The team behind the Human-aligned AI Summer School would also do a good job at this, for example.
I responded here instead of over email, since I think there is value in having this conversation in public. But feel free to email me if you prefer. linda.linsefors@gmail.com
Thanks, Linda!

I agree with your claims about why this event might be valuable. In fact, I think your third reason (networking) might be the biggest source of value.
I also agree AIS should be its own thing; that was part of the motivation here. It seems big enough now to have its own infrastructure (though I hope we’ll still have lots of AIS researchers attend EAG/EAGx events).
Probabilities:
75% the CEA events team supports an event with an AIS focus of at least 100 people before the end of 2024.
55% the CEA events team supports an event with an AIS focus of at least 500 people before the end of 2024.
Thanks for adding clarity! What does “support” mean, in this context? What are the key factors that prevent the probabilities from being >90%?
If the key bottleneck is someone to spearhead this as a full-time position and you’d willingly redirect existing capacity to advise/support them, I might be able to help find someone as well.
oops, sorry, I don’t check LW often!
I use “support” to allow for a variety of outcomes: we might run it, we might fund someone to run it, we might fund someone and advise them, etc.
The key factor is buy-in from important stakeholders (safety research groups, our funders, etc.), which is not yet confirmed.
This isn’t the key bottleneck, but thank you for this offer!