Shortform #88 Retreat Debriefing & Staying in Touch | I will review some AI Safety literature
Today was a bifurcated day. I spent the morning and early afternoon dealing with the remnants of travelling (picking up my car from my local airport, then my baggage from an airport an hour away from home, because that's where my connecting flight had been rerouted last night after the initial one was unexpectedly cancelled), then finally made it to my apartment to shower & recombobulate.
I met with my Norfolk Rationalists co-organizer (shoutout Yitz!!! https://www.lesswrong.com/users/yitz) to debrief and share as much of the knowledge I gained from the organizers' retreat as possible. We had many lovely conversations and have concrete actions planned for growing & strengthening the Norfolk Rationalists community. More on that another time; there will be announcements.
I enjoyed reaching out & saying hi to other organizers to check how their travel went and what they're up to. There are many more people to go; if you were at the retreat and haven't heard from me yet, you should soon. I met and remember almost every organizer there (I had at least a 5-10 minute conversation with roughly 90-95% of attending organizers).
Several individuals much more knowledgeable about AI Alignment than I am recommended that I check out Coherent Extrapolated Volition, and explained why I need to familiarize myself with existing AI Alignment research a bit more before I go off and do an in-depth analysis of political values re: AI Alignment. Yitz and I will review the original article & associated content, then publish our review of the idea after determining how well it holds up regarding value determination. This and other AI Safety / Alignment work will take place at upcoming local meetup / coworking sessions in Norfolk, separate from the Norfolk Rationalists group (which is an exclusively social group). Looking forward to increasing my AI Alignment knowledge & helping however I can.
Back to my day job tomorrow; I'm decently looking forward to that in addition to everything else.