booked a call!
Will do. Thanks!
Is there a way “regular” people can “help”? I’m a serial entrepreneur in my late 30s. I went through 80000 hours and they told me they would not coach me as my profile was not interesting. This was back in 2018 though.
If Eliezer is pretty much convinced we’re doomed, what is he up to?
You are correct Willa! I am probably the Pareto best in a couple of things. I have a pretty good life all things considered. This post is my attempt to take it further, and your perspective is appreciated.
I tried going to EA groups in person and felt uncomfortable, if only because everyone was half my age or less. Good thing the internet fixes this problem, hence me writing this post.
Will join the discord servers and send you a pm! Will check out Guild of the Rose.
Opened a blog as well and will be trying to write, which, from what I’ve read a gazillion times, is the best way to improve your thinking.
Thank you for your message!
Sent you a pm!
Hello!
Been reading lesswrong for years but never posted: I feel like my cognitive capacities are nowhere near the average on this forum.
I would love to exchange ideas and try to improve my rationality with less “advanced” people, wondering if anyone would have recommendations.
Been thinking that something like the changemyview subreddit might be a good start?
Thanks
Thank you for taking the time to reply. I had to read your comment multiple times, still not sure if I got what you wanted to say. What I got from it:
a) Ideology is not the most efficient method to find out what the world is
b) Ideology is not the most efficient method to find out what the world ought to be
Correct?
You ask if biased solutions are a good or a bad thing. I thought rationality generally identifies biases as bad things; is this correct?
We should hence strive to live and act as ideology-free as possible. Correct?
I have a very rich, smart developer friend who knows a lot of influential people in SV. First employee of a unicorn, he retired after a very successful IPO and now just looks for interesting startups to invest in. He had never heard of lesswrong when I mentioned it and is not familiar with AI research.
If anyone can point me to a good way to present AGI safety to him, so that he might become interested in investing his resources in the field, that would be helpful.