From what I have learned so far, the LessWrong community does not appear to be a big fan of the reptilian brain, and wants to overcome biases.
Where did you learn that? It sounds to me like you are projecting preconceived notions you picked up elsewhere onto our community.
Our community isn’t opposed to emotions. CFAR doesn’t run classes centered on overcoming specific biases but classes on various techniques, and those techniques acknowledge that humans have emotions that matter.
Your post reminds me of a talk at the first Quantified Self conference, which I attended. Many attempts at optimizing according to simple feedback processes ignore the fact that second-order cybernetics matters.