I am a PhD student in computer science at the University of Waterloo, supervised by Professor Ming Li and advised by Professor Marcus Hutter.
My current research is related to applications of algorithmic probability to sequential decision theory (universal artificial intelligence). Recently I have been trying to start a dialogue between the computational cognitive science and UAI communities. Sometimes I build robots, professionally or otherwise. Another hobby (and a personal favorite of my posts here) is the Sherlockian abduction master list, which is a crowdsourced project seeking to make “Sherlock Holmes” style inference feasible by compiling observational cues. Give it a read and see if you can contribute!
See my personal website colewyeth.com for an overview of my interests and work.
I do ~two types of writing: academic publications and (lesswrong) posts. With the former, I try to be careful enough that I can stand by ~all (strong/central) claims in 10 years, usually by presenting a combination of theorems with rigorous proofs and only more conservative intuitive speculation. With the latter, I try to learn enough by writing that I have changed my mind by the time I'm finished, and though I usually include an "epistemic status" to suggest my (final) degree of confidence before posting, the ensuing discussion often changes my mind again.
I don’t think you got it.
What did you think about my objection to the Flynn example, or about the value of the rationalist community as something other than an autism support group? I feel like you sort of ignored my stronger points and then singled out the widget job interview response because it seems to miss the point, without engaging with my explanation of how it doesn't miss the point. Given the way you constructed the hypothetical, there was plenty of time to come up with an honest way to talk about how much he enjoyed widgets.
One of the things I value is people knowing and understanding the truth, which I find to be a beautiful thing. It's not because someone told me to be honest at some point; it's because I've done a lot of mathematics and read a lot of books and observed that the truth is beautiful.
I also wouldn’t shoot someone so I could tell someone else the truth. I don’t know where you got these numbers.
I suppose I'm not completely longtermist about my pursuit of truth, but I'm not completely longtermist about my other values either; sometimes the short term is easier to predict and get feedback from, etc.