If you’re trying to persuade smart programmers who are somewhat wary of sci-fi stuff, and you think nanotech is likely to play a major role in AGI strategy, but you think it isn’t strictly necessary for the current argument you’re making, then my default advice would be:
Be friendly and patient; get curious about the other person’s perspective, and ask questions to try to understand where they’re coming from; and put effort into showing your work and providing indicators that you’re a reasonable sort of person.
Wear your weird beliefs on your sleeve; be open about them, and if you want to acknowledge that they sound weird, feel free to do so. At least mention nanotech, even if you choose not to focus on it because it isn’t strictly necessary for the argument at hand, carries a larger inferential gap, etc.