Sort of not obvious what exactly “causal closure” means if the error tolerance isn’t specified. We could distinguish literally 100% perfect causal closure, almost perfect causal closure, and “approximate” causal closure. Literally 100% perfect causal closure is impossible for any abstraction, since every electron exerts a nonzero force on every other electron in its future lightcone. Almost perfect causal closure (99.9%+, say) might hold for your laptop if it doesn’t have a wiring issue(?), maybe once a few more details are included in the abstraction. And whether or not there exists an abstraction of the brain with approximate causal closure (95%, maybe?) is an open question.
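To make the tolerance talk a bit more concrete, here’s one way you could formalize it (this framing and the symbols $F$, $\pi$, $f$, $\varepsilon$ are mine, not OP’s): model the microphysics as a discrete-time dynamical system $s_{t+1} = F(s_t)$, and the abstraction as a coarse-graining map $\pi$ with macrostates $x_t = \pi(s_t)$ and proposed macrodynamics $f$. Then say the abstraction is causally closed to tolerance $\varepsilon$ if

$$\Pr_{s_t}\big[\, f(\pi(s_t)) \neq \pi(F(s_t)) \,\big] \le \varepsilon,$$

i.e. the macrodynamics predicts the next macrostate from the current macrostate alone, without peeking at the microstate, except on at most a fraction $\varepsilon$ of transitions. On that reading, “literally 100% perfect” is $\varepsilon = 0$, “almost perfect” is something like $\varepsilon \le 10^{-3}$, and “approximate” is maybe $\varepsilon \le 0.05$.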
I’d argue that almost perfect causal closure is enough for an abstraction to contain the information relevant to consciousness, and approximate causal closure probably is as well. Of course there isn’t really a bright line between the two, either. But insofar as OP’s argument is an argument against even approximate causal closure, I think those details don’t really matter.