There are other reasons to be wary of consciousness- and identity-altering technology.
I think that under a physical/computational theory of consciousness (i.e., there is no soul or qualia with provable physical effects from the perspective of another observer), the problem might be better thought of as a question of value/policy rather than a question of fact. If teleportation or anything else really affected qualia or some other kind of subjective awareness not purely dependent on observable physical facts, whatever you call it, you wouldn't be able to tell, or even think of or be aware of, the difference, since thinking and being reflectively aware are themselves computational and physical processes!

However, we humans evolved without reliable copying mechanisms, so our instincts care about preservation of the self: it's the obvious way to protect our evolutionary success (and we can be quite willing to risk personal oblivion for evolutionary gains in ways we have been optimized for). This is just part of our survival policy, and it isn't easy, or even safe, to change just because you believe in physicalism. For one thing, as others have said, ethics and social theory become difficult, because our ethical concepts (agency, punishment, caring about suffering) all evolved in relation to a sense of self.

It's possible that if teleportation/copying tech becomes widely useful, humans will have to adapt to a different set of instincts about self, ethics and more (edit: or maybe abandon the concepts of self and experience altogether as an illusion and prefer a computation-based definition of agency or whatever), because those who can't adapt will be selected against. But in the present world, people's sense of value and ethics (and maybe even psychological health) depends on an existing sense of self, and I don't see a good way, or even a practical reason, to transition to a different theory of self that allows copying if doing so may incur unpredictable mental and social costs.
See also the discussions about meditation lowering the sense of ego and subjective suffering, which can have serious side effects (e.g., on motivation and adherence to social norms). I don't know what it subjectively feels like, but if meditation were purely changing subjective qualia without doing anything to the physical brain and its computation, there would be no observable effects, good or bad! The problem is that subjective experience and the sense of identity are not independent of the other aspects of our lives.
What's the endgame of technological or intellectual progress? Not just for humanity as we know it, but for all possible beings/civilizations in this universe, at least before it runs out of usable matter/energy. Would they invariably self-modify beyond their equivalent of humanness? Settle into some physical/cultural stable state? Keep developing better tech to compete among themselves, if nothing else? Reach an end of technology, or even of intelligence, beyond which further advancement is no longer beneficial for survival? Spread as far as possible, or concentrate their resources? Accept the limited fate of the universe and live it to the fullest, or try to change it? If they could change the laws of the universe, how would they?