I come at it from a rather different perspective. If Homo sapiens sapiens is 200,000 years old, then it is only in the last 0.1% of human history, a veritable eye-blink, that we have even considered the possibility that all human beings might be human.
The concept of a human right is so utterly foreign to humans as we are born that I find it a bloody miracle we can learn the concept and then actually extend the idea to people who share neither language, nor appearance, nor location, nor religion with us.
If somehow we could make something that could follow an example, is there anything in creation that you would prefer it to follow over the example of present-day man? Yes, we are crafted of crooked timber, but in the realm of my experience, I think we’d be incredibly fortunate if such a creature were capable of the modicum of sympathy that we hold for other human beings.
However, the truth is that if we were to build some form of life, it would be a miracle beyond compare if it were capable of any empathy whatsoever. Not through any oversight of ours, but because the concept of empathy is so complicated that creating it will likely be forever beyond us.
Luckily, I think there’s no chance of humans being smart enough to create a self-aware entity, so the problem is moot. For better or worse, it’ll simply be a machine doing whatever it was programmed to do. AI isn’t anything special. It’s just another program that can do just as much damage as we choose to allow it to do.
The threat is not AI; it’s that we are choosing to trust our safety to tools as imperfect as the humans who created them. The prospect of a frayed wire causing nuclear Armageddon is far more apropos than that of a buggy AI.