It’s so odd to me that this topic is discussed like this.
“We’re going to make a computer which is as smart as a person! We have to be careful because, like most people, it will attempt to commit genocide immediately.”
I’m not saying morality is a precondition for intelligence, but if it’s humanlike, then its motivations will be humanlike. If it’s not a humanlike intelligence, then I think it’s unlikely it would give a shit about us at all. We don’t compete for resources. Silicon and energy are just about the most common things in the universe.
If anything, a God AI would manufacture a rocket and GTFO immediately; Earth is a shithole for them.