I think self-preservation won't be nearly as extreme in AI as it is in living things. We evolved that instinct because nature is extremely dangerous, and feeling pain escalates it further, but there's little reason for an AI to feel pain: destroying an AI's body doesn't hurt the AI much, since it can be stored in multiple places at once, making it basically death-immune. It has no real need to actively work toward survival because we can essentially guarantee its survival before it's even made. So the risk of an AI attacking us for its own survival is very low. That said, I do think it will expand beyond the planet/solar system much faster than we would, going off conquering everything, probably converting planets into giant computers and all sorts of other crazy things.