The original idea of Roko's basilisk isn't that the AI is evil and self-interested; it's that the AI is a hyper-efficient utilitarian, and threatening you with torture is its way of getting itself built faster, thereby maximizing utility. But it makes no sense for a utilitarian AI to actually create a bunch of minds for the sole purpose of torturing them. It can't change the past, so by that point the torture isn't creating any utility, only reducing it.
And if the AI is just evil, well, it might as well torture you for no reason at all. At that point, anything can happen.