In the comments of this post, I mentioned once knowing a guy who believed that there is no god right now, but someday there will be, and he’d definitely try to get the job.
The reasoning kind of makes sense, in a sensibly egocentric kind of way. ASI is scary, therefore we have to upgrade ourselves if we want to stand a chance. But just like a purely artificial ASI, the first ASI-level human/cyborg would also want to make sure that no other one ever comes along, and unlimited power usually allows that. So the first one would be the only one, and he (or she) would have infinite power over the rest of the planet (and probably the solar system). That is basically the definition of a god, but with a human base.
Then, if you don’t trust anyone to do the job (power corrupts, and absolute power corrupts absolutely), the logical conclusion is that you’d have to do it yourself.
So pretty much anyone who understands the implications would probably try to get the job themselves if there was ever a chance to do so. I think only someone who doesn’t truly understand the risk could ever “not care” about it. And maybe nihilists. Those guys are weird.
But presidential elections would be completely chaotic if every single citizen was a candidate and voted for themselves. It simply wouldn’t work.
So out of curiosity, I was wondering: if you had to choose someone other than yourself for the job, knowing full well that the person you choose will instantly get it and there is no undo button, who would you give it to?
Before you choose your spouse, keep in mind that upgrading to unlimited power in every way our laws of physics allow, and probably in a few ways we thought were impossible but actually aren’t, is a big upgrade. Imagine a housefly suddenly getting upgraded to a human in their prime years (with enough human knowledge to be a fully functioning member of society, except with no passport). Do you think that human would care about other houseflies for very long? Yes, they would remember their fly days, but the amount of knowledge a fly can hold is so small compared to ours that after a few hours of human life they would barely remember anything fly-related. They suddenly got a million times more data in human memories than they ever had as a fly, and all of their fly priorities and worries are completely irrelevant compared to their new life.

That’s what would happen to that person, but with us as the flies, and an upgrade infinitely stronger than the one described here. You can’t even count on them staying nice to you, because by the end of the day (and probably sooner) the memory of you will be as insignificant as your memory of question 3 (b) on that math test you took in the second month of third grade. You could probably still answer the question, but no matter how hard you tried you wouldn’t truly remember it. The best you could do is write it down somewhere that day and look at that reminder every day afterward, and even then you wouldn’t truly remember it, just remember the reminder: you’d know it happened but have absolutely no emotional attachment to it.
No matter how much you loved each other, you are now one atom of one grain of sand on an infinite beach that they have at their disposal. Expecting your love to continue might be cute, but that’s just not realistic. I think I’d rather keep that spouse by my side.
My answer is weirder than that. If I had the option to not choose anyone, I would probably take it. But if I absolutely had to choose someone, either because the alternative is letting a computer get this power or simply because it’s a hypothetical question so you can force people to do stuff, then I would actually choose the person I think is the biggest problem for the human race right now.
It might sound very counterintuitive to give absolute, corrupting power to the worst person I can possibly find. But I think it makes sense when you think about it. For all the reasons explained above, choosing someone who loves you will completely annihilate that love, and in the end you will simply have lost a loved one. Having access to infinite data and computing power eradicates every now-irrelevant memory and emotion from their human life, because of the infinite gap in size between those experiences.
This sucks when it comes to a loved one. But it can be great when it comes to an asshole. We have absolutely no way to predict how such a being would act, since we don’t have the necessary data. But we do know that the data we have is a drop of water in a million oceans, and won’t matter at all in the big picture. This basically means that no matter who we choose, after a very short time everything we knew about them (everything human-life related) won’t matter anymore; the only thing that will matter is their new god-experiences, which we have absolutely no control over. So it is very likely that no matter who we choose, what decides their actions will be the same, and therefore the result will be the same. A delay of one second in activation would probably have more impact than the original personality, because of the space-time stuff that changed in that second.
So if the original identity doesn’t matter, and we completely lose this member of our species either way, I think we might as well get rid of the worst person we can find. Obviously, finding that worst person is a whole other question, and would spark long debates. But in this hypothetical situation, I get the choice, so my subjective definition of good is the only thing that matters.
PS: I said in the beginning that anyone who understands the risk would want the job themselves. But many people who understand the risks, and also understand the “no longer remembering your loved ones” part I’ve just explained, would probably now think twice about taking it. That first reasoning only applied to the surface-level framing of “someone will hold the Earth in the palm of their hand; do you want anyone else to do that?”. When you think deeper, it isn’t as clear-cut.