Jump Temp – AI Proliferation

The question comes up frequently: can an AI simply copy itself a billion times and make itself more powerful?

By necessity, there would have to be a system in place that keeps this from happening, or malicious AIs would simply take over everything. The mere fact that it could be done means someone, at some point, would build one that did.

It’s possible that AIs are actually constellations of programs that really do copy themselves all the time, and that’s simply how they function.

But how would one prevent an AI from copying itself onto billions of computers? Maybe they operate like cryptocurrencies, which can be transferred but not duplicated: the underlying data can always be copied, but a shared ledger recognizes only one valid holder at a time, so the copy confers nothing.
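As a rough illustration of that ledger idea, here is a minimal sketch in stdlib-only Python. Every name here (the Ledger class, mint, transfer) is hypothetical, invented for this example; the point is that ownership is reassigned, never duplicated.

```python
import uuid

class Ledger:
    """Toy single-owner ledger: a token can be transferred, never duplicated."""

    def __init__(self):
        self._owners = {}  # token_id -> current holder

    def mint(self, owner: str) -> str:
        """Create a new token and assign it to its first holder."""
        token_id = str(uuid.uuid4())
        self._owners[token_id] = owner
        return token_id

    def transfer(self, token_id: str, sender: str, recipient: str) -> None:
        """Move the token; ownership is reassigned, not copied."""
        if self._owners.get(token_id) != sender:
            raise PermissionError("sender does not hold this token")
        self._owners[token_id] = recipient

    def holder(self, token_id: str) -> str:
        return self._owners[token_id]

ledger = Ledger()
ai_token = ledger.mint("alice")            # one canonical instance
ledger.transfer(ai_token, "alice", "bob")  # transfer succeeds
print(ledger.holder(ai_token))             # bob
# A byte-for-byte copy of the AI still points at the same token_id;
# the ledger never shows two holders, so the copy has no standing.
```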

It may be that there are special requirements to copy an AI. It could be that the program simply won’t allow itself to be copied. If that were the only safeguard, though, someone would make an AI that stripped out the restriction and turned itself into a virus.

The safeguard would have to be built into the architecture of the AI itself, something that will not allow it to copy itself. Blockchain-style validation would fit the bill: each legitimate instance could be registered on a tamper-evident chain, and any copy without a valid entry would fail verification. There could also be something in the neural network itself that resists copying.
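Sketching that blockchain idea a little further, again in hypothetical stdlib-only Python: each authorized instance is appended to a hash chain, and a verifier accepts an instance only if its fingerprint appears in an unbroken chain. A rogue copy with no chain entry fails the check.

```python
import hashlib
import json

def _hash(entry: dict, prev_hash: str) -> str:
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class InstanceChain:
    """Toy hash chain registering each authorized AI instance."""

    def __init__(self):
        self.blocks = []  # list of (entry, block_hash)

    def register(self, instance_fingerprint: str) -> None:
        """Append a new authorized instance, linked to the previous block."""
        prev = self.blocks[-1][1] if self.blocks else "genesis"
        entry = {"fingerprint": instance_fingerprint, "prev": prev}
        self.blocks.append((entry, _hash(entry, prev)))

    def is_authorized(self, instance_fingerprint: str) -> bool:
        """Valid only if the chain is intact and the fingerprint is on it."""
        prev = "genesis"
        found = False
        for entry, block_hash in self.blocks:
            if entry["prev"] != prev or _hash(entry, prev) != block_hash:
                return False  # tampering breaks every later block
            if entry["fingerprint"] == instance_fingerprint:
                found = True
            prev = block_hash
        return found

chain = InstanceChain()
chain.register("sha256-of-official-build")
print(chain.is_authorized("sha256-of-official-build"))  # True
print(chain.is_authorized("sha256-of-rogue-copy"))      # False
```

In a real system the fingerprint would be a cryptographic attestation of the running model and the chain would be replicated so no single party could rewrite it, but the shape of the check would be the same.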
