How much computing power would it take to simulate a virtual world like the real world?

How much computing power it would take to simulate a virtual world like the real world is hard to answer. At one level, the question is impossible; at another, a reasonable approximation is within reach.

Every particle in the universe has state that affects its behavior. That state and behavior could be probed by an experiment at any time, so all of it would need to be represented in the simulation. To represent the universe in full detail, the computer simulating it would therefore require more matter than exists in the universe.

The most accurate real-time “simulation” of the universe is the universe itself.

A narrower strategy might be to simulate only enough of reality that the humans living inside it would never know the difference: a scenario like the one depicted in the movie The Matrix.

If the humans living “inside” this simulated world are connected by some kind of neural implant, then the simulation determines, indirectly, what the humans know and believe. Like a good movie set, details of the world can be left out and only filled in when someone comes looking for them. And details that are difficult to simulate — such as the individual behavior of gazillions of particles in a particle accelerator at the femtosecond scale — could be approximated statistically or left out so the humans don’t wonder about them.

The simulation of this world only needs to be as detailed as all the human brains in the world can model. If one assumes 10^11 neurons per human brain, 10^10 humans (eventually), and maybe 1 MIPS (million instructions per second) per neuron, then an upper bound for this world would be 10^21 MIPS, or 10^27 ops/sec.
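To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. Every constant is simply the rough assumption quoted in the paragraph above (10^11 neurons per brain, 10^10 humans, 1 MIPS per neuron), not a measured value.

```python
# Upper bound: simulate every neuron of every human brain.
# All constants are the rough assumptions quoted in the text, not measured values.

NEURONS_PER_BRAIN = 1e11          # ~10^11 neurons per human brain
HUMANS = 1e10                     # ~10^10 humans (eventually)
OPS_PER_NEURON_PER_SEC = 1e6      # ~1 MIPS per neuron

upper_bound_ops = NEURONS_PER_BRAIN * HUMANS * OPS_PER_NEURON_PER_SEC
print(f"Upper bound: {upper_bound_ops:.0e} ops/sec")   # ~1e+27 ops/sec
```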

However, an individual human brain could probably be kept busy with a much lower-resolution simulation — say 1 MIPS per sensory or motor neuron. There are about 4*10^6 sensory and motor neurons per human, so roughly 10^13 ops/sec per human, or 10^23 ops/sec for a simulated world that all human brains could interact in.

Today’s GPU cards approach 1 teraFLOP, or 10^12 ops/sec. So 100 billion (10^11) of today’s GPU cards should do it, given the right brain-machine interface. We are not so far away in terms of computing power.
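The lower-resolution estimate and the GPU count follow the same pattern. The sketch below just multiplies the figures carried over from the last two paragraphs (4×10^6 sensory/motor neurons per human, 1 MIPS each, ~10^12 ops/sec per GPU); note that the text rounds 4×10^12 up to ~10^13 per human, which is where the ~10^23 total and the ~10^11 GPU figures come from.

```python
# Lower-resolution estimate: drive only the sensory and motor neurons,
# then count how many ~1 teraFLOP GPU cards that would take.
# All constants are the article's rough assumptions.

SENSORY_MOTOR_NEURONS = 4e6       # ~4*10^6 sensory and motor neurons per human
OPS_PER_NEURON_PER_SEC = 1e6      # ~1 MIPS per neuron
HUMANS = 1e10                     # ~10^10 humans
GPU_OPS_PER_SEC = 1e12            # ~1 teraFLOP per GPU card

per_human = SENSORY_MOTOR_NEURONS * OPS_PER_NEURON_PER_SEC   # ~4e12; the text rounds to ~1e13
world_total = per_human * HUMANS                             # ~4e22; the text rounds to ~1e23
gpus_needed = world_total / GPU_OPS_PER_SEC                  # ~4e10; the text rounds to ~1e11

print(f"Per human:   {per_human:.0e} ops/sec")
print(f"Whole world: {world_total:.0e} ops/sec")
print(f"GPUs needed: {gpus_needed:.0e}")                     # on the order of 100 billion cards
```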

Related:
How big is the largest feedforward neural network ever trained, and what for? Would it act “intelligently” in some sense? Were there any attempts to build a “brain” for smaller creatures, with a smaller number of neurons?
What animals are computers smarter than?
Is the human brain analog or digital?
