From: Daniel Burfoot (daniel.burfoot@gmail.com)
Date: Thu Jan 24 2008 - 19:50:53 MST
On Jan 24, 2008 1:18 PM, Matt Mahoney <matmahoney@yahoo.com> wrote:
>
> Quantum mechanics takes exponential time to solve, at least our
> probabilistic interpretation of it does.
>
I don't pretend to fully understand this, but see Scott Aaronson's
"NP-complete Problems and Physical Reality". Aaronson claims that the
fact that nature does not solve NP-hard problems should be considered a
new physical law.
> We can't assume that concepts like distance, time, and velocity
> have any meaning in the simulating universe.
That may be true. If there is no notion of time in the simulating
universe, it's hard to talk about its computational power (can one even
define computational power without appealing to the concept of time?).
>
> > 4) Conservation of energy can be interpreted as conservation of code
> > length, due to the similarity between the Boltzmann factor and the
> > Shannon optimal codelength rule.
>
> Conservation of mass/energy limits the size of the universe. The
> relationship you suggest between thermodynamic and information
> theoretic entropy is unrelated to energy. It is related to the
> direction of time. A computation irreversibly loses information, so
> its uncertainty about its environment increases.
This is what I mean. Consider a system of interacting particles with
various energy states. The probability that a particle is in a state of
energy E is:

p(E) = exp(-E/kT) / Z

Assuming the simulation encodes the state of the particles using an
optimal code, the code length requirement is:

-log p(E) = E/kT + log(Z)

The 1/kT factor is shared by all the particles in the system, and the
log(Z) term does not change with the particle's energy state. So if one
particle gains energy dE while another loses dE, their code lengths
change by +dE/kT and -dE/kT respectively, while the log(Z) terms stay
fixed. Thus an interchange of energy between any two particles results
in zero net code length difference.
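To make the cancellation concrete, here is a small Python sketch (the
four energy levels, the two-particle setup, and the units with k = 1 are
all hypothetical; only the Boltzmann and Shannon formulas come from the
argument above):

import math

kT = 1.0                                    # units where k = 1 (assumed)
levels = [0.0, 1.0, 2.0, 3.0]               # hypothetical energy levels
Z = sum(math.exp(-E / kT) for E in levels)  # partition function

def code_length(E):
    # Shannon-optimal code length -log p(E) under the Boltzmann
    # distribution; equals E/kT + log(Z), which is linear in E.
    p = math.exp(-E / kT) / Z
    return -math.log(p)

# Before the exchange: particle a at E = 3, particle b at E = 1.
before = code_length(3.0) + code_length(1.0)

# After a transfers one unit of energy to b: both sit at E = 2.
after = code_length(2.0) + code_length(2.0)

print(before, after)  # equal (up to floating-point rounding)

The two totals agree because the summed code length is
(E_a + E_b)/kT + 2*log(Z), which depends only on the conserved total
energy E_a + E_b.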
> The strong version would require high algorithmic complexity in the
> laws of physics, which we don't observe.
>
I'm suggesting that if, motivated by the strong version of the
simulation argument, we looked harder for "glitches" in the simulation,
we might find some. In other words, the simulation would actually have
high algorithmic complexity, because of computational limitations of the
simulating machine(s), but would merely appear to have low algorithmic
complexity.
Dan