Quote (DoctorOfSpace)
It doesn't matter how fast or how slow you run a simulation on the machine; inevitably the energy available to the computer will drop below the required amount and the simulation will end.
Your example is a statement about the limits of modern computer architecture, not about the limits set by physics itself. Freeman Dyson wrote a famous paper on exactly this topic, "Time Without End: Physics and Biology in an Open Universe" (1979), in which he argued that a civilization in an open universe could perform an unbounded number of computations with a finite energy reserve by running ever more slowly at ever lower temperatures.
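To make that intuition concrete, here is a toy model (not Dyson's actual derivation, which involves subjective time and hibernation cycles): if each successive computation step runs at half the temperature of the previous one, the minimum cost per erased bit, k_B·T·ln 2, forms a geometric series, so infinitely many steps cost only a finite total energy.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def total_landauer_cost(t0_kelvin, steps):
    """Sum the minimum cost of erasing one bit per step,
    halving the operating temperature at every step."""
    return sum(K_B * (t0_kelvin / 2**n) * math.log(2) for n in range(steps))

# The total converges to 2 * k_B * T0 * ln 2 as steps -> infinity
for steps in (10, 100, 1000):
    print(steps, total_landauer_cost(2.725, steps))
print("limit:", 2 * K_B * 2.725 * math.log(2))
```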
Quote (midtskogen)
It could detect whether a glitch has occurred or whether intelligence has discovered the truth, and simply end the simulation, rewind to the last good point and continue with slightly different parameters.
Precisely.
Quote (midtskogen)
But energy isn't temperature.
I'm talking about the temperature of the cosmic microwave background radiation, which sets the ultimate thermodynamic limit on the amount of energy available to do work for any computational system in our universe: the CMB is the coldest heat sink the universe provides, and the minimum energy cost of computation scales with the temperature of that sink.
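Concretely, Landauer's principle puts a floor on that cost: erasing one bit of information dissipates at least k_B·T·ln 2 of energy into a heat bath at temperature T. A quick sketch of that floor, taking the present-day CMB at about 2.725 K as the coldest available bath:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T_CMB = 2.725        # present-day CMB temperature, K

# Landauer bound: minimum energy dissipated per bit erased
e_min = K_B * T_CMB * math.log(2)
print(f"{e_min:.3e} J per bit")  # ~2.6e-23 J
```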
Edit: I feel I should elaborate a little further. When we talk about the CMB, we usually speak of it as having a temperature, but what we are actually measuring is the wavelength distribution, and therefore the energy, of its photons. We can assign it a temperature because the spectrum almost perfectly follows a blackbody distribution, in which the peak wavelength is a simple function of temperature (Wien's displacement law).
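As a sanity check of that relationship, Wien's displacement law, λ_peak = b / T with b ≈ 2.898e-3 m·K, recovers the familiar millimetre-wavelength peak of the CMB:

```python
WIEN_B = 2.897771955e-3  # Wien's displacement constant, m*K
T_CMB = 2.725            # CMB temperature, K

# Peak wavelength of a blackbody at the CMB's temperature
lam_peak = WIEN_B / T_CMB
print(f"{lam_peak * 1e3:.3f} mm")  # ~1.063 mm, in the microwave band
```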