Information in the universe:

Can you fit all of the information in the universe into a region smaller than the universe? Let's find out. There's quite a bit of stuff in the universe, to put it mildly: hundreds of billions of galaxies, each with hundreds of billions of stars, each with rather a lot of particles in them. And then there's all the stuff that isn't stars: the dark matter, black holes, planets, and the particles and radiation in between the stars and galaxies. Not to mention space itself, with its fluctuating quantum fields, dark energy, blah blah, stuff everywhere. But is the universe actually made of stuff? An increasing number of physicists view the universe--view reality--as informational at its most fundamental level, and its evolution through time can be thought of as a computation. And then there's the simulation hypothesis, in which that computation is engineered by who knows what or whom. How big a memory bank would you even need to compute a universe? Seriously, let's figure it out. How much information does it take to describe the entire observable universe?

The holographic principle:

After we're done with that, I'm going to have an even cooler challenge question for you. I casually mentioned in the last episode that our 3D universe may just be a projection of information imprinted on its two-dimensional boundary. No biggie, that's the holographic principle, and we've talked a lot about some ideas leading up to it. Don't worry, the full holographic principle episode is still coming. But our recent episode on black hole entropy, and some of the lead-ups to that, might be helpful here. You can also watch this video as a standalone and go back to those earlier ones later if you feel like it. But the main point, the really weird, surprising point, is that the maximum amount of information that can fit in a volume of space is not proportional to that volume. It's proportional to the surface area of that region of space. Jacob Bekenstein figured this out by realizing that the entropy of a black hole is proportional to the surface area of its event horizon.
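To put a number on that proportionality: the Bekenstein-Hawking entropy works out to one quarter of the horizon area measured in Planck areas. Here's that conversion as a tiny Python helper, a sketch rather than anything spelled out in the episode, with the ln 2 factor just converting from natural units (nats) to bits:

```python
import math

PLANCK_LENGTH_M = 1.616e-35  # Planck length in metres

def horizon_bits(area_m2: float) -> float:
    """Bekenstein-Hawking entropy of a horizon with the given area, in bits.

    S = A / (4 * l_p^2) in units of k_B (nats); dividing by ln 2 gives bits.
    """
    planck_areas = area_m2 / PLANCK_LENGTH_M**2
    return planck_areas / (4 * math.log(2))
```

For a one-solar-mass black hole, with a horizon radius of about 3 kilometres, that works out to roughly 10 to the 77 bits.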

But entropy is just a measure of hidden information, so the Bekenstein bound is equally a limit on how much information you can fit in any region of space. We'll come back to the Bekenstein bound in a sec, but for now, let's think about why this dependence on surface area is surprising.

The maximum information:

Well, common sense would suggest that the maximum information content depends on volume, not surface area. I mean, a pile of thumb drives has a total storage capacity that depends on its volume. But instead of storage capacity, let's think about the information needed to perfectly describe a patch of space. You'd think that to fully describe, say, the universe, you'd need to know what's going on in every tiniest possible 3D chunk. That smallest element is roughly a cube one Planck length on a side, where the Planck length is the smallest meaningful measure of distance, at around 1.6 times 10 to the power of negative 35 meters. So let's say that smallest possible chunk of space can contain the smallest amount of information: one bit per Planck volume. That's kind of like saying we can describe the universe completely if we go through all of its quantum voxels and answer the yes/no question of whether each one is full or empty. This probably way underestimates how much info you really need to describe the universe, but let's start with it anyway. So how many Planck volumes are there in the universe? Well, the radius of the observable universe is something like 47 billion light-years, which is a few times 10 to the power of 61 Planck lengths. Take four thirds pi R cubed, and the universe contains around 10 to the power of 183 Planck volumes. You'll see estimates that the radius of the universe is a mere 10 to the 60 Planck lengths rather than 10 to the 61, and that its volume is 10 to the power of 180 units. That's because cosmologists tend to round down at every step, and so you drop a bunch of orders of magnitude. Astrophysicists are, by comparison, highly accurate: we only drop factors of 2.

But what's a few orders of magnitude between friends?
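If you want to redo that volume count yourself, here's a minimal back-of-the-envelope sketch, assuming the 47-billion-light-year radius and the standard value of the Planck length quoted above:

```python
import math

LIGHT_YEAR_M = 9.461e15          # metres in one light year
PLANCK_LENGTH_M = 1.616e-35      # Planck length in metres

R_m = 47e9 * LIGHT_YEAR_M        # radius of the observable universe, ~4.4e26 m
R_planck = R_m / PLANCK_LENGTH_M # that radius in Planck lengths, a few times 1e61

planck_volumes = (4 / 3) * math.pi * R_planck**3
print(f"radius: {R_planck:.1e} Planck lengths")
print(f"volume: {planck_volumes:.1e} Planck volumes")
# Keeping every factor gives roughly 1e185; rounding down at each step,
# cosmologist style, gets you the 1e180 to 1e183 figures quoted above.
```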


You might argue that more than one bit can fit at each grid point in the universe. You might be right. Particles carry more information than just their position, so it would be better to count the number of grid points in quantum phase space, which includes position but also other degrees of freedom like momentum, spin direction, et cetera. In other words, we should count all possible quantum states in the universe. Anyway, the real number is going to be way higher than our estimate of 10 to the 180. But as we'll see, we're already way, way higher than the actual information limit of the universe. Okay, so 10 to the power of 180 or so bits is the minimum if you want to describe every 3D quantum voxel completely independently. But the Bekenstein bound tells us that the information content of any volume isn't set by the number of these Planck volumes, but rather by the number of Planck areas on its surface. The observable universe has a surface area of 10 to the power of 120 to 10 to the power of 124 Planck areas, depending on whether you're rounding like a cosmologist or an astrophysicist.
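And here's the corresponding surface-area count under the same assumptions, with the Bekenstein-Hawking conversion from Planck areas to bits tacked on:

```python
import math

LIGHT_YEAR_M = 9.461e15
PLANCK_LENGTH_M = 1.616e-35

R_m = 47e9 * LIGHT_YEAR_M                  # radius of the observable universe
area_m2 = 4 * math.pi * R_m**2             # surface area of that sphere
planck_areas = area_m2 / PLANCK_LENGTH_M**2

print(f"{planck_areas:.0e} Planck areas")  # ~1e124, the astrophysicist end of 1e120-1e124
print(f"{planck_areas / (4 * math.log(2)):.0e} bits")  # Bekenstein-Hawking conversion to bits
```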
So the storage capacity of the universe is a factor of around 10 to the 60 lower than the number of volume elements it contains. So how do you encode a whole universe in a space far smaller than the universe itself? There must be some amazing compression algorithm. Maybe middle-out? In fact, in a sense, the holographic principle is a compression algorithm. See, you don't really need one bit per volume element of the universe. That would be like having separate, completely independent memory elements for every empty pixel in an image file, and most of space is indeed empty. Let's say instead that you only need a single bit of information for every element in phase space that's occupied. In other words, one bit per elementary particle. The observable universe contains something like 10 to the power of 80 protons. Each proton has 3 quarks, and there are a similar number of electrons. Most other particles are much rarer, so we're still in the realm of 10 to the 80 to 10 to the 81 bits.
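Tallying that up, as a sketch using the round numbers above (about 10 to the 80 protons, three quarks each, and a comparable number of electrons):

```python
protons = 1e80              # rough count of protons in the observable universe
quarks = 3 * protons        # three quarks per proton
electrons = 1e80            # roughly one electron per proton

bits_in_ordinary_matter = quarks + electrons  # one bit per elementary particle
print(f"{bits_in_ordinary_matter:.0e} bits")  # ~4e80, i.e. between 1e80 and 1e81
```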

Neutrinos and photons:


Neutrinos and photons formed in the Big Bang are probably a billion times more abundant than protons. That's verified experimentally in the case of photons: the Cosmic Microwave Background has around 10 to the power of 89 photons across the observable universe. So almost all of the information--and for that matter, the entropy--in particles is in neutrinos and in the Cosmic Microwave Background photons. The situation with dark matter is unclear, so let's just round up to 10 to the power of 90 bits of information in particles in our universe. That sounds like a lot, but happily it's way less than the 10-to-the-power-of-120-ish limit of the Bekenstein bound. But there's one more source of information--black holes. As I mentioned last time, black holes contain most of the entropy in the universe.
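As a sanity check on that 10 to the 89 figure, here's a sketch that pulls in one outside number: the standard Cosmic Microwave Background photon density of roughly 411 photons per cubic centimetre (an assumed value, not something quoted in the episode):

```python
import math

LIGHT_YEAR_M = 9.461e15
R_m = 47e9 * LIGHT_YEAR_M                 # radius of the observable universe, ~4.4e26 m
volume_m3 = (4 / 3) * math.pi * R_m**3    # ~3.7e80 cubic metres

CMB_PHOTONS_PER_M3 = 411e6                # ~411 photons per cubic centimetre
cmb_photons = CMB_PHOTONS_PER_M3 * volume_m3
print(f"{cmb_photons:.0e} CMB photons")   # ~1.5e89, matching the ~1e89 quoted above
```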

The relationship between black hole entropy and information:


The relationship between black hole entropy and information deserves a bit of thought. Black hole entropy, expressed as a number of bits, tells you the information you'd need to describe all possible initial states that could have formed the same black hole, of which there are many: two to the power of the number of bits of entropy. And for black holes, that entropy is the Bekenstein bound for the number of Planck areas on the event horizon. Because the information about the black hole's previous state is lost, to fully describe it you need to fully describe its event horizon; you need its full Bekenstein bound in information. How much information is that? Let's take a supermassive black hole as an example: Sagittarius A*, in the center of the Milky Way, which has a mass of four million Suns. Its event horizon has a radius of around 12 billion meters, giving it a surface area of 10 to the power of 90 to 10 to the power of 91 Planck units. So the Milky Way's black hole has as much entropy and hidden information as all of the matter and radiation in the entire rest of the universe. And there are some hundreds of billions of galaxies in the universe, each with its own supermassive black hole. We're talking something like 10 to the power of 101 to 10 to the power of 102 bits of entropy, or information.
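If you want to check that Sagittarius A* estimate, here's a quick sketch assuming the quoted four-million-solar-mass figure and standard values for the constants:

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8              # speed of light, m/s
M_SUN = 1.989e30         # solar mass, kg
PLANCK_LENGTH_M = 1.616e-35

mass = 4e6 * M_SUN                       # Sagittarius A*, ~4 million solar masses
r_s = 2 * G * mass / C**2                # Schwarzschild radius, ~1.2e10 m (~12 billion m)
area_m2 = 4 * math.pi * r_s**2           # horizon area in square metres

planck_areas = area_m2 / PLANCK_LENGTH_M**2
bits = planck_areas / (4 * math.log(2))  # Bekenstein-Hawking entropy in bits
print(f"radius: {r_s:.1e} m, {planck_areas:.0e} Planck areas, ~{bits:.0e} bits")
# ~7e90 Planck areas, a couple of 1e90 bits: comparable to all particles combined.
```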

Black holes contain by far most of the entropy in the universe:

Black holes contain by far the most entropy in the universe and require the most information to fully describe. But again, we're still below the Bekenstein bound for the whole universe. I know you're as relieved as I am: we are nowhere near the universe's memory limit. The universe can keep its particles, and you can leave your horribly bloated email inbox alone. But what would actually happen if the universe contained too much information, say in the form of too many particles? What if we started to fill up those empty Planck-sized cubes of space throughout the universe until it contained more information than the Bekenstein bound allowed? Well, the answer is straightforward enough. At the moment the universe reached its informational limit, it would immediately become a black hole with an event horizon as big as the current cosmic horizon. It would be the end of space-time. The Bekenstein bound applies just as much to engineered information storage as it does to black holes and universes. It gives the limit of storage capacity within a given volume before the interior collapses into a black hole. That suggests a nice challenge question, so answer me this:

How large a black hole computer:



Imagine a computer with memory storage that operates at the Bekenstein bound; essentially, it's computing on the surface of a black hole event horizon. How large a black hole computer would you need, in mass and radius, to contain enough data to simulate the entire observable universe? Let's ignore really high entropy stuff like black holes, the cosmic background radiation, and neutrinos. Also ignore dark matter. Just regular matter like protons, electrons, et cetera. Assume one bit per elementary particle. And I also have an extra credit question. Go and read the paper "Computational Capacity of the Universe" by Seth Lloyd, link in the description, and answer this: how long would that black hole computer take to simulate the entire universe? You need to make a bunch of assumptions, and there isn't a single perfect answer, but I'm really curious to see what you come up with. Okay, assuming no one switches off the simulation, I'll see you next week for a new episode of Space Time. So, we're a bit behind on comment responses due to my travels. We're going to catch up on responses to the life on Mars and the end of the universe episodes today and next week.
We'll get to responses to black hole entropy, as well as today's episode, after that. Starting with our episode on the history of life on Mars: Stosh do says it's cool that the show is current enough to acknowledge the recent discovery of liquid water on Mars. Yeah, for sure; that's pretty current by the standards of a 13.8 billion year old universe. We do try to keep things on human timescales, at least most of the time. In fact, this episode was even more up to date than we'd planned. We'd already filmed it, including the bit about the underground lake, when the giant dust storm hit Mars and the Opportunity rover went silent. We re-recorded and tried to pretend that we'd intended it all along as a homage to Opportunity. And speaking of Opportunity, this is probably a good opportunity to update you further about Opportunity. No word yet, I'm afraid; Opportunity is still silent. The dust storm is nearly over, and NASA will keep trying to make contact, at least for several weeks.
Patrick Dunne offered a very reasonable clarification of a point that I rushed over regarding the Viking landers' labeled release experiment, which had an initial positive result for possible biotic activity that then couldn't be replicated. Patrick points out that what I should have said was that they couldn't induce a second peak of gas release from the same samples. The key is that they were the same samples, opening the possibility that anything living in that soil could have been harmed by the disturbance caused by the experiment.

Disturbance caused by the experiment:


That disturbance could be what produced the second null result. I'm not saying there was life; I'll have to look deeper into that. But this highlights the danger in simple interpretation of second-hand reports, i.e. you've got to read the original paper. And now on to our episode on the end of the universe. Spencer Twitty is amused at my use of the word prompt for the timescale of evaporation by Hawking radiation. Yeah, that's the weird thing about timescales of the end of the universe: they're so ridiculously long that even other ridiculously long timescales look short in comparison. Orders of magnitude are wacky, but scientific intuition is served very well if you can get your head around comparative orders of magnitude. Particularly in astrophysics, it's really useful to be able to ignore insignificant amounts of space or time or energy in the presence of much larger quantities. Regarding the fact that the hypothetical proton decay time is 10 to the power of 40 years, Vinay Kay asks: if a proton is created, say, 10 to the power of 36 years from now, would it still decay at 10 to the power of 40 years, or would it decay 10 to the power of 40 years after its birth? Well, neither, actually. That 10 to the 40 years is a crude estimate of the proton's half-life, assuming it decays at all, which it might not. Half-lives are more commonly used for radioactive elements, but they can be used for anything that has a constant probability of decaying in a given amount of time. The half-life is the amount of time it takes for something to have a 50% chance of decaying at some point during that time. The actual amount of time it takes to decay is random, and there's a constant probability of it happening in any given interval. So if these decays happen at all, then after 10 to the power of 40 years every proton in the universe will have had a 50% chance of decaying. That means around 50% of them will have decayed, and many of those will have decayed much sooner than 10 to the 40 years.


After another 10 to the 40 years, 50 percent:


After another 10 to the 40 years, 50 percent of the remaining protons will have decayed, and so on. That halving of the number of protons will have happened 10 times by 10 to the power of 41 years, so the fraction of protons left will be 0.5 to the power of 10, or around one thousandth. To reliably eliminate all of the 10 to the power of 80 protons in the observable universe, you need around 265 half-lives, or 10 to the power of 42 to 43 years. Marko Dalla Gasperini points out that the Nothing can be defeated with the help of the luckdragon. Is that a metaphor for quantum fluctuations in the impossibly distant future spontaneously generating a new Big Bang by pure chance? Or do you mean that's what we'll be doing while we're waiting through the infinite future for a new Big Bang?
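If you want to check that arithmetic, here's a minimal sketch assuming the hypothetical 10 to the 40 year half-life and the 10 to the 80 protons quoted above:

```python
import math

HALF_LIFE_YEARS = 1e40      # hypothetical proton half-life from the episode
PROTONS = 1e80              # rough count of protons in the observable universe

def fraction_remaining(years: float) -> float:
    """Fraction of protons that have not yet decayed after the given time."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

print(fraction_remaining(1e41))               # 10 half-lives: ~1/1024, about a thousandth
half_lives_to_clear = math.log2(PROTONS)      # ~265.8 half-lives to get below one proton
print(half_lives_to_clear * HALF_LIFE_YEARS)  # ~2.7e42 years, i.e. 1e42 to 1e43
```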