Physics of information

Information is physical: it must be stored, processed, and transmitted according to the laws of physics.

We can flip the script too.

Physics is informational: entropy quantifies information, the present computes the future from the past, and black holes scramble information.

As we prod for rigor, we find a deep and beautiful connection between fundamental physics and the fundamental limits of information processing.

What are the fundamental energetic limits of information processing? For the last several decades, there has been a perceived exchange rate between energy and information: this is the famous "Landauer's bound" of $k_B T \ln 2$, the minimal energy required to reset the state of a system and thus erase one bit of its previous memory.
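As a quick sanity check on the scale of this exchange rate, the sketch below evaluates $k_B T \ln 2$ numerically (the function name and interface here are just illustrative):

```python
import math

# Boltzmann constant, J/K (exact value under the 2019 SI redefinition)
K_B = 1.380649e-23

def landauer_limit(temperature_kelvin: float, bits: float = 1.0) -> float:
    """Minimum heat (in joules) dissipated to erase `bits` bits of
    memory at the given temperature, per Landauer's bound."""
    return bits * K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K), erasing a single bit costs only
# about 3e-21 J -- orders of magnitude below what today's
# transistors dissipate per switching event.
print(f"{landauer_limit(300):.3e} J")
```

The tiny size of this number is exactly why the gap between today's devices and the fundamental limit is interesting.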

Our recent work revises this exchange rate by leveraging advances in quantum nonequilibrium thermodynamics. We must faithfully predict the state of a system that we wish to transform; there is a severe energetic cost for misaligned expectations. See our paper on the Impossibility of achieving Landauer's bound for almost every quantum state.
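One way to make "misaligned expectations" quantitative (an illustrative formulation in the spirit of mismatch-cost results, not the precise statement of the paper): if an erasure protocol is designed for an expected state $\sigma$ but the system actually arrives in state $\rho$, the extra dissipation is governed by the quantum relative entropy,

```latex
W_{\text{extra}} \;\ge\; k_B T \, D(\rho \,\|\, \sigma),
\qquad
D(\rho \,\|\, \sigma) \;=\; \operatorname{Tr}\!\big[\rho\,(\ln\rho - \ln\sigma)\big].
```

Since $D(\rho\,\|\,\sigma)$ diverges whenever the support of $\rho$ is not contained in that of $\sigma$, almost every state the protocol was not designed for incurs a cost far above the naive Landauer value.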

This is part of our larger effort to establish the fundamental limits of computation via recent advances in nonequilibrium thermodynamics. For example, we found that time-reversal symmetries play a profound role in determining energetic efficiency of information processors. These insights suggest new paradigms for computation that may someday unplug your future devices for good.


Quantum information