If you're talking about Landauer's principle, you're talking about a universe where entropy increases, which means some information is lost.
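To put a rough number on that bound (standard constants, nothing beyond the principle itself), here's what Landauer's limit works out to at room temperature:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
    T = 300.0            # roughly room temperature, kelvin

    # Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2) of heat.
    bound = k_B * T * math.log(2)
    print(f"Landauer limit at {T:.0f} K: {bound:.3e} J per bit erased")  # ~2.9e-21 J

Tiny, but strictly nonzero, which is the whole point.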
By the way, here's the paper I was talking about. I feel like you might enjoy reading it: https://sites.cc.gatech.edu/computing/nano/documents/Bennett...
Thanks for the paper. Brownian computers are a cool idea; they seem like an exploitation of the ratcheting paradigm.
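To make "ratcheting" concrete, here's a toy sketch (my own illustration, not code from Bennett's paper): the computation does a random walk along its own step sequence, and a small forward bias plays the role of the ratchet.

    import random

    def brownian_run(steps_needed=100, forward_prob=0.55, seed=0):
        """Count random-walk ticks until the 'computation' reaches its final step."""
        rng = random.Random(seed)
        position = ticks = 0
        while position < steps_needed:
            position += 1 if rng.random() < forward_prob else -1
            position = max(position, 0)  # can't back up past the initial state
            ticks += 1
        return ticks

    # A weaker bias finishes more slowly but, in Bennett's analysis, dissipates less per step.
    for p in (0.51, 0.55, 0.75):
        print(p, brownian_run(forward_prob=p))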
Entropy increase is an illusion that comes from the specific selection of macroscopic observables / slices of configuration space that we feed into the entropy calculation. No information is lost so much as certain species, whose senses are tied to particular observable slices, become ignorant of and unable to exploit the new patterns of information flow. Chaos and order are two sides of the same coin. Funnily enough, there will be Brownian computers in chaotic environments, whereas low-energy environments might host very compact, sharply defined, high-efficiency solitary entities.
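Here's a toy version of the coarse-graining point (my own sketch, not from the comment or the literature): the same set of microstates gets a different entropy depending on which observable you choose to track.

    import math
    from collections import Counter

    microstates = ["00", "01", "10", "11"]  # four equally likely microstates

    def macro_entropy(coarse_grain):
        """Shannon entropy (bits) of the macrostate distribution induced by a coarse-graining."""
        counts = Counter(coarse_grain(m) for m in microstates)
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    print(macro_entropy(lambda m: m))       # track everything: 2.0 bits
    print(macro_entropy(lambda m: m[0]))    # track only the first bit: 1.0 bit
    print(macro_entropy(lambda m: "blob"))  # track nothing: 0 bits (prints -0.0)

Nothing about the microstates changes between those three lines; only the observer's slice does.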
Think virtual memory vs. actual memory, forks, copy-on-write mechanics, etc. Are we juggling/managing memory or actually creating any? As far as we know, the universe itself is a reversible quantum supercomputer: there are no erasures, and a reversible computer is 100% efficient.
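A toy copy-on-write buffer in that spirit (my own sketch, not how any real kernel or VM implements it): "forked" views alias the same bytes and only pay for a private copy at the moment one of them writes.

    class CowBuffer:
        """Minimal copy-on-write buffer: forks share storage until someone writes."""

        def __init__(self, data):
            self._data = bytearray(data)
            self._shared = False

        def fork(self):
            # Cheap fork: the child aliases the parent's bytes instead of copying them.
            child = CowBuffer.__new__(CowBuffer)
            child._data = self._data
            self._shared = child._shared = True
            return child

        def write(self, index, value):
            # The first write after a fork is what actually allocates new memory.
            if self._shared:
                self._data = bytearray(self._data)
                self._shared = False
            self._data[index] = value

        def read(self, index):
            return self._data[index]

    parent = CowBuffer(b"hello")
    child = parent.fork()                  # no bytes copied here
    child.write(0, ord("j"))               # the copy happens here, and only for the child
    print(parent.read(0), child.read(0))   # 104 106, i.e. 'h' vs 'j'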
If the formula is correct at all, it should apply to the reverse process of setting bits, not just deletion.
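One way to see why setting should count the same as deleting (my framing, not the original comment's): Landauer's argument attaches the cost to any logically irreversible operation, i.e. any map that merges distinct input states.

    def irreversible(truth_table):
        """True if the operation merges input states, i.e. is not injective."""
        outputs = list(truth_table.values())
        return len(set(outputs)) < len(outputs)

    erase      = {0: 0, 1: 0}  # reset-to-zero ("deletion")
    set_to_one = {0: 1, 1: 1}  # unconditionally setting a bit
    negate     = {0: 1, 1: 0}  # a bijection, hence reversible

    for name, op in [("erase", erase), ("set_to_one", set_to_one), ("negate", negate)]:
        print(name, "irreversible:", irreversible(op))

Erase and set-to-one are both two-to-one maps, so both fall under the k_B·T·ln 2 bound; negation is a bijection and doesn't.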