It's been some time since I've wielded the pen here, so it's high time I laid down some lines. The following is the third in a series of articles reflecting on the interconnectedness of entropy, energy, and life, and the consequences that follow from their interplay. If you have not yet read Part I: Lay of the Land or Part II: Dispersion, I invite you to read them before continuing. Feel free to assist in navigating our journey by opining openly on anything you've read here: what it reminds you of, what you feel is incorrect, what strikes you as insightful. We shall approach a deeper truth together. You can spark up a conversation with me @forgot_thought on Twitter.
"Thus time draws forward each and everything. Little by little into the midst of men and reason uplifts it to the shores of light. For one thing after other did men see grow clear by intellect, till with their arts they've now achieved the supreme pinnacle." -Lucretius
Entropy insidiously implicates itself in everything contained within the universe; it is inescapable. What is true of life is true of life's creations, and those of humans are no less impacted. Even the inanimate will dissolve at the behest of entropy, just as the animate will.
When we consider computer hardware, we rarely reflect on its eventual obsolescence as a function of its resistance to entropy, but this is in fact the case. Moore's law captures the pace of this obsolescence: hardware is superseded because humans keep achieving more effective ordered structures through the confluence of increased ingenuity with higher proficiency in materials science and productive processes. We're harnessing energy and material resources to construct computing systems that perform their duties by utilizing energy more efficiently, thereby increasing the rate and complexity of their function. This is humans generating more effective negentropic capabilities through ingenuity. An excellent example of this is the recently developed NeuRRAM chip.
Hardware is also subject to material decline at the behest of entropy. Mechanical failure, corrupted or lost data, and the incapacity to translate energy into productive processing are all examples of this. As it ages, a computer system has an increasing probability of failure or obsolescence.
So, is entropy, therefore, an unavoidable and perpetual nuisance to computing endeavours, or can it actually be harnessed through the ingenious manipulations of human minds? Enter cryptography.
Cryptography is the art of studying and enacting strategies for securing the transfer of information from one party to one or more others in an adversarial environment. It is the conversion of messages from a comprehensible form into an incomprehensible one, and back again at the other end, rendering them unreadable by interceptors or eavesdroppers who lack the secret knowledge. Cryptography is as old as language itself, and a case could be made that the structure and function of RNA/DNA amount to an encryption of the instructions of life (we won't dive into that idea any further here, but perhaps in a later piece). How do we transmit information by a secure means such that we know, with relative certainty, that only the intended recipient(s) have extracted it? In essence, we leverage entropy.
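To make the idea of "comprehensible to incomprehensible and back again" concrete, here is a minimal sketch of a one-time-pad-style XOR cipher. This is an illustration of the principle only, not a scheme the article describes, and the message text is my own invention:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte;
    # applying the exact same operation again recovers the original.
    return bytes(m ^ k for m, k in zip(data, key))

message = b"meet me at dawn"
key = os.urandom(len(message))  # the secret knowledge shared with the recipient

ciphertext = xor_cipher(message, key)  # incomprehensible without the key
recovered = xor_cipher(ciphertext, key)
assert recovered == message
```

Without the key, the ciphertext could correspond to any message of the same length; with it, decryption is a single pass.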
From one perspective, a difficult computation is an insurmountable obstacle; from another, it is a tool worth exploiting against your adversaries. This article from Quanta Magazine is an excellent primer on the subject: Researchers Identify ‘Master Problem’ Underlying All Cryptography. Within information theory as developed by Claude Shannon, entropy quantifies the amount of uncertainty in the value of a random variable or the outcome of a random process. Take, for example, a one-way function whereby we multiply two large prime numbers:
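Shannon's definition can be stated in a few lines of code. As a sketch (the function name is mine), the entropy of a distribution in bits is the negated sum of p·log2(p) over its outcomes:

```python
from math import log2

def shannon_entropy(probabilities):
    """H = -sum(p * log2(p)) in bits: the average uncertainty
    in the outcome of a random process."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy.
fair = shannon_entropy([0.5, 0.5])      # 1.0
# A heavily biased coin is more predictable, hence lower entropy.
biased = shannon_entropy([0.9, 0.1])    # ~0.47
```

More uncertainty for an eavesdropper means more entropy, which is exactly what a good key source maximizes.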
"To get a feel for how one-way functions work, imagine someone asked you to multiply two large prime numbers, say 6,547 and 7,079. Arriving at the answer of 46,346,213 might take some work, but it is eminently doable. However, if someone instead handed you the number 46,346,213 and asked for its prime factors, you might be at a loss. In fact, for numbers whose prime factors are all large, there is no efficient way (that we know of) to find those factors. This makes multiplication a promising candidate for a one-way function: As long as you start with large enough prime numbers, the process seems easy to do, but hard to undo." 
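The asymmetry in the quoted example can be demonstrated directly. Multiplying is one operation; recovering the factors by the naive method (trial division, used here purely as an illustration) takes thousands of attempts:

```python
def trial_division(n: int):
    # Naive factoring: try every candidate up to sqrt(n).
    # The work grows with the size of the smallest prime factor.
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n itself is prime

# Forward direction: a single multiplication.
p, q = 6547, 7079
n = p * q           # 46346213

# Reverse direction: thousands of divisions to recover p and q.
factors = trial_division(n)  # (6547, 7079)
```

For the modest numbers in the quote this still finishes instantly, but for primes hundreds of digits long, no known classical method makes the reverse direction tractable.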
When we examine this process closely, we come to realize that what we are doing when we secure information cryptographically, or generate cryptographic key pairs, is generating entropy. We are exploiting a fundamental aspect of physics: the universe tends towards increasing entropy. It is therefore easy to generate entropy but energy-intensive to undo the entropic process without the secret, developed alongside it, that provides the key to resolving it. We do little work to produce a cryptographically secure means of transmitting information, but must do tremendous work to reverse the process without the correct key. It is marvellous.
This does not mean, however, that all cryptographic processes are sound. Numerous classes of them are merely computationally secure: the methodology for breaking them is known, and they are protected only by the prohibitive computational expense of doing so. We must also not rush to embrace the latest renditions of cryptographic security, as they are not proven effective until they have been tested by the brutality of an adversarial environment, and should only be deemed safe insofar as they have not yet been broken. It is yet to be seen how quantum computing will change the face of cryptography. A great read on both of these fronts is: ‘Post-Quantum’ Cryptography Scheme Is Cracked on a Laptop.
In Entropy and Its Consequences, Part IV, we will begin diving into the realm of energy.
Thanks for reading, and feel free to reach out to me on Twitter at @forgot_thought.
You can support my work by tipping my Paynym:
Or with XMR at: