Monday, September 20, 2010

The Obsolescence of Memory: Mnemonics in the Age of Google

We live in an age in which ideas themselves can be quantified.
From the scribbles in Lascaux to the zealous scribes of Alexandria, and throughout the entirety of our halting march of progress, mankind has sought to immortalize itself through memorabilia. Some attempts are ultimately more successful than others; while the humble graffiti of early man lurked safely in a French grotto, the ambitious library of Egypt was a towering success – and then a towering inferno. Finally, however, we have ensconced the sum of our knowledge in an ultimate repository: one measured in bytes.
What, then, is the quantifiable sum of our magnum opus? We can dimly comprehend vast quantities of dollars and time and space, but mankind is unable to measure its ideas with a unit relevant to the referent. We know of the byte and the gigabyte, but we have no ratio of bytes to ideas. Nonetheless, our collective knowledge has been uploaded and quantified, and it weighs in at approximately 487 billion gigabytes.
That’s 487,000,000,000,000,000,000 bytes of ideas: the contributions of countless human brains, and a pool of ideas too deep to plumb. The result of this ineffable and overwhelming archive is our increasingly bipartite intellect, in which memory is externalized technologically, while synaptic connections are still made within the organic brain. If we were to evaluate humanity the way we compare computers, we would say that computers (and the internet) handle long-term storage – the rough analogue of read-only memory (ROM) – far more efficiently than a human brain does. Our squishy cerebrum, however, is still largely superior when it comes to random-access memory (RAM) and processing capability.
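For the skeptical reader, that long string of zeros checks out. A minimal back-of-the-envelope sketch, assuming decimal (SI) units where one gigabyte is 10^9 bytes:

```python
# Sanity check: 487 billion gigabytes expressed in bytes.
# Assumes SI units: 1 gigabyte = 10**9 bytes (not the binary 2**30).
BYTES_PER_GIGABYTE = 10**9
BILLION = 10**9

total_gigabytes = 487 * BILLION
total_bytes = total_gigabytes * BYTES_PER_GIGABYTE

print(f"{total_bytes:,}")  # 487,000,000,000,000,000,000
```

That is 4.87 x 10^20 bytes – roughly half a zettabyte, in prefixes that had barely entered common use when this figure was published.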
Our brain may lose bragging rights soon enough, though. Technology (in case you’ve missed this) is advancing very quickly. Very, very quickly. To say that the progress of human technology will outstrip that of its creators is like saying that the progress of a Ferrari will outstrip a tortoise with a head start. Evolution is a God slow to change His creations, while mankind is a tinkering deity, constantly improving its progeny until they eventually surpass mankind’s ability to improve them any further. If this seems dubious, remind yourself how handmade electronics compare in quality to those made by machine.
To put it bluntly, if we were to chart humanity’s progress in the cognitive facets established above – namely, capacity to store and access data, and ability to draw meaningful conclusions therefrom – throughout the past century, we would have two remarkably horizontal lines. Boys and girls, this won’t be changing in the near future. On the other hand, the same lines for technology could easily be mistaken for a Space Shuttle launch trajectory. Barring a sudden and widespread mutation, or a luddite revolution, mankind may soon no longer be the dominant species of Earth.
Ideally, a sort of symbiotic relationship will result; the subservient nature of mechanisms will remain, and mankind will not become so enamored of its tools that indolence and dependence will lead to the sort of techno-apocalypse that James Cameron seems to fear so sincerely. Alternatively, this may indeed occur, and evolution’s steel and plastic darlings will inherit the Earth.
We cannot beat our machines, so (as the old maxim goes) we will most likely join them. As strange as it may seem, the cybernetic union of digital memory and biological computing may very well be the next step in human evolution, as natural as the selection of the apes who used tools over those who didn’t. What the implications of this brave new symbiotic world will be, I do not know. Until that world arrives, however, I shall internalize as much data as I possibly can; while the brain is compatible with the meme format, it does not yet accept bytes.