A recent article by Jessica Winter in the Boston Globe reminds us that the importance of forgetting has not been forgotten.
A phenomenon supposedly innate to the human mind, forgetting is becoming less common as our capacity for digital remembering increases, threatening to make our “memory” perfect.
Wait, isn’t forgetting bad? Running up against a lost memory is one of life’s most aggravating situations, often taken as a sign of feeblemindedness. In the elderly, forgetting is seen as a sign of senility; in youth, it is seen as a sign of immaturity. In any case, forgetting something that should be easy to remember is embarrassing.
But there is another level of memory above this day-to-day forgetfulness: the kind of cultural memory that has been an object of discussion since Plato’s time. What is important for a society to remember? In his day, it might have been those laws, rules, and habits that kept people from dying senselessly, breeding irresponsibly, and making each other’s lives unbearable. In other words, the same sorts of things important to remember in any time period.
These vital memories were kept alive by the bards and storytellers, the chronicles of their time incarnate. To keep their stories interesting, and therefore to keep themselves fed, they included only those things that people wanted or needed to hear. In that way, old, unused, unimportant information fell out of the cultural memory, forgotten.
Then came writing. Suddenly, there was no need to forget anything. It could all be written down for later reference. Plato saw this as a deathblow to human memory, thinking that we would all lose the ability to remember because we had handed that job over to technology rather than practicing the discipline it takes to maintain a trained and accurate natural memory. Anyone who relies on a cell phone to remember a phone number, or who prefers that a fact or an appointment time arrive by e-mail, is guilty, to some extent, of letting a machine remember for them.
But with writing, on paper at least, we had boundaries. A text by itself was a private thing. A piece of paper could be hidden, destroyed, locked away, burned. If the owner chose, no one else ever had to see it, much the same as memories locked away in your mind. They remained private, not for public consumption; the intensely personal feeling surrounding diaries is one example of this.
The Web has begun to change all that. Suddenly, our movements and actions, our typed utterances, are preserved indefinitely on disks in Internet servers, usually far from where we are. Those records, those memories of our actions, are out of our control, easily copied, and impossible to completely erase.
Winter writes, “The assumption that every online comment and transaction is preserved somewhere, never to be forgotten, could suppress public speech and civic participation in ways that we could never calculate.” Indeed, the belief in a private self and a private mind is a cornerstone of freedom and democracy. What is it to be free, after all, if there is no definable “self” to be free in the first place? And with the possibility that we could be held accountable for our every thought and utterance (those committed to the ether of cyberspace, that is), the looming perfect-memory future makes St. Peter’s book of deeds at the Pearly Gates look like a juvenile novelization at best.
Solutions? Legislation, maybe. Mayer-Schönberger argues in “Useful Void” that laws requiring Web-based companies to purge certain kinds of information after a set period might be the answer. Others counter that legislation is the wrong tool, and some argue we should do nothing at all: let technology run its course, and let humans adapt to the world they have created.
This raises all sorts of interesting ideas. Normally, we adapt to the natural world, but in this case we would be adapting to something we ourselves created, a technology so pervasive that it has become a force of nature. Are we comfortable with that, with letting a technology determine what humans will become?
The solution I like best came from Alessandro Acquisti, a professor at Carnegie Mellon University. He believes there are two competing forces at work: the rate at which we are producing data and the rate at which we are sorting it usefully. If the rate of production outpaces the rate of sorting, we will eventually create our own sort of Borgesian library, in which forgetfulness will be a product of information overload instead of human imperfection.
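Acquisti’s race between two rates can be made concrete with a toy calculation. The numbers below are purely hypothetical, chosen only to show the arithmetic: whenever production exceeds sorting by any fixed margin, the unsorted backlog grows without bound, and the pile itself becomes a kind of forgetting.

```python
# Toy sketch (hypothetical rates) of the production-vs-sorting race.
# If we produce data faster than we can usefully sort it, the unsorted
# backlog grows linearly with time and never shrinks.

production_per_day = 100  # hypothetical units of new data created each day
sorting_per_day = 90      # hypothetical units we manage to sort each day

backlog = 0
for day in range(365):
    backlog += production_per_day - sorting_per_day

print(backlog)  # 3650: a year's worth of steadily accumulating, unsorted data
```

With these made-up numbers, a modest 10-unit daily shortfall leaves 3,650 units unsorted after one year, and the gap compounds forever: Borges’s library, built one day at a time.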