Science Daily brings us “news” of the impending digital dark age, courtesy of Jerome McDonough at the University of Illinois at Urbana-Champaign. I write this with some sarcasm because the Digital Dark Age is not exactly news; people have been speculating about it for years, probably ever since some guy realized that he needed something off a 5.25-inch floppy disk in 1999.
But this paragraph from the story really caught my eye:
“E-mail is a classic example of that,” he said. “It runs both the modern business world and government. If that information is lost, you’ve lost the archive of what has actually happened in the modern world. We’ve seen a couple of examples of this so far.” (From ‘Digital Dark Age’ May Doom Some Data)
It made me dream up some kind of post-apocalyptic science-fiction-fantasy story, in which people have lost all modern knowledge because it was all recorded on computer files, on computers that don’t work after the world-ending event.
It’s just a shade of a story for now; perhaps more will come later.
The Chronicle of Higher Education reports researchers have developed a way to use a computer and image software to analyze a painting and determine whether it is genuine or a forgery.
The experiment was performed on 101 high-resolution scans of Van Gogh paintings from museums in the Netherlands. The program analyzed the artist’s brush strokes and created an algorithm to describe Van Gogh’s style.
The researchers told the CHE that while the algorithm was pretty successful, it could still be confused by the variety of brush strokes that the artist would use in a single painting.
An algorithm to describe the way an artist paints. It reminds me of the idea that a million monkeys sitting at a million typewriters will eventually produce Hamlet; it’s one of those attempts to rationalize artistic expression, to quantify it and explain it. All the better to sell it with, I suppose.
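The article doesn’t spell out how the researchers’ algorithm works, but the general idea of reducing brush strokes to numbers can be sketched in a few lines. Below is a toy version, assuming a much simpler feature than whatever the researchers actually used: a histogram of gradient orientations (roughly, which directions the “strokes” in an image run), compared by nearest neighbor. The function names and the feature choice are my own illustration, not the study’s method.

```python
import numpy as np

def stroke_features(img, bins=8):
    """Toy brushstroke descriptor: a magnitude-weighted histogram of
    gradient orientations in a 2-D grayscale image array. This is a
    hypothetical stand-in for the researchers' real analysis."""
    gy, gx = np.gradient(img.astype(float))      # intensity change per pixel
    angles = np.arctan2(gy, gx)                  # local "stroke" direction
    mags = np.hypot(gx, gy)
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi),
                           weights=mags)
    total = hist.sum()
    return hist / total if total else hist       # normalize for comparison

def nearest_style(query, references):
    """Return the label of the reference feature vector closest to `query`."""
    return min(references, key=lambda lbl: np.linalg.norm(query - references[lbl]))

# Toy data: one image with horizontal intensity bands, one with vertical.
rng = np.random.default_rng(0)
horiz = np.tile(np.sin(np.linspace(0, 20, 64)), (64, 1))
vert = horiz.T
refs = {"horizontal": stroke_features(horiz), "vertical": stroke_features(vert)}

noisy = horiz + 0.01 * rng.standard_normal((64, 64))
print(nearest_style(stroke_features(noisy), refs))   # prints "horizontal"
```

A forgery detector built this way would flag a painting whose feature vector sits far from every authentic reference, which also hints at why the real system struggled: a single Van Gogh canvas mixes many stroke styles, so one feature vector per painting blurs them together.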
Talk about tricksters. Several news agencies are reporting (here and here, for example) this morning that some significant portions of the opening ceremony of the Olympics were faked. The cute little girl who sang the Chinese national anthem was lip synching; the real little girl singer, it seems, had crooked teeth and wouldn’t look good on TV.
And the stunning fireworks display that spanned the length of Beijing was both real and faked. While observers on the ground saw the real fireworks, computer-generated versions of them were interspersed with live television footage. That means that most of the world saw a Michael Bay-like opening ceremony that a Chinese special effects house spent the past year perfecting.
All those involved say the substitutions and trickery were in the national interest; most of them have now been threatened with prison time for squealing, too. Meanwhile, the Chinese version of the Internet is buzzing with criticism; at least the situation is provoking some online debate, even though the country’s online censors are deleting the posts as fast as they can be posted.
A recent article by Jessica Winter in the Boston Globe reminds us that the importance of forgetting has not been forgotten.
A phenomenon supposedly innate to the human mind, forgetting is becoming less common as our capacity for digital remembering increases, threatening to make our “memory” perfect.
Wait, isn’t forgetting bad?
New York Times writer Benedict Carey reports that the ways people choose to tell their life stories do more than get information across. The stories fall into predictable patterns based on the psychology of the teller, and those patterns reflect the tellers’ present lives and future ambitions.
“Every American may be working on a screenplay,” Carey writes, “but we are also continually updating a treatment of our own life.”
Researchers at Northwestern University said these “screenplays” guide our behavior and depend on our mental state. People with mood trouble, for example, might remember the past brightly, but with certain good situations somehow marred. Civic-minded people might see life as a series of redemptive struggles. And so on.
The way people remember individual scenes of life alters the entire story, researchers said. Which leads Carey to a conclusion that literary theorists have known for many, many years: “Seeing oneself as acting in a movie or a play is not merely fantasy or indulgence; it is fundamental to how people work out who it is they are, and may become.” In other words: “The play’s the thing…”
In 1945, Vannevar Bush published his now famous article in the Atlantic entitled “As We May Think.” In that article, he proposed the idea of the memex, a computer-like device that would record its user’s interactions with the world for easy retrieval later. The system was based on microfilm (it was 1945, after all), and was meant to emulate human memory’s associative powers. The memex became the working inspiration for hypertext technologies, which now drive the Web and its 2.0 applications.
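Bush’s “associative trails” are easy to see in miniature: records that link to other records, so retrieval follows associations rather than an index, exactly the structure hypertext later adopted. Here is a small sketch of that idea; the names (`Record`, `link`, `trail_from`) are my own illustration, not terms from Bush’s article.

```python
class Record:
    """One item in the memex: a titled document with associative links."""
    def __init__(self, title):
        self.title = title
        self.links = []          # links to other Record objects

    def link(self, other):
        self.links.append(other)

def trail_from(record, seen=None):
    """Follow links depth-first, yielding the trail of associated titles.
    Tracks visited titles so circular links (like web pages) terminate."""
    if seen is None:
        seen = set()
    if record.title in seen:
        return
    seen.add(record.title)
    yield record.title
    for nxt in record.links:
        yield from trail_from(nxt, seen)

a, b, c = Record("As We May Think"), Record("memex"), Record("hypertext")
a.link(b); b.link(c); c.link(a)   # links may loop back, as on the Web
print(list(trail_from(a)))        # ['As We May Think', 'memex', 'hypertext']
```

The cycle-safe traversal is the whole trick: association, unlike an alphabetical index, loops back on itself, which is precisely what makes it feel like memory.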