The New York Times gives us this morsel today, from writers Matt Richtel and Ashlee Vance:
There is nothing new about frustration with start-up times, which can be many minutes. But the agitation seems more intense than in the pre-Internet days. Back then, people felt less urgency to log on to their solitary, unconnected machines. Now the destination is the vast world of the Web, and the computer industry says the fast-boot systems cater to an information-addicted society that is agitated by even a moment of downtime.
The article equates the PC-makers’ rush to get boot times down with automakers’ attempts to narrow the time gap between 0 and 60 miles per hour.
I’m almost embarrassed by this labeling of our culture as speed obsessed, so impatient that even a few minutes (or seconds) of waiting for a computer to boot is an eternity of wasted time. I’m embarrassed, but I know it’s an accurate label. I have waited those endless seconds while my computer does something that is taking, no doubt, a reasonable amount of time. In those moments, I really feel like my computer has it out for me, that it wants to kill me with frustration.
But I’ve learned to use those times when nothing else is happening. I consciously switch my mind into “thinking” mode, instead of “computer” mode (where my mind resides far too often). I’ve learned that the moments in between activities can be very useful, and interesting insights are often found in those “liminal” spaces — to use a term from my undergrad days.
Business Week reported on Oct. 19 that, if elected, Barack Obama will likely appoint the first Cabinet-level chief technology officer because he feels the country “is not doing nearly enough to create jobs through technology.” The CTO’s job would be to expand broadband service to even more parts of the U.S., especially rural areas into which broadband doesn’t yet penetrate.
On his blog, Andrew Keen, author of Cult of the Amateur, responds to claims that expanding broadband is comparable to the building of railroads in the 1800s. Keen, who is often critical of the Web, writes:
[B]roadband provides a very different kind of transportation — one that allows individuals to escape their physical communities, to create virtual loyalties, to lose their identities in the narcissistic chaos of cyberspace.
I can’t agree wholly with Keen. Surely more broadband access in rural areas will not kill everything local. People will not turn into keyboard-potatoes, wasting away in front of their computer screens without ever visiting their local stores or picking up the local paper. Besides, the benefits of greater connectivity in currently unconnected areas — provided people aren’t just using the Internet for eBay and celebrity news — will be too great to ignore.
Psychologists at the University of Montreal will soon begin studying Internet addiction.
Their study will focus on teens who don’t leave home, who don’t have relationships with other human beings and who “only speak in the language of the characters they play with in network video games” — no doubt a subtle prod at MMORPGs, such as World of Warcraft, and their copious jargon.
Professor Louise Nadeau, head of the university’s new addictions center, said, “There is no reliable study or clinical data on the issue. … We are starting from scratch.”
It seems to me that Internet addiction was a big deal about a decade ago, when people first started studying it. For example, The Chronicle of Higher Education linked to three articles from its archives on Internet addiction, one published as early as 1998.
Perhaps Nadeau and her colleagues are looking at Internet addiction in a different way, taking a path that hasn’t been academically trod yet. I don’t know. But it’s a bit unfair and inaccurate for her to say that the University of Montreal is “starting from scratch” on this.
A new bit of research news has brought up old memories for me today. Researchers at the University of California have found that an area of the brain called the perirhinal cortex may help with the formation of associative memories.
British neuroscientist Baroness Greenfield points out that prescriptions for drugs like Ritalin and diagnoses of ADHD are on the rise. She correlates that with an increase in computer use over the past decade, asks a few open-ended questions and implies that computer use is rotting children’s brains.
I don’t doubt that computers will change how we think. I don’t doubt that they already have changed how we think. I, too, read Nicholas Carr’s essay in the Atlantic and silently nodded in agreement for most of it. Yes, I find it harder now than I once did to sit down and read for extended periods of time, or to read without skimming paragraphs that seem unimportant — but I owe that to years spent in grad school and not to years spent on the Internet.
“Law enforcement agencies at every level are exploiting fears about terrorism and child safety to encourage lawmakers to strip away statutory privacy protections for library records,” says the ALA. “This eliminates anonymity in the library, and encourages the mindset that ‘good’ people should have nothing to hide.”
That notion is ridiculous. There’s a sharp difference between “hiding” something and just wanting to keep it private, and a desire for privacy should never be taken as collusion with some mythical terrorist enemy. Go ALA!
A recent article by Jessica Winter in the Boston Globe reminds us that the importance of forgetting has not been forgotten.
A phenomenon supposedly innate to the human mind, forgetting is becoming less common as our capacity for digital remembering increases, threatening to make our “memory” perfect.
Wait, isn’t forgetting bad? Continue reading