How much do you know how to know?

Google’s Star Trek doodle.

Jonathan V. Last’s article “Google and Its Enemies” in The Weekly Standard got me thinking about knowledge. Or rather, not knowledge itself so much as the retention of knowledge. What does it mean to be knowledgeable in a web 2.0 twenty-first century? Is it a matter of how much you know, or how much you know how to know?

Take web development, for example. When I start building a website, the development process is largely conceptual. That is, while I understand the programming languages required to create pages, I don’t retain a whole lot of the syntax itself in my head. Now, I’m pretty sure this kind of limited retention wouldn’t work for, say, a non-native speaker of Japanese: she’s got to have a good deal of the language memorized in order to communicate. She can’t minimize the person to whom she’s speaking and Google a few common phrases to keep up with the pace of the conversation (except maybe on an instant messenger). But when I’m developing, I do this sort of thing all the time, because I’ve got the web at my disposal, and there’s no point in cramming thousands of values and properties into my head when I can access them on a whim. In fact, most of the time, I don’t even need to think through the problems I encounter along the way. Say I need an alphabetized index in WordPress: why should I build it myself when I’m positive someone else has already built one? If I can borrow open source code and plug it into my project, I get more time to work on the project itself, of which an alphabetized index is only a small part.
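For the curious, an A-to-Z index mostly boils down to grouping titles by their first letter. Here’s a minimal sketch of the idea in Python rather than WordPress’s PHP, with made-up post titles; exactly the kind of wheel a plugin saves you from reinventing:

```python
from itertools import groupby

def alphabetized_index(titles):
    """Group post titles under their first letter, A-to-Z archive style."""
    titles = sorted(titles, key=str.lower)
    return {
        first.upper(): list(group)
        for first, group in groupby(titles, key=lambda t: t[0].lower())
    }

# Hypothetical post titles, purely for illustration:
posts = ["Zen and CSS", "aperture science", "Borg Economics", "Apple Pie"]
for letter, titles in alphabetized_index(posts).items():
    print(letter, titles)
# A ['aperture science', 'Apple Pie']
# B ['Borg Economics']
# Z ['Zen and CSS']
```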

So this is common sense, right? Maybe, maybe not. I’ve got a feeling that with the advent of web 2.0, our methods of acquiring, processing, and storing knowledge are going to have to change fundamentally, and not just for programmers. I mean, did you ever think about how a Star Trek character like Jean-Luc Picard can be a Starfleet commander, archaeologist, and Shakespearean literature buff, all while remaining knowledgeable about the diplomatic protocol of hundreds of alien cultures? (The answer is not: “Jean-Luc Picard is a fiction, you nerd.”) The answer has got to be: the Google of the twenty-fourth century. You know, the U.S.S. Wikipedia they ask questions of every episode?

We mere Internet junkies aren’t the only ones being inundated with information. Even as I write this, physicists are preparing for an information overload as the Large Hadron Collider (LHC) goes online to blast the Standard Model to bits. “The nearly 100 million channels of data streaming from each of the two largest detectors would fill 100,000 CDs every second, enough to produce a stack to the moon in six months,” writes Graham P. Collins in “The Future of Physics,” his Scientific American article. “So instead of attempting to record it all, the experiments will … [discard] almost all the information and [send] the data from only the most promising-looking 100 events each second to the LHC’s central computing system at CERN … for archiving and later analysis.” Hell, I don’t even have time to learn how to use my iPod, let alone deal with a moon-sized stack of CDs that could quite possibly contain information about dark matter or mass-imbuing Higgs particles!
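Out of curiosity, that stack is easy to sanity-check. A back-of-the-envelope sketch, assuming the standard 1.2 mm disc thickness and the moon’s average distance of 384,400 km (those two figures are mine, not the article’s):

```python
CD_THICKNESS_M = 1.2e-3        # a standard CD is about 1.2 mm thick
CDS_PER_SECOND = 100_000       # the raw data rate quoted by Collins
MOON_DISTANCE_M = 384_400_000  # average Earth-moon distance

growth = CD_THICKNESS_M * CDS_PER_SECOND          # stack growth, m/s
days_to_moon = MOON_DISTANCE_M / growth / 86_400  # seconds -> days
print(f"The stack grows {growth:.0f} m/s and reaches the moon "
      f"in about {days_to_moon:.0f} days of nonstop streaming.")
```

That works out to roughly 37 days at full blast; the article’s six-month figure presumably reflects the collider not streaming around the clock. Either way, almost all of it gets thrown away.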

Multimedia and gadgetry are all around us, begging for our attention. I’m reminded of a Time article I read recently called “Bringing Up BlackBerry,” in which the author, Nancy Gibbs, compares our relationships with our various “gizmos” to our relationships with our children: “Do our devices really make us more efficient or less so? Do they bind us—or isolate us, becoming screens against intimacy and contact… Like our children, they are little miracles, whose workings we can’t really understand, as they make our lives bigger in surprising ways…”

My point, I think, is that “rote” memorization is becoming obsolete, something we need to spend less and less time on as worldwide, random-access information systems like the Internet evolve and integrate with our daily experience. I’m not as hopeful or forward-looking as futurist visionaries like Michio Kaku, who in his 1997 book Visions predicts that we will have room-temperature superconductors, maglev trains, and fusion power by 2050. But who is to say that in a century or two (maybe three or four?), bothering to learn how to speak multiple languages won’t also become obsolete, thanks to “web 9.0” technologies on hand to do it for us?

