I no longer celebrate Halloween. The holiday is for me wrapped in so many memories of my mother, and she's dead.
Just kidding. It's true that I don't celebrate Halloween. It's also true my mom is dead. But they're not related. I don't celebrate Halloween anymore because I like growing older and think maturity has some much-underrated benefits.
Halloween is for adolescents. Many have argued that it's also a holiday for the young at heart. But in its current manifestation, Halloween—with its cartoonish representations of the supernatural and costumes that have as much to do with wish-fulfillment as anything else—actually feeds an adolescent mindset, one that sees the world as a field of options in which anyone at any time can be a superhero or a sexy nurse or offensive.
I won't bore you with yet another history of Halloween. The Dauphin County Library System has a perfectly serviceable narrative of the holiday's origins, if you're interested. But I will ask you to consider Halloween the first mediated holiday in recorded memory.
All tools are media. Anything that enhances our native human abilities or performs a job that we're otherwise incapable of is media. That includes the hammer, which enhances our hands' ability to pound things, and the airplane, which enhances our legs' ability to propel us, not to mention giving us the ability to fly. It also includes things we traditionally call media, such as painting, television, and the Internet, which enable us to be in several places and times at once through representation.
Therefore, when our ancestors painted themselves, built bonfires to dance around, and did similar things that we now refer to as wild and primitive, they were practicing mediation. They were surrounded by death, and with autumn they knew they'd be surrounded by darkness for months to come. They had few options. They had only some paint and fire to keep winter and death at arm's length.
Just as humans adapt to new tools, machines evolve—or, rather, their manufacturers adapt to technological advances, changing behaviors, and market conditions. Howard Rheingold explained this dance five years ago in his book Smart Mobs.
That the effects of text messaging on Japanese and Scandinavian societies have not appeared fully in the U.S. shows the impact money and policy have on technological behaviors. Business and government are very much part of the information ecology. Laboring under a hard-wired legacy telephone system, U.S. telecommunication carriers are loath to write off their 20th-century investments, while competition has made them reluctant to share or standardize their technologies.
But I have to wonder what will happen if our nation's mobile technology and practices ever sync with the rest of the world. Because, the example of the metric system aside, we can't afford to sit this one out indefinitely, can we?
My guess is that even as the devices get more powerful, information packets will get smaller, akin to SMS. If well tagged and linked according to an open descriptive system, such as XML or RDF, even the smallest bits of information could be powerful because of their connections to others, creating a mass of accessible data. Furthermore, the information could be easier to encode accurately and to maintain, because the packets are small, and might not require traditional inputs, such as a keyboard.
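To make the idea concrete, here is a minimal sketch of what such a small, well-tagged, linked information packet might look like. The schema is entirely hypothetical (my own invention, not an existing standard), and it uses only Python's standard library to build and read back tiny XML packets whose value comes from their links to one another:

```python
import xml.etree.ElementTree as ET

def make_packet(packet_id, text, links):
    """Build a tiny, SMS-sized information packet as XML,
    linked to other packets by id (a made-up schema)."""
    packet = ET.Element("packet", id=packet_id)
    ET.SubElement(packet, "text").text = text
    for target in links:
        # Each link points at another packet's id, so even a
        # one-sentence packet gains value from its connections.
        ET.SubElement(packet, "link", ref=target)
    return ET.tostring(packet, encoding="unicode")

# Two small packets whose usefulness comes from their connection
p1 = make_packet("p1", "Library closes at 9pm Fridays.", ["p2"])
p2 = make_packet("p2", "Summer hours start June 1.", ["p1"])

# Parsing a packet back recovers both the content and its links
root = ET.fromstring(p1)
print(root.find("text").text)
print([link.get("ref") for link in root.findall("link")])
```

Note that nothing here requires a keyboard on the consuming end: a device could assemble, route, or display packets like these automatically, which is the point of keeping them small and well described.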
It's all speculation, of course, and it'd require a major adaptation to the technology from us. We'd have to accept a model in which information is constantly flowing through and communicating with our mobile device, rather than existing natively on it—not an easy proposition for a society still struggling with intellectual property and the concept of ownership.
The notion that the Internet is changing people's relationship to information is not new. The principle of good enough is now well documented as applying to both design and consumption on the Web. And reams of deep-log analysis conducted by the Centre for Publishing at University College London's School of Library, Archive and Information Studies show clearly that even serious academic researchers are not immune to the frenetic skimming and jumping-around behaviors the rest of us exhibit on the Internet.
What excites me, though, is the potential next evolutionary stage in our relationship with digital information: from the hunter-gatherer behaviors described above to a sower-reaper model. We have only begun to exploit the potential uses of XML-based technologies, such as RSS feeds, which push only the information we want to us. And we are approaching a critical mass on social networking sites, which have built-in trust metrics for referral—likewise a way to harvest information without having to search blindly.
Just as Google harnessed and exploited the Web's linked structure to improve search, XML and social media harness and exploit the human desire to save time in the face of mass information and connect with like-minded people. What impact might these tools have, both good and bad, if refined?
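The way Google harnessed the Web's linked structure can be sketched as a simple PageRank-style iteration. This toy version (a simplified illustration over a made-up four-page web, not Google's actual algorithm) shows how links alone, with no content analysis at all, produce a ranking:

```python
# Toy PageRank: rank pages purely by who links to whom.
# links maps each page to the pages it links out to (a made-up mini-web).
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start everyone equal

for _ in range(50):  # iterate until the ranks settle
    new_rank = {}
    for p in pages:
        # Sum the rank flowing into p from every page that links to it,
        # with each linking page splitting its rank among its outlinks.
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

# Page "c" collects the most inbound links, so it ranks highest.
best = max(rank, key=rank.get)
print(best, rank)
```

The same structural trick underlies the social-media trust metrics mentioned above: a referral from a friend is just an inbound link with a face attached.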
Back in his heyday, before moving to that retirement home for DJs, satellite radio, Howard Stern proved that you could give yourself a nickname and make it stick. It didn't hurt that he was the titular focus of a syndicated radio show that held huge shares in most major media markets. But still, he managed to persuade nearly every person who uttered his name to preface it with "King of All Media," virtually willing a mildly entertaining book and follow-up film into best-sellerdom. Did anyone else find that annoying? We should thank him, though, for launching the career of Paul Giamatti.
In stark contrast is Neil Gaiman, who really could lay a legitimate claim to media monarchy and who's done it without the advantage of a mic plugged into millions of ears. He started as a freelance journalist before raising the bar, and profile, of comic book writing with the Sandman series. He has since written popular and award-winning books, short stories, screenplays, and stage and radio plays in a logorrheic display of almost Asimovian proportions.
What enables him to move so effortlessly between media can be described only as intuition. Many have thought about and successfully described the subtle differences between media—what makes a good book vs. a good film vs. a good TV show. In fact, the bad-books-make-good-movies paradigm has been explored to death. But Gaiman is one of the few who's done it, who's consistently produced artistically and popularly successful works in multiple media. And nothing highlights his intuition more than his blog.
Gaiman was in the vanguard of blogging and continues it, almost daily, as a labor of love and outreach to his fans. The blog now reportedly registers more than a million unique hits each month. It's got the typical bloggy elements—brief news items, recommendations, links to neat sites, travel photos—but where it succeeds most, and puts Gaiman's media intuition on display, is in its tone. His blog is friendly and intimate. It's completely devoid of the creepy or salacious feeling of reading someone's journal on the sly. Rather, reading his blog feels like reading a letter from a friend.
Through his blog Gaiman gives his readers a glimpse of a successful but hard-working writer's life, complete with undoctored photos of his dark and baggy eyes. We cheer for him because of those circles and don't begrudge him success because of those bags. And we are charmed by him when he asks his precocious adolescent daughter to guest-blog in his absence. Above all, we feel while reading his blog as if we're in on a joke or a gentle secret—even if it's with a million other people.
They are media queens. They are familiar with everything it has to offer by way of example, insight, and reflection. . . . They are not just an audience of passive consumers. They are not even merely judges—though, Lord knows, they are that too.
They can do it themselves. They are performers.
As a result, I can forgive 2007 Dartmouth grad Alice Mathias, who tells us in her October 6 New York Times op-ed that people who treat Facebook as anything but a lark are wrong. It is and always was intended to be, she tells us, "a circus ring" and any attempt to turn it into "a legitimate social reference guide" threatens it.
There is a huge aspect to Facebook that entails play—and far be it from a Playful Librarian to discourage play. But the point of play is that it can be both fun and useful. Elements of play should infuse our home and work lives, especially since digital tools now enable us to blur the lines between work and home.
As of May 2007, 39% of unique Facebook users are over 35—the largest and seemingly most unlikely demographic. They didn't grow up with it and didn't use it in college. But they are using it now and in growing numbers, and it's likely they're using it to keep in touch with friends, family, and business contacts, all at the same time.
Ms. Mathias ignores these statistics, probably because they don't fit within her experience. That's the danger of social media: it convinces those who use it uncritically that they are at the center of everything, that it's all about them. And with the publication of Ms. Mathias's op-ed we can pinpoint the coming of age of the Facebook generation.
Blogging is not a creative act. Neither, for that matter, is the use of any social media, such as videocasting on YouTube. They require no more inherent creative talent than the ability to read, write, and fiddle with a bit of machinery. A post might exhibit creativity or exist for a creative purpose. But its existence might just as easily be instructive or persuasive in nature.
The true quality common to all social media is that it is expressive—it's an act of outreach, of communication. And for all my talk of content as king—which I still firmly believe—it's in that act of communication that the human interest lies. For it's in the content that we sense the personality and glimpse the person behind the machine.
We as readers or viewers intuit the risk and reward of that outreach. Sometimes we are moved to laugh or cry at a particular blog post, or we cringe at a misguided attempt to evoke a reaction. Often we just roll our eyes at yet another video of a dad getting hit in the balls by his toddler. But we're always intrigued and we imagine ourselves alongside that person reaching out to us.
And it's those who keep that illusion alive the longest—who convince us that we're center stage with them, that we're an essential part of the experience—who know how to use and manipulate social media.
MySpace, Facebook, and other similar sites are social networking tools. That "social" is key. It implies individuals communicating, making connections for work or recreation. But libraries are not individuals; they are institutions. Institutional networking makes me think of conferences. So why have so many libraries built institutional pages on MySpace?
I use Facebook myself. A friend who's in a similar line of work convinced me to join, and I'm haltingly incorporating it into my daily digital experience. When I visit Facebook, it's usually to check up on that friend—he's far more active than I at updating his page—and the handful of other people who've accumulated in my friend network. I would never think to see what my local library is up to, nor would I add it to my network. Making a building my Facebook friend would lend a little too much surreality to an already artificial act.
And social computing is artificial. It's a form of disembodied communication, in which there's no corporeal proof of each person's presence, such as touch or a voice. This quality strains our communicative faculties, as evinced, for instance, by how oddly self-revealing some people are online. People need to believe that there's a person at the other end. By baldly creating institutional pages on social networking sites, libraries signal to a computing public already weary of Internet marketing that they are one more thing to ignore.
If libraries were really serious about using social computing tools, they would encourage their librarians to create individual MySpace profiles and set aside time each day at work to maintain their pages. It would make the librarians—and, by extension, their libraries—active and relevant members of the Internet ecology. While I might not add my library to my network, I would digitally befriend my local librarian.
The current digital information environment emphasizes context over content. As Google has shown, a critical mass of content and user data will overcome quality, producing search results that are, if not the best, at least good enough. Plus the results come fast. And social computing brings trust into the equation through such tools as Digg and del.icio.us, making referrals again relevant, making the individual again relevant.
Within this digital environment, function and behavior become the two main components of context. Once a tool meets—or transforms—behavior patterns through its functionality, the tool becomes commonplace and dominant. How do people want to find, access, capture, use, and reuse information? The organization and tool that answers this question wins.
Context has always been king, though. Consider, for instance, academic journal publishing. Publishers have long thrived by providing a context for scholarly communication. The content itself was not the commodity; after all, academic authors have rarely been paid for their academic articles. What was valuable was how the content was delivered and what future content it spawned. Content without context really never has been enough.
There is an inherent logical flaw built into the concept of such business networking sites as LinkedIn. The typical person most likely to actively populate the service—to take the time to construct and fully maintain a profile—is the social/professional climber.
In other words, the most active members of professional networking sites are likely to be those not satisfied with their current places in the pecking order: climbers. Therefore, they are most likely to reach others of their ilk, rather than those they should be reaching: the people above them in the professional pecking order. While there may be some incentives for those who've reached the summit of their careers to participate in networking sites, none of them involve climbing. Such sites then risk becoming a flat social structure populated by a self-selecting group, none of whom are influential enough to form a chain to the summit.
I am not suggesting, however, that professional networking sites are pointless. Rather, you should consider this flat structure in how you use them. Build your strategy around it and use it to advantage. Flat networks benefit contractors and consultants, who use them as information conduits to find more work, better work, or new work. And in the current information- and service-based economy, it benefits all workers to think of themselves as consultants, even if they have full-time salaried positions.
Cereal commercials made a big impact on me as a kid. I don't remember craving one brand. I don't even remember a single commercial—though the Mikey commercials stand out as a genre unto themselves.
What I remember most about cereal commercials then was that the cereal was depicted as pouring directly from the box into a bowl. No plastic lining, no sealed inner package, just a cardboard box spouting dry cereal.
I must have driven my mother nuts arguing for the 400th time—as only children can—that she was supposed to empty the contents of the sealed bag into the box before putting it in the cupboard. This after dealing with me for a few hours in a crowded supermarket.
I don't remember how she responded each time. No doubt she replied reasonably about keeping the cereal fresh and bugs out. Still, my insistence and persistence must have confused her.
So, the cereal's relationship to the box mattered more to me than the cereal itself. Context over content. But what purpose did representing the cereal being poured directly from the box serve the advertisers and cereal company, if any?
"They've got their radical factions, like the Ruby Ridge or Waco types."
fundamentalism, freedom | IDEOLOGY | open access, ownership
US government           | POLICE   | US government, copyright holders
Saddam, Bin Laden       | ENEMY    | Napster, librarians
In this scenario, the "enemy" category is the worst place to be. Get a file sharer and ten more will spring up in their place. File sharers are a phantom menace. But librarians have a real role. We are real people with jobs and organizations. We have real things to lose, which is what makes us real targets. Publishers think: get one of us and the rest might back off.
Librarians need to be prepared to defend our right to offer information access for the common good. And what we are armed with is what we are best at: information, a sense of history, and the good will of a public that sees that our interest is theirs.
I love design. I love its principles, and I love the impact it has on the world. In order to use it effectively, consider it thoughtfully, and pull meaning from its principles, however, we need to understand what it is.
The primary objective of design is to inspire action, evoke thought, or provoke emotion. Sometimes design wants your money. Sometimes it wants your vote. Often it wants you to feel strongly for or against something. Usually, it wants some combination of all of the above—and more.
Most of all design wants to affect action, and to do so effectively it focuses on context rather than content. Design principles are about creating an environment through which action can be induced. The content is another matter altogether.
Therefore, design itself has little to do with ethics. Design can be put to good uses and bad uses. But design itself is only good or bad based on how effective it is.