libraries | play | information | media | policy | culture

2008-01-30

It's So Important, There's an Institute for It

Kid-tested and mom-approved, play is no laughing matter to IDEO's Brendan Boyle, famed primatologist Jane Goodall, organizational guru Kevin Carroll, and the other researchers Stuart Brown has assembled to head the National Institute for Play.

The institute, founded by Dr. Brown, seeks to develop projects that study the physiological, psychological, and behavioral state of being known as play, and to use the knowledge gathered to implement social change. According to the Institute's site:

The National Institute for Play believes that as play is woven into the fabric of social practices, we will dramatically transform our personal health, our relationships, the education we provide our children and the capacity of our corporations to innovate. We see play as an un-realized power that can transform our social and economic lives.

It's about behavior and communication. It's about outlook and growth. It's about not getting eaten by a polar bear.

2008-01-28

It's a Book! It's a Brain! It's . . .

Thanks to a segment this past weekend on NPR's On the Media, I learned about some federal court cases, reported in the Washington Post and the New York Times, in which the government's right to search our hard drives has been questioned.

Turns out metaphors really matter.

If a computer is considered an extension of our brain, as any good reader of McLuhan would consider it, then the Fifth Amendment's protection against self-incrimination prevents the government from demanding that we provide a password or encryption key to gain access to our computer devices—no matter what naughty bits we might have in there. If, on the other hand, a computer is no more than a super-powered notebook or file cabinet, then the government can demand we hand over the key and have at it.

At stake are innumerable issues. In terms of legal epistemology, at what point does an object differ enough in degree so as to constitute a different class of thing altogether? After all, though a computer can be thought of as a notebook—in fact, some laptops are classified as notebook computers—it can contain so much more, including highly private information, such as medical records. But does that really elevate it to the status of brain?

In addition, these court cases highlight another unspoken issue in American society: that no one, no matter their nationality, has any rights at our borders. Border security is such that law enforcement can search and detain anyone. And if you refuse to comply by, say, declining to offer your password so they can search your computer, they can detain you indefinitely or turn you away. Who, then, would refuse, even on principle, knowing that they will miss a connecting flight, miss a day of work, or be prevented from returning home?

Finally, these cases point out a major generational flaw in our judiciary. As Adam Liptak notes:

I think that appeals court judges are older, don't much work in the digital world. I was having lunch the other day with two appeals court judges here in New York who told me that when they wanted to communicate with another judge in the court, they faxed a memo over. I said, does anyone use email? He goes, no, no, no. Email is too modern for us.

If judges are going to weigh in on such matters, shouldn't they be at least familiar with, if not actually conversant in, all things digital?

2008-01-27

Seek and Ye Shall Find

Thanks to a tip from Nate over at Catch and Release, I watched a video that offered some answers to my last, somewhat open-ended, post.

Sometime around 10th grade, we all learned in history class about the social phenomenon of the London coffeehouse. And as the video's narrator reminded me, the European coffeehouses that arose in the 17th and 18th centuries thrived in large part as places for literate merchants, tradesmen, and barristers to start their day gossiping and arguing over current events, as they were related in newspapers and pamphlets.

In other words, the still new and evolving technology of mass-produced print directly resulted in people seeking more face-to-face contact. Most significantly, these ad hoc gatherings were conducted across class lines.

2008-01-25

Whatever Happened to Far-Flung Work Teams?

Network technology and digital communications were supposed to have profound implications for how business is conducted. They have, of course—just not in the ways people expected.

Among the predictions were that business travel would diminish or disappear; that startups would be able to, well, start up in depressed and less-expensive regions, thus benefiting both the nascent company and the region; and that workers would be able to shun expensive cities in favor of logging into their jobs from healthier, cheaper locales.

But as Tim Harford argues, the more wired and wireless we become, the more we want to stay where the action is: in dirty, expensive cities.

Harford makes some good points. Though airlines continue to fail and Blackberries continue to improve, business travel doesn't seem to be abating any time soon. And in my IM-enabled office, people are just as likely to follow up a message by going over to their coworker's cubicle to talk.

I'm not certain Harford is right—that technology built to enable remote, isolated communication actually forces us into more face-to-face contact; a study would have to be designed to measure the true effects. But I wouldn't be surprised if he were. Clustering is a common phenomenon in business.

Ever notice how major cities have merchant districts, such as a diamond district or a textile district, which seem to run counter to every assumption about direct competition? And let's not forget the clustering instinct of immigrants. Perhaps colleagues who share common goals and complaints under the banner of their company similarly need to stick together.

2008-01-24

Would You Like to Smell the Bottle Cap?

There was a time that Overheard in New York truly amused me. Because, as each of us knows too well, we've all said some pretty stupid things in our most unguarded moments. It was a comfort to read the stupid things others have said. We laughed in recognition, not just at them but with them.

Then too many people started trying to show how clever they were by submitting clearly made-up quotes.

But thanks to Linkbunnies, via BB-Blog (two great blogs, by the way), I've now got a new favorite site, White Wine.

2008-01-20

J-Pop

Media theorists, such as Howard Rheingold, have looked in recent years toward Japan as a bellwether for human-computer interaction. Some combination of their cultural homogeneity and their recent history, including rapid and artificial moves from imperial society to industrial society to consumer society, makes the Japanese a kind of media fruit fly in the minds of Western sociologists of technology.

It's no surprise, then, that a front-page article in today's New York Times is devoted to the latest J-Pop phenomenon, cellphone novels. The novels—composed and read predominantly by young women on their mobile phones—have circulated among Japanese youth since about 2000. They are composed of short sentences and bursts of text, sometimes use emoticons, often have little plot or character development, usually feature a fatal disease in the narrative, and were considered a noncommercial fad. Until year-end publishing figures for 2007 revealed that half of the Japanese top ten books were republished book editions of cellphone novels.

While reading this fairly long article, I wondered, as any armchair media ecologist would, whether popular artists must not only compose for a medium but also on that medium in order to match the consumption habits of their audience. The quote that concludes the piece, from an editor discussing one star cellphone novelist who recently gave up her mobile as a compositional tool, suggests the answer is yes: "Since she's switched to a computer, her vocabulary's gotten richer and her sentences have also grown longer."

2008-01-18

Who Dropped the Ball?

The February 2008 issue of Fast Company has a five-page spread featuring key players in the current debate over electronic medical records.

Advocates of an electronic health-care system argue that universal digital records would result in better, more cost-effective treatment for patients. If a single complete electronic record followed a patient throughout his life, they contend, physicians would be better able to see patterns in the person's medical history and make more accurate diagnoses.

On the other side of the fence are patient advocates, who are concerned with issues of security, privacy, and data integrity. They worry, for example, that people other than the patient and those authorized by the patient would have access to the records. If, say, an insurance company gained access to a potential client's complete medical history, wouldn't they use it to determine their risk factor in insuring that individual and possibly deny that person coverage?

Of the five leaders in the debate featured by Fast Company, two work for corporations, two work for nonprofits, and one works for the Bush administration. Two are trained physicians, one is an attorney, one a businessman, and one a computer scientist.

There's not a medical librarian in the bunch.

Medical librarians must have a stake in this debate. And I can't imagine that they're neutral. Therefore, I am led to one of three conclusions: either there are no leaders in this debate from medical librarianship, the people at Fast Company performed lazy research for their feature, or the Medical Library Association needs to do a lot more to increase the visibility of its membership.

Whichever the case, I do know that Fast Company is a fairly influential publication read by mid- and high-level government and industry leaders across the country. And as far as those readers know, medical librarians have nothing to add to the electronic medical records discussion.

2008-01-16

Web 3.0: Everything Old Is New Again

There's a lot of speculation about what Web 3.0 will entail. And just as with 2.0, we probably won't understand it fully until it's already past. Therefore, I'll offer my prediction for 3.0, taking comfort in the knowledge that at this point my guess is as good as anyone else's.

Web 3.0 will be dominated by . . . search.

What, you say? Haven't we already been there and done that? Hasn't social computing shown the true power of the Web? Yes. But it won't hold a candle to the enhancements geographic data will give to search.

Imagine what it'll be like when our handhelds, outfitted with a fast and functional browser and GPS technology, adjust the prominence of our search results based on our location, which they will know at all times.

Want pizza? Type it into your favorite engine and pull up the first ten results, all within two blocks of where you're standing when you hit "Search." Forget having to post a query on Chowhound. Forget having to remember and type in your current zip code. Forget asking a friend—or a Playful Librarian—for a recommendation. The local pizza places' sites, and every opinion available on their products, will be pushed into your hand.
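
To make the ranking idea concrete, here's a back-of-the-napkin sketch of distance-weighted search. Everything in it (the place names, scores, and coordinates) is invented for illustration; it shows one plausible scheme, not any real engine's algorithm: a plain relevance score divided by distance from the searcher, so nearby results float to the top.

    import math

    EARTH_RADIUS_KM = 6371.0

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometers between two lat/lon points."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        a = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    def rerank_by_location(results, user_lat, user_lon):
        """Divide each result's relevance by (1 + distance) so nearby
        results gain prominence. All inputs here are hypothetical."""
        def adjusted_score(r):
            d = haversine_km(user_lat, user_lon, r["lat"], r["lon"])
            return r["relevance"] / (1.0 + d)
        return sorted(results, key=adjusted_score, reverse=True)

    # Two made-up pizza places: one a few blocks away, one across the borough.
    pizza = [
        {"name": "Corner Slice", "relevance": 0.90, "lat": 40.7306, "lon": -73.9866},
        {"name": "Famous Pie", "relevance": 0.95, "lat": 40.6250, "lon": -73.9616},
    ]
    # Searching from Greenwich Village, the closer (slightly less
    # "relevant") shop rises to the top once location is factored in.
    for place in rerank_by_location(pizza, 40.7308, -73.9973):
        print(place["name"])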

Forget even having to Twitter your friends with your location. They—and probably the government—will already know.

And if you use your browser frequently and personalize it enough, it may even learn your preferences, anticipate your searches, offer suggestions you haven't asked for yet, and place your order before you arrive at your destination.
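
And here's an equally hypothetical sketch of that preference-learning idea: a browser keeping a running tally of past searches and volunteering the most frequent one before you ask. The class and its methods are inventions for illustration, not a real browser feature.

    from collections import Counter

    class PreferenceModel:
        """Toy sketch of a browser learning search habits. Hypothetical."""

        def __init__(self):
            self.history = Counter()

        def record(self, query):
            # Tally every search the user runs.
            self.history[query.lower()] += 1

        def anticipate(self):
            # Volunteer the user's most frequent query, unprompted.
            if not self.history:
                return None
            query, _count = self.history.most_common(1)[0]
            return query

    model = PreferenceModel()
    for q in ["pizza", "pizza", "movie times", "pizza"]:
        model.record(q)
    print(model.anticipate())  # prints "pizza"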

2008-01-14

Torture: A User-Centered Design Approach

I've long been an opponent of the legalization of assault weapons. Guns are tools built for only one purpose, killing, and assault rifles are built to do so that much more effectively.

A few years ago I was offered the chance to fire one, and being one who likes to know something about the things he opposes, I took it. The gun was an Army-issue M16—the kind carried by American soldiers in Iraq—modified to make it civilian-legal.

That the trigger mechanism was changed from automatic to semi-automatic mattered little as I squeezed off several rounds toward a target area in a few seconds. I felt a rush and a thrill. And like one who peers over the edge of a high place and imagines jumping, I pictured how quick and simple it would be for me to turn toward my fellow shooters and suddenly end what had been an otherwise fun day for them.

Holding and firing that weapon scared the shit out of me and confirmed my opposition to civilian use of assault rifles, even if they are modified.

Therefore, before Congress, the Department of Justice, or any White House administration determines whether or not waterboarding constitutes torture, they should try the technique on each other. Not only would it inform their judgment, it would make future debates on Capitol Hill that much more fun.

2008-01-12

The Consumer Food Chain

Having come in at #1 on the charts, Radiohead's In Rainbows seems to have given the lie to the notion that offering free or "name your own price" digital downloads will always result in diminished sales.

Shoddy economics aside, what troubles me about such talk is the epistemological implication that people are fundamentally not trustworthy. And what troubles me more is that companies incite such discussion about a condition they created.

Let's muck about in another of my thought experiments. Have you ever gone to the supermarket, shopped, come home, and discovered that you were undercharged by, say, $5 or $10? Have you then decided not to go back to give the store the money you owe? I admit it. I have.

When I pause to think about why, there are many reasons. But the main rationalization is this: They screwed up, not me. And they're a big corporation that can afford to absorb their mistake. Besides, how many times in the past have I been overcharged for things in that store? Don't they have huge markups to account for such mistakes or theft?

Again, I admit it—it's all just a rationalization for my own laziness and chintziness. However, who created the context in which such rationalizations can exist? Companies have spent billions trying to convince us that they and their products make our lives better. Maybe they have, maybe not. The point is they've done their darnedest to convince us so and, in the process, have placed us at the bottom of the commercial food chain. It is, after all, our dollars they're hunting.

Therefore, it irks me when corporations frame the digital download argument in terms of theft. It irks me more when they scream theft on behalf of the "artist"—artists who are paid pennies on the consumer dollar for their own work. It happened with Napster. It happened when the majority of downloaders of Stephen King's failed experiment, The Plant, did so without paying.

People are not inherently thieves. Nor are they inherently cheapskates. However, they have been led into a consumer environment in which rationalizing free downloads seems a viable option. They didn't create the environment, but they will play in it.

Digital technology could offer a way out. There are new distribution and business models out there to discover—ones that are equitable for both the artist and her audience. The problem is, that leaves the corporations out of the equation. And they don't like that one bit.

2008-01-09

The Flip

I know I've pointed to the New York Times a lot recently for stories. Sorry. But there's been a lot there of late.

Most recently, the New York Times Magazine had two essays that put their journalistic fingers on an ongoing cultural trend. The first is by James Gleick, who writes:

Just when digital reproduction makes it possible to create a "Rembrandt" good enough to fool the eye, the "real" Rembrandt becomes more expensive than ever. Why? Because the same free flow that makes information cheap and reproducible helps us treasure the sight of information that is not. A story gains power from its attachment, however tenuous, to a physical object. The object gains power from the story. The abstract version may flash by on a screen, but the worn parchment and the fading ink make us pause. The extreme of scarcity is intensified by the extreme of ubiquity.

The second is by Noah Feldman, who notes, in his examination of the Mormon religion, that

There is nothing inherently less plausible about God's revealing himself to an upstate New York farmer in the early years of the Republic than to the pharaoh's changeling grandson in ancient Egypt. But what is driving the tendency to discount Joseph Smith's revelations is not that they seem less reasonable than those of Moses; it is that the book containing them is so new. When it comes to prophecy, antiquity breeds authenticity. Events in the distant past, we tend to think, occurred in sacred, mythic time.

And so it would seem that Marshall McLuhan's third law of media, Reversal—in which the ubiquity of text grants rare textual manifestations the stamp of the sacred—requires more attention than ever. Especially since the paper of record carried two stories that so blatantly embody Reversal in the same issue.

2008-01-06

'Roll Review #6: Catch and Release

Like Donny & Marie, Nate Hill is a little bit country and a little bit rock 'n roll: country in his fundamental focus on people and belief in community, rock 'n roll in his willingness to question the status quo.

Nate and I emerged from the same library school, which by choice and chance has proven remarkably capable of assembling a like-minded group of students focused on reform and design in libraries and information services. I've seen Nate in action, and he has big ideas. What makes him different is his willingness to act on them.

Therefore, I happily recommend his new blog. It's in our best interest to read it and acquaint ourselves with his ideas—we all might work for him one day.

2008-01-03

Impression Management

The Styles section of today's New York Times has a report on research being conducted by social scientists into how people represent themselves online.

Their line of inquiry leaves me cold. It's important for marketers and advertisers to understand current online social behavior, of course. But current online behavior reveals little about the true social impact of digital technologies and the Web.

Of course people put forth a public persona in public fora. They do so now more than ever, because there are more public fora available than ever and the majority of computer users were born before 1980. Though we may be facile with digital technologies, those of us older than 28 are not native users. We remember a world before mobile phones, the Web, a computer on every desktop, and constant pervasive connectivity. Most of us older than 28 have different definitions of public and private than those born after 1980.

What I want to understand is how social networking technologies are blurring the lines between public and private behavior, particularly as more and more people born after 1980, into a world that was already digitally connected, are entering adult life. What formerly inappropriate public behavior has become acceptable? How much of our public personae are we incorporating into our private lives?

We've all experienced how mobile phones have brought private moments into the public sphere. We've walked past people screaming or sobbing with cell phones held to their ears; we've overheard fear and heartbreak and myriad other emotions once confined to solitary experience. To what extent—if at all—are the boundaries of the public and private shifting with the social Web?

2008-01-02

Elijah Returns

My year-end post seems to have been more agitating than I intended. I don't want its larger points to be lost.

How we treat others in any space, public or private, goes a long way toward determining who and what we are as a society. This is especially true in extreme circumstances—in our private spaces, faced with someone behaving outside the norm through his own fault (intoxication, for instance) or not (say, a medical condition). At that instant, who can guess the cause of the aberrant behavior? I only ask for pause to consider the best course of action for all involved.

It's easy for me to second-guess Berlin because I wasn't there in the moment. I gladly grant that. However, I stand by my own gut response to his anecdote: I still think the cops need not have been involved. The police exist in large part to handle intractable situations. From my vantage, the fellow on the porch was tractable. The finer legal points of the flagrancy of his vagrancy aside (there's a reason breaking and entering is a violent crime and trespassing is not), I'm fairly sure a shout from the front-door threshold would have woken him and sent him scurrying away embarrassed.

What does all this have to do with librarianship? Well, librarians (and cops, for that matter) are among the few workers whose workspace is also a public space. It's not unusual for us to encounter homeless, drunk, or unstable people regularly at work. And we have to deal with them ourselves. And because we do, we need to be sensitive to the person in question, as well as to anyone else in the library potentially affected by the interaction.

If a homeless or drunk person walks into an office building, there are people and protocols in place to deal with them, usually before they wander up to some unsuspecting worker's cubicle. I've worked as a security guard and had to use such protocols to assist homeless guys out of the building's lobby—the office's front porch, if you will. Still, the police were never involved.

The guy on Berlin's porch should certainly be grateful his local police seem to take the "keeper of the peace" part of their jobs as seriously as they do the "law enforcement" part. And Berlin is lucky to live in a precinct so responsive and willing to step in for him. I hope neither takes their police—or their public librarians—for granted.