


Search is receiving renewed attention from developers, designers, and venture capitalists, which surprises me not at all.

Some of that focus is placed on semantic search, such as the work Powerset is pursuing. Others, such as Mahalo, have revisited the idea of human-indexed results. And then, of course, there's Twine, which appears to combine the best of both worlds.

But the project I find most exciting has less to do with advancing the actual retrieval function and more to do with improving the manner in which our results are presented. And it would appear, from what little I've seen so far, that they've jumped the search results page ahead a whole evolutionary order, as compared to the standard set by Google. It's as if Lucy gave birth to Einstein.

May I introduce SearchMe:

Applying the look and feel of Apple's popular Cover Flow iTunes interface to search results is a brilliant piece of appropriation—one so simple that it'll have thousands of people asking, "Why didn't I think of that?" which is always the sign of a great idea. The interface shown in the demo is not only beautiful, but it promises to be the first truly intuitive visual results page. Up front it offers a massive amount of information that is easily digested—in much the same way we flip through a magazine—then parcels out additional information on an as-needed basis through subdued textual features.

Considering Tuesday's post, it is particularly interesting to note SearchMe's use of categories to help narrow a search. Makes sense: in a visually rich environment where text is at a premium, such classifications can be a powerful tool.

The quality of the results SearchMe retrieves has been questioned, as has its scant index of 1 billion pages (compared to Google's 20 billion). Both will improve in time, as SearchMe crawls more of the Web. And once the engine goes fully live, the company will likely have plenty of time and money to get it right. As CNET notes, the magazine-like riffling effect of the interface allows advertising to be inserted among the results.

However, if precision and recall prove to be the company's Achilles' heel, I imagine it won't take long for one of the more established engines to buy the interface.
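For readers unfamiliar with those terms, precision and recall are the standard information-retrieval measures of result quality: roughly, "how much of what the engine returned was relevant" versus "how much of what was relevant did the engine return." A minimal sketch, using made-up document sets rather than anything from SearchMe:

```python
# Illustrative only: precision and recall for a single query,
# given the set of documents the engine retrieved and the set
# of documents actually relevant to the query.
retrieved = {"doc1", "doc2", "doc3", "doc4"}
relevant = {"doc2", "doc4", "doc5"}

hits = retrieved & relevant  # relevant documents the engine actually returned

precision = len(hits) / len(retrieved)  # fraction of results that were relevant
recall = len(hits) / len(relevant)      # fraction of relevant docs that were found

print(precision)  # 0.5   (2 of the 4 results were relevant)
print(recall)     # ~0.67 (2 of the 3 relevant docs were found)
```

An engine with a small index, like SearchMe's 1 billion pages, tends to suffer on recall first: relevant documents simply are not there to be retrieved.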


panoply said...

Please excuse me; my obsession at the moment is with ubiety, place, and "whereness." I wonder how that will play into the new search models. As we search from both everywhere (ubiquity) and here (ubiety), how does that change the way we search? I think the new interface designs will take this into account. With people like Adam Greenfield starting to design for Nokia (his new job), this will certainly be the case in terms of product design.

librarian@play said...

The fundamental model altered by ubiquity and ubiety will be human behavior, if ubiquity and ubiety are ever fully achieved. After all, if information retrieval and presentation become truly invisible within our environment, which is the ultimate goal of ubiquity/ubiety, then it is we who will adapt to it, not the search models themselves. First, however, there are two obstacles to overcome. The infrastructure and widespread adoption of technologies that can communicate seamlessly must be in place for true ubi*. And humans must become fully comfortable with the notion that someone or -thing will always know who and where they are. Given our track record of handing the controls over to anything that makes our lives more convenient, I'd say the latter is less of an obstacle.

natehill said...

Information retrieval and presentation may well become truly invisible in the environment in some ways, but the purpose of a search engine of any style is to provide information piecemeal as a user requests it. It's just not a question of ubiquity yet (though it may be a question of ubiety). This is not to say that an automated search product that harvests information from the web based on a user's location, rather than their indexed potential wants/needs/preferences, cannot exist, but isn't that a whole different animal from what we are looking at here?