Saturday, August 18, 2012

ISR and Metadata Logically Go Together


"Are we there yet?"
How many parents have heard that phrase, while driving, during a recent summer vacation? Millions more than usual, one suspects, given the rise of staycations that see us driving distances short and long instead of jetting off to more exotic locales. Every parent who hears this question, especially the one driving, knows just what to say.

"We're almost there."

There is a sense in the streaming industry that metadata -- that less-than-exciting-but-altogether-necessary functionality layer that's never exposed to mere viewers -- is "almost there" when it comes to widespread usage.

By "almost" I mean in one of the two ways that parents use the phrase with their children on a long trip. Either it's a really long way and we say "almost" to appease them in hopes they won't notice the next 2 hours of driving, or we really are sort of close to the destination but don't have a good sense of exactly how long it will take.

Which of the two is it for metadata usage in the streaming industry? I think it's a bit of both.

It's a long, long way (from Clare to here). There's a sense that, despite the progress made in metadata usage for on-demand services such as premium movie viewing, there's still a long way to go when it comes to widespread adoption of metadata for the average viewer.

It's nice to see search bar functionality in Netflix or other on-demand movie streaming apps, and it's even nicer to see a list of similar movies after one has chosen a movie to view. But for the average lean-back viewer using a Google TV or Apple TV set-top box, those recommendations are after-the-fact choices that might have been different had the search and recommendation capabilities been more robust and easier to access.

In addition, we've only scratched the surface of non-text searches, an issue the industry has struggled with since its inception. I remember, more than 12 years ago, a consulting session with the CEO of an early indexing-search-retrieval (ISR) company in which I challenged the executive team to think beyond text, only to be told that they couldn't imagine any way to search without requiring text input.

It's (sort of) very close. Yet we are also "almost there" in terms of ISR, or even end-user metadata usage, in that we now have a set of disparate tools available -- if we can get them into the same toolbox.

Think, for instance, of how simple it would be to use a smartphone camera to scan a QR code at the cineplex as a reminder to watch the movie again when it's available online. Or, for that matter, to do the same to a DVD cover at a local big-box retailer and have the title added to the instant queue of your on-demand streaming service.
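As a rough illustration of how little plumbing that first scenario needs, here is a minimal sketch that generates such a code with the third-party Python qrcode package; the deep-link URL and "add to queue" endpoint are purely hypothetical stand-ins, since no real service's API is being described.

    # Sketch: generate a QR code that deep-links into a (hypothetical)
    # streaming service's "add to my instant queue" endpoint.
    # Requires: pip install qrcode pillow
    import qrcode

    movie_id = "tt0111161"  # placeholder identifier for the title
    deep_link = "https://streaming.example.com/queue/add?title=" + movie_id

    # Encode the link and save the image for printing on a poster or DVD case.
    img = qrcode.make(deep_link)
    img.save("add-to-queue.png")

Point the phone's camera at that image, and the only remaining work is whatever the streaming service chooses to do when the link is opened.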

Better yet, think about being able to capture an audio sample of a song while you're on vacation -- one you think you've heard in a movie -- and then request that a list of the movies the song has appeared in be waiting at your set-top box for purchase or on-demand viewing.
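That scenario is also closer than it sounds: audio fingerprinting services can already identify a recorded clip, and only the song-to-movie lookup is missing. Below is a rough sketch assuming the third-party pyacoustid package (which needs a free AcoustID API key and the fpcalc tool installed); the soundtrack lookup is a hypothetical stub, not a real service.

    # Sketch: identify a captured song clip, then map it to the movies the
    # song has appeared in. Requires: pip install pyacoustid
    import acoustid

    ACOUSTID_API_KEY = "YOUR_KEY"  # free key from acoustid.org

    def lookup_soundtracks(recording_id):
        # Hypothetical stub: a real implementation would query a soundtrack
        # database keyed by the MusicBrainz recording ID.
        return []

    def movies_featuring(clip_path):
        titles = []
        # acoustid.match() fingerprints the clip and returns candidate songs.
        for score, recording_id, title, artist in acoustid.match(
                ACOUSTID_API_KEY, clip_path):
            titles.extend(lookup_soundtracks(recording_id))
        return titles

    print(movies_featuring("vacation-clip.wav"))

The remaining step -- getting that list to the set-top box -- is exactly the missing link between devices described below.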

This kind of metadata or ISR usage isn't rocket science. It's available today, at least if the human operating the apps and set-top boxes can provide the missing link between all the devices.

There are additional signs that we may be getting even closer. YouTube has introduced a right-click option to copy not just the URL for a clip but a URL that begins playback at the exact spot where the cursor sits when the URL is created. At the moment, it doesn't work as well as advertised -- on some short videos playback still starts at the beginning, and videos with preroll advertisements won't skip the preroll -- but it sure looks like an "almost there" feature for removing the tedium of watching an entire hour-long lecture just to find a few key points.
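For readers who don't want to wait for the feature to mature, the same effect can be had by hand, since a start-time parameter appended to a watch URL tells the player where to begin. A small sketch, with a placeholder video ID:

    # Sketch: build a YouTube watch URL that starts playback at a given
    # offset -- the same trick the right-click "copy at current time"
    # option performs for you.
    def timestamped_url(video_id, seconds):
        return "https://www.youtube.com/watch?v=%s&t=%ds" % (video_id, int(seconds))

    # Jump straight to the 90-second mark of an hour-long lecture.
    print(timestamped_url("VIDEO_ID_HERE", 90))
    # -> https://www.youtube.com/watch?v=VIDEO_ID_HERE&t=90s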

[This article appears as "Almost There" in the August/September 2012 issue of Streaming Media magazine.]
