A few days ago, I used the example of Peer39 to illustrate my belief that just as hardware and software innovation drove new growth over the past 20 years, new value creation over the next 5+ years will increasingly come from Data Innovation.
Here I’m thinking about data innovation from technology, customer, and business-model perspectives: aggregating, indexing, threading, analyzing, sharing, and visualizing data.
The announcement of GNIP is a great example.
Today, Eric Marcoullier of MyBlogLog fame (acquired by Yahoo) launched a new web service called Gnip that aggregates, translates, and notifies users of new social data. Gnip describes how it can dramatically change the distribution and protocol translation of web data messages, improving application performance and efficiency by removing the REST polling of the social web. In their words:
Today, 3rd party data consumption means…
API throttling & high latency
One-off data integrations
Identity discovery challenges
More grunt work than any of us like to do
None of these issues are insurmountable, most are not even complicated. They’re hassles. An exponentially growing set of hassles that distract you from the features of your service that your users actually care about. Gnip is here to help. We make it easy to aggregate user data so that you can focus on doing something cool with it.
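The shift Gnip is describing is from pull to push: instead of every consumer repeatedly polling a REST endpoint and mostly finding nothing new, an aggregator notifies subscribers once when an event actually happens. Here is a minimal sketch of that idea in Python; the `EventBus` class and its `subscribe`/`publish` methods are invented for illustration and are not Gnip's actual API.

```python
class EventBus:
    """Toy stand-in for a push-based aggregator like Gnip."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        # A consumer registers interest once, instead of polling forever.
        self.subscribers.append(callback)

    def publish(self, event):
        # Push: each consumer is notified exactly once per event,
        # rather than discovering it via repeated REST polls.
        for callback in self.subscribers:
            callback(event)


received = []
bus = EventBus()
bus.subscribe(received.append)
bus.publish({"actor": "alice", "action": "posted a photo"})
```

One publish yields one delivery per subscriber, so the cost scales with events rather than with polling frequency times the number of consumers.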
Gnip also offers the ability to translate from one protocol to another through a “protocol bridge,” so that consumption of the aggregated data is efficient and simplified.
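Conceptually, a protocol bridge is just a mapping from one message format into another, applied at the aggregator so each consumer only has to speak its preferred protocol. A hedged sketch, with field names invented for illustration (Gnip's real formats and protocols are not documented here):

```python
def bridge_feed_to_notification(feed_item):
    """Translate a hypothetical RSS-style item into a hypothetical
    push-notification payload. Purely illustrative field names."""
    return {
        "to": feed_item["subscriber"],
        "body": feed_item["title"],
        "link": feed_item["link"],
    }


msg = bridge_feed_to_notification({
    "subscriber": "app@example.com",
    "title": "New bookmark shared",
    "link": "http://example.com/item/1",
})
```

The value is that the translation happens once, centrally, instead of every consumer writing one-off integrations for every source protocol.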
I really like the assessment and vision that Marshall Kirkpatrick of RWW outlines for GNIP.
“Gnip wants to serve as the grand central station and universal translation service for the new social web.”
and a little bit later…
“…we think that Gnip has a whole lot of potential to deliver huge value to the applications leveraging it, to the backers financing it and ultimately to all the users of the emerging class of social web applications. We love this kind of stuff.”