Social media touches many lives, and monitoring it closely is useful for a wide variety of purposes. With so many social networks, though, keeping track of every interaction is no small task. If you know your way around code, you can tap into each network's API to gather social data, but that means juggling dozens of APIs, multiple formats, ongoing maintenance and more.
Gnip helps businesses save money and time by cutting down on tedious processes and costly engineering teams. Interested in learning how to make the most of Gnip? Read on.
Gnip is a 360-degree social monitor for dozens of popular social platforms, including the likes of Twitter, Facebook and Myspace. The web app uses the social networks' official APIs, so the data it provides is accurate. Gnip handles all the grunt work of dealing with those APIs; you just pull data with keywords.
Use their built-in keyword or user search feeds for specific results, or get everything using their full and partial firehose feeds. Think of it as Tweetdeck or HootSuite, but with support for loads of other social networks.
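To give a feel for what "pulling data with keywords" looks like in code, here is a minimal sketch of building a polling URL for a keyword feed. The endpoint, parameter names and API-key scheme are all assumptions for illustration; the real URL and credentials come from the login email Gnip sends you.

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- the real base URL arrives in your Gnip account email.
GNIP_BASE = "https://demo.gnip.com/data_collectors/keyword_feed.json"

def build_feed_url(keywords, api_key):
    """Build a polling URL for a keyword feed.

    Sketch only: the query parameter names ("keywords", "api_key")
    are assumptions, not Gnip's documented interface.
    """
    query = urlencode({"keywords": ",".join(keywords), "api_key": api_key})
    return f"{GNIP_BASE}?{query}"

url = build_feed_url(["birthday", "cake", "party"], "MY_KEY")
print(url)
```

From there, any HTTP client can fetch the URL on whatever schedule your plan allows.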
There’s no sign-up form; accounts are created after contacting the sales team. There is, however, a request form for a three-day trial account. Once you submit the request, an email with the URL and login details will arrive in your inbox. In my case, the login information arrived within ten minutes.
Gnip offers four different subscription plans starting at $300 a month, billed either monthly or annually depending on the plan. There are no free accounts, but you can use the free three-day trial to evaluate the web app, and no credit card information is required for the trial.
Most feeds are free to Gnip customers; however, some data sources set prices for some of their feeds. Feeds that incur an additional charge are called “Premium feeds,” and popular ones include the Twitter User Mention Stream and Twitter Decahose for tweet data, and Newsgator and Postrank for blog data.
Getting Started With Gnip
Setting Up a Sample Feed
After signing up, a getting-started page offers to create a sample feed from one of four preconfigured sources, or to build one from scratch for the other 44 social networks. Let’s start with a sample feed for the preconfigured networks.
Hit Add Data Feed to start building your own feed. The next page lists every supported social network, and just about every one you can think of is here. To see what functions or data can be pulled from each network, click on its name.
Some networks have many data streams, while others have just one. This is a limitation of the API offered by the social network, not a deficiency of Gnip. All data streams come with clear descriptions, which should help you avoid ambiguity in selecting the right one.
Monitoring a Facebook Fanpage
I chose to monitor the comments posted by fans of Facebook pages. Much of the data on Facebook fan pages is public, but some content may not be. If you plan to monitor content that isn’t public, you’ll have to use tokens. Gnip explains clearly how to create a token, and those instructions should be all you need.
The rest of the steps are really simple. Add the fan pages you plan to monitor, one after another, then specify the data collection intervals. Gnip can track data in real time or at preset intervals, so either way your data collection is covered.
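If you end up consuming a feed yourself rather than relying on Gnip's delivery, interval-based collection boils down to a simple polling loop. This is a generic sketch, not Gnip code; `fetch` stands in for whatever function pulls your feed.

```python
import time

def poll_feed(fetch, interval_seconds, max_polls):
    """Call `fetch` every `interval_seconds` for `max_polls` rounds.

    `fetch` is any zero-argument callable that pulls one batch of
    feed data (hypothetical here); results are collected in order.
    """
    results = []
    for _ in range(max_polls):
        results.append(fetch())
        time.sleep(interval_seconds)
    return results

# Stub fetcher for illustration; a real one would hit your feed URL.
counter = {"n": 0}
def fake_fetch():
    counter["n"] += 1
    return counter["n"]

print(poll_feed(fake_fetch, 0, 3))  # -> [1, 2, 3]
```

Real-time tracking replaces the sleep-and-poll loop with a long-lived streaming connection, which Gnip manages for you.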
Select the output format and default delivery method to complete the process.
Note that you have the option to expand short URLs, which could be useful if you want to know which sites are attracting the fan base.
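With short URLs expanded, finding out which sites attract the fan base is a matter of tallying domains. A small sketch, assuming you already have a list of expanded links pulled from the feed:

```python
from collections import Counter
from urllib.parse import urlparse

def top_domains(expanded_urls):
    """Count which sites the (already-expanded) links point to."""
    return Counter(urlparse(u).netloc for u in expanded_urls)

# Made-up sample links standing in for expanded URLs from a feed.
links = [
    "https://example.com/a",
    "https://example.com/b",
    "https://makeuseof.com/post",
]
print(top_domains(links).most_common(1))  # -> [('example.com', 2)]
```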
Viewing Data Collection
Check that all the details you entered are correct and confirm the creation of the feed. If the data collected doesn’t look right, you can alter the feed from its settings page.
Depending on the type of activity you chose to monitor, activity levels on the social network and other factors, user activity will start to show up in the graph and the table below it. There’s no jargon, and interpreting the data is simple thanks to Gnip’s well-thought-out tools.
The graph can show user activity by the second, minute, day and so on; the only limit is the volume of data you can handle and interpret. For example, in the graph pictured above, I am monitoring responses to the keywords birthday, cake and party on Facebook. With 500 million registered users, and keywords that common on a platform built around personal connections, this could easily become overwhelming, and it did.
In the table below the graph, each keyword is accompanied by a timestamp and the number of results pulled in at that time. Compare the queries against the responses they return to identify the high-traffic keywords, then create a separate feed to monitor them; Gnip lets you create multiple feeds.
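That comparison is easy to script once you have the table rows. The sketch below totals results per keyword over sample rows shaped like the table (keyword, timestamp, result count); the rows themselves are made up for illustration.

```python
from collections import defaultdict

# Made-up rows in the shape of the results table: (keyword, timestamp, results).
rows = [
    ("birthday", "2011-04-01T10:00Z", 120),
    ("cake",     "2011-04-01T10:00Z", 45),
    ("birthday", "2011-04-01T10:05Z", 98),
    ("party",    "2011-04-01T10:05Z", 30),
]

def totals_by_keyword(rows):
    """Sum result counts per keyword, highest-traffic first."""
    totals = defaultdict(int)
    for keyword, _timestamp, count in rows:
        totals[keyword] += count
    # The top entries are the candidates for their own dedicated feed.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(totals_by_keyword(rows))  # -> [('birthday', 218), ('cake', 45), ('party', 30)]
```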
Gnip delivers your online conversation data efficiently and reliably. It’s the kind of app that makes a data miner salivate. However, the pricing is a huge roadblock for smaller-scale users and is definitely not within the budget of a lot of small businesses. Gnip might consider offering plans with a limited number of feeds and social networks.
Also, the presentation of user data could be made more readable. Instead of an XML file, a section to read the comments or tweets as-is (or close to it) would be great. For those who have a war chest, Gnip is a treasure trove of social reactions.
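In the meantime, turning the XML dump into readable lines takes only a few lines of code. The fragment below uses a made-up XML structure in the spirit of the output; Gnip's real schema differs, so treat the element names as assumptions.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment standing in for Gnip's XML output.
SAMPLE = """
<activities>
  <activity>
    <author>Alice</author>
    <body>Happy birthday! Love the cake.</body>
  </activity>
  <activity>
    <author>Bob</author>
    <body>Great party last night.</body>
  </activity>
</activities>
"""

def readable(xml_text):
    """Flatten each <activity> into a plain 'author: body' line."""
    root = ET.fromstring(xml_text)
    return [
        f"{a.findtext('author')}: {a.findtext('body')}"
        for a in root.findall("activity")
    ]

for line in readable(SAMPLE):
    print(line)
```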
Share Your Thoughts!
How useful do you think Gnip could be? Is their pricing a sticking point?