Azure Mobile Service + Node.JS + LinkedIn + Twitter + Azure Storage + HadoopOnAzure =

You might be wondering what kind of formula this is. It is no mathematical equation, nor part of some complex scenario. These are the ingredients of a promising future technology, a technology that can shape the way we live.
Over the past many days I have been working on Azure Mobile Services and Node.JS. The two integrate very well and can give amazing outcomes. So I thought, why not work on a real-time solution that may prove helpful either to me as an individual or to my community? After spending a few hours thinking about what could be built, I got a ping from my friend, who asked me
“How can I know about a company and what technologies it is working on?”
I replied,
“Check the company portal, and along with it check the company's page on LinkedIn and Twitter, and figure out what the members of those communities are tweeting or posting.”
Bang!!! There came an idea. Today, in this virtual world where everyone likes to stay connected on social platforms, the best way to infer facts about anything is to go through real-time personal comments and posts. The best judge is not an expert but the layman who is the end user of the product. It reminds me of reality TV shows, where the ultimate decision makers are the audience, not the experts.
Streamlining my thoughts, I went through tweets and posts on Twitter and LinkedIn and found many useful and useless messages. The useful messages held expressions of likes and dislikes, feedback about a company or product, and useful links that we often miss. I figured that if we could somehow collect these tweets somewhere and skim off the useless scraps, the end product could be of high value. In short: big data analytics.
As this solution was entirely based on the tweets and posts that an individual's network connections post, the application was named “NetMe”.
Step 1: Connecting the dots  
Over the past days I had gained experience with Azure Mobile Services and Node.JS. Now I had a solution to create, one that needed social feeds to be collected and analyzed into refined reports. So it was time to connect the dots.
For this data collection, I first needed a constant background process to fetch the tweets and posts. I found that Azure Mobile Services (AMS) provides a Scheduler feature, which runs a job at a minimum interval of 15 minutes. The fact that the AMS Scheduler supports Node.JS scripts eased my work: I could now write the logic for collecting the tweets from Twitter and LinkedIn and storing them. For storage I chose the Azure Table Storage service; a few days earlier AMS had added support for Azure Storage among others. So an end-to-end data-collection solution could be created using AMS, Node.JS and Azure Table Storage.
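As a side note, the AMS Scheduler contract is simple: the script just defines a function whose name matches the scheduled job, and AMS invokes it on every tick. A minimal sketch (assuming the job is named netme) looks like this:

```javascript
// Minimal sketch of an AMS scheduled-job script. The function name must
// match the job name configured in the portal ("netme" is assumed here);
// AMS calls it on every tick, at a minimum interval of 15 minutes.
function netme() {
    var startedAt = new Date().toISOString();
    console.log('netme tick at ' + startedAt);
    return startedAt; // returned only so the sketch is easy to exercise locally
}
```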
Step 2: Eat…. Drink…. Code
The next step was to write the logic for the above idea. Following is the Node.JS script that captures Azure-related tweets and stores them in Azure Table Storage:
function netme() {
    var request = require('request');
    var azure = require('azure');

    // Open a connection to the Twitter streaming "filter" endpoint and
    // invoke `callback` for every tweet that arrives.
    function filter(options, callback) {
        var params = {
            uri: ''
        };

        if (typeof options['oauth'] !== 'undefined') {
            params.oauth = options.oauth;
            delete options.oauth;
        } else if (typeof options['basic'] !== 'undefined') {
            // Basic auth: embed the credentials directly in the URI.
            params.uri = params.uri.replace(/^https?:\/\//,
                'https://' + options.basic.username + ':' + options.basic.password + '@');
            delete options.basic;
        }

        // Whatever remains (e.g. the `track` keyword) becomes the POST body.
        params.form = options;

        var req =, function (err, response, body) {
            if (err) console.error("Caught in node");
        });

        // The callback for is only called when the response ends,
        // so listen on the data events to handle tweets as they stream in.
        req.on('data', function (buffer) {
            try {
                callback(JSON.parse(buffer.toString()));
            } catch (e) {
                // Ignore keep-alive newlines and partial JSON chunks.
            }
        });
    }

    filter({ basic: { username: '<<Twitter_Username>>', password: '<<Twitter_Password>>' }, track: 'Azure' },
        function (tweet) {
            var tableService = azure.createTableService('<<Azure Table Storage Name>>', '<<Storage Key>>');
            tableService.createTableIfNotExists('<<Table Name>>', function (error) {
                if (error) {
                    console.log('Table creation unsuccessful');
                    return;
                }
                var task = {
                    PartitionKey: randomString(128),
                    RowKey: randomString(128),
                    Description: tweet.text,
                    UserName:,
                    UserImage: tweet.user.profile_image_url
                };
                tableService.insertEntity('<<Table Name>>', task, function (error) {
                    if (!error) {
                        console.log('Inserted successfully');
                    } else {
                        console.log('Insert unsuccessful');
                    }
                });
            });
        });

    // Generate a random base64 string carrying `bits` bits of entropy,
    // used for the PartitionKey/RowKey above.
    function randomString(bits) {
        var chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/',
            rand, i, ret = '';
        // In v8, Math.random() yields 32 pseudo-random bits (in spidermonkey it gives 53).
        while (bits > 0) {
            rand = Math.floor(Math.random() * 0x100000000); // 32-bit integer
            // Base 64 means 6 bits per character, so we use the top 30 bits
            // from rand to give 30/6 = 5 characters per iteration.
            for (i = 26; i > 0 && bits > 0; i -= 6, bits -= 6) {
                ret += chars[0x3F & rand >>> i];
            }
        }
        return ret;
    }
}
Fetching tweets from Twitter with Node.JS was not so simple, as a few Node.JS modules were not supported by AMS. I should thank a post on GitHub that helped me with the Twitter part. The response from Twitter comes as a huge JSON object carrying many details. I took three parameters from that pool - the tweet text, the user who tweeted, and the user's image - and saved them to Azure Table Storage. The Azure Mobile Services job was configured for a 15-minute interval and enabled. After the first 15-minute interval I started receiving tweets, which were recorded in Azure Storage.
I have created a website to display the 20 most recent tweets, and have added feed tracks for BigData, Microsoft, Testing and WindowsPhone too.
Now I am getting real-time views from people who are reading something new or giving feedback on a product.
As far as LinkedIn is concerned, I saw that many highlights - like "Person A is connected to Person B" or "Person A has been endorsed for A, B, C" - were not so useful for my analytics. So I fetched only those posts which had news or status updates from my network connections. I followed the OAuth approach to get authenticated and then used the LinkedIn developer API for the same. My previous blog has the details.
In my next blog I will highlight my experience on the extraordinary framework of HadoopOnAzure and how it helped me in my tweet analytics.
Hope this post of mine will help the community.
"The only real valuable thing is intuition."  – Albert Einstein

