Battle of Titans on IoT Standards to Control your Connected Life

According to a report from Gartner published in December 2013, the Internet of Things (IoT) will grow to 26 billion units installed in 2020, a nearly 30-fold increase from 0.9 billion in 2009. This will result in $300 billion of incremental revenue, mostly by 2020.

This is the next big opportunity. Device manufacturers, network players, telcos and others all want a major share of it.

The IoT would span the worlds of home automation devices, connected cars, wearables, smart sensors, movable and fixed assets, and more. All these diverse devices need to communicate and be monitored and controlled.

No wonder Intel, Samsung, Dell, Broadcom and many others announced this week the formation of the Open Interconnect Consortium (OIC). According to a report, the OIC will define a common communications framework to wirelessly connect and manage the flow of information among personal computing and emerging IoT devices, regardless of operating system, across diverse service providers. The consortium would also make open source code contributions and provide device certification.

OIC is not the first one in the market. Back in December, a group of companies led by Qualcomm announced another alliance, called the AllSeen Alliance, and adopted AllJoyn as its standard for achieving the same goal. Initial members of this alliance included Qualcomm, LG Electronics, Panasonic, Sharp, Silicon Image and Haier. Microsoft and Cisco also joined the alliance later.

However, Google’s Nest has become the de facto icon of intelligent devices in the home automation sphere. Google is seriously expanding in this market; it recently bought Dropcam too. Google took it further by announcing the Works with Nest framework. Effectively, that would become Google’s standard for controlling home automation.

Apple is not behind either. Apple is expected to deliver the much-awaited iWatch later this year, which would play an important role in wearables. Apple already has AirPlay as a standard for connecting Apple ecosystem devices like the Apple TV. Would AirPlay become Apple’s “standard” for interconnecting, controlling and monitoring devices?

The big question is: who would effectively control your life, your home (via home automation), your transport (via connected cars) and your health (via wearables)?

Do you have a say in it?


Machine Learning on Big Data gets Big Momentum

Big Data without algorithms is dumb data. Algorithms for machine learning, text processing and data mining extract knowledge out of the data and make it smart data. These algorithms make the data consumable and actionable for humans and businesses. Such actionable data can drive business decisions or predict the products customers are most likely to buy next. Amazon and Netflix are popular examples of how learnings from data can be used to influence customer decisions. Hence, machine learning algorithms are very important in the era of Big Data. By the way, in the field of Big Data, ‘machine learning’ is used more broadly (than what machine learning professionals really mean by it) and includes pure statistical algorithms as well as other algorithms that are not based on ‘learning’.
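To make “predict the products customers are most likely to buy next” concrete, here is a toy sketch with made-up data; it recommends products by simple co-occurrence counting, not by Amazon’s or Netflix’s actual (far more sophisticated) systems:

```python
from collections import Counter, defaultdict

# Hypothetical purchase histories: which product does a customer
# most likely buy next, given what they already bought?
orders = [
    ["laptop", "mouse"],
    ["laptop", "mouse", "keyboard"],
    ["laptop", "keyboard"],
    ["phone", "case"],
]

# Count how often each pair of products appears in the same order.
co_counts = defaultdict(Counter)
for order in orders:
    for a in order:
        for b in order:
            if a != b:
                co_counts[a][b] += 1

def recommend(product, k=1):
    """Return the k products most often bought together with `product`."""
    return [p for p, _ in co_counts[product].most_common(k)]

print(recommend("phone"))  # → ['case']
```

Real recommenders replace the co-occurrence table with learned models, but the idea is the same: turn raw transaction data into an actionable “buy next” suggestion.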

Earlier today, on 16th June, Microsoft announced a preview of a machine learning service called AzureML on its Azure cloud platform. With this service, business analysts can easily apply machine learning algorithms, such as those for predictive analytics, to their data.

Machine learning itself has been popular for the last few years, and Microsoft has recognized the trend and jumped on it. When it comes to big players offering machine learning services in the cloud, Google pioneered the space with its Prediction API a few years back.

Traditionally, data scientists use tools like MATLAB, R and Python (NumPy, scikit-learn) for analyzing data. Programmers use open source libraries like Apache Mahout and Weka for developing application services that use machine learning algorithms. However, having machine learning algorithms is not good enough; scaling those algorithms to big data is very important.
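As a minimal illustration of the kind of algorithm these libraries provide, here is a toy k-nearest-neighbors classifier in plain Python (hypothetical data); its naive linear scan over all training points per query is exactly the sort of thing that stops working at big data scale:

```python
import math
from collections import Counter

# Toy training set: (feature vector, label). In a real tool like
# scikit-learn or Weka this would come from actual data.
train = [((1.0, 1.0), "low"), ((1.2, 0.8), "low"),
         ((8.0, 9.0), "high"), ((9.0, 8.5), "high")]

def predict(point, k=3):
    """Classify `point` by majority vote among its k nearest neighbors."""
    # O(n) distance scan over the whole training set -- fine here,
    # prohibitive when n is billions of rows.
    nearest = sorted(train, key=lambda t: math.dist(point, t[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(predict((8.5, 8.0)))  # → high
```

Scaling such algorithms means replacing the brute-force scan with distributed or approximate versions, which is precisely what projects like Mahout and MLbase set out to do.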

Last year, Cloudera acqui-hired Myrrix and open sourced machine learning on Hadoop as Oryx. Berkeley’s AMPLab has open sourced its big data machine learning work, called MLbase, as part of Apache Spark, an open source big data stack that is rapidly becoming popular.

The momentum in machine learning has already fueled a good amount of venture funding in this area.

  • SkyTree got $18 million in funding from U.S. Venture Partners, UPS and Scott McNealy.
  • Nutonian grabbed $4 million from Atlas Venture for Big Data analytics.
  • Another startup, wise.io, raised $2.5 million in a round led by Voyager Ventures. Wise.io makes it easy to predict customer behavior using machine learning.
  • Alpine Data Labs, which came out of EMC, raised a Series B last year from Sierra Ventures, Mission Ventures and others. It provides a studio and an easy-to-assemble set of standard machine learning and analytics algorithms.
  • Oregon-based BigML raised $1.2 million last year to provide an easy-to-use machine learning cloud service.
  • Revolution Analytics, which has raised $37 million in total, makes R algorithms work on MapReduce.
  • and the list goes on

There is an interesting machine learning project called Vowpal Wabbit that initially started at Yahoo and continued at Microsoft. Interestingly, however, instead of VW, Microsoft is making the R language and its algorithms available on the Azure cloud.

Anyway, the trend of making machine learning services easy to run on Big Data and in the cloud will continue. But having the tools and algorithms available is not enough to solve the problem. We need qualified people who understand which algorithms to use for which cases and how to use (parameterize) them. Moreover, what we really need are applications that use such algorithms to solve business problems without requiring users to understand the algorithms. In my opinion, what we will see in the future is vertical applications and services that abstract (use but hide) machine learning or prediction algorithms to serve domain-specific business needs.
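One way to picture such a vertical service: the caller asks a domain question, and the class owns the model choice and tuning. This is a hypothetical sketch, with a placeholder rule standing in where a real service would put a trained model:

```python
class ChurnPredictor:
    """Answers 'will this customer leave?' without exposing any ML details."""

    def __init__(self):
        # Placeholder rule in lieu of a trained model; a real vertical
        # service would train, tune and swap models behind this interface
        # without the caller ever knowing.
        self._threshold_days = 30

    def will_churn(self, days_since_last_login):
        # The public API speaks the domain's language, not the algorithm's.
        return days_since_last_login > self._threshold_days

predictor = ChurnPredictor()
print(predictor.will_churn(45))  # → True: inactive beyond the threshold
```

The point is the shape of the interface: users reason about churn, not about which algorithm or parameters sit underneath.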

Data on BigData

According to Transparency Market Research:
  • The Compound Annual Growth Rate (CAGR) of Big Data is projected to be 40% from 2012 to 2018.
  • The global big data market was worth USD 6.3 billion in 2012 and is expected to reach USD 48.3 billion by 2018.
  • Big Data tools: CAGR of 41.4% from 2012 to 2018.
  • Storage: CAGR of 45.3% from 2012 to 2018.
  • Major players (by revenue) last year: HP Co., Teradata, Opera Solutions, Mu Sigma and Splunk.