GE, Predictivity, Amazon Web Services, and the Industrial Internet

Earlier this month at the All Things Digital conference, General Electric CEO Jeff Immelt talked about what he calls the “Industrial Internet.” This is the ‘Net that runs largely behind the scenes. Most of us access the Internet via the World Wide Web and email, but for industry and big business, the Internet is about moving data — and a lot of that data is very boring, yet very critical and useful to industry.

The Industrial Internet is made up of data from sensors, machine outputs, infrastructure changes, software updates for robots and machines, energy usage, and more. Think about how much has changed in your own home in regard to how much data is collected by the various machines in your house: watches, mobile devices, climate controls... even your refrigerator. Now multiply that by a million and you’re getting close to the kind of data that any given business in any given industry may be generating, collating, and sorting. All of it, 24/7/365.

For example, the average robot on the production floor of an automaker will produce about one megabyte of data every six seconds as it operates. Most robots operate for about 20 hours a day, so each one produces about 12,000 MB of data daily. A production floor with thirty robots running simultaneously would create about 360,000 megabytes of data daily. That’s per production facility, per day. Three hundred and sixty thousand megabytes is equivalent to roughly 90,000 MP3 songs or high-definition photographs. That’s a lot of data.
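The arithmetic behind those figures can be sanity-checked in a few lines. A minimal sketch — the rates and hours come from the example above, while the roughly 4 MB-per-song figure is an assumption implied by the 90,000-song comparison:

```python
# Back-of-the-envelope check of the production-floor data figures.
MB_PER_INTERVAL = 1      # megabytes produced per interval (from the example)
INTERVAL_SECONDS = 6     # length of each interval, in seconds
HOURS_PER_DAY = 20       # average robot operating hours per day
ROBOTS_PER_FLOOR = 30    # robots running simultaneously
MB_PER_SONG = 4          # assumed size of one MP3 song, in MB

intervals_per_day = HOURS_PER_DAY * 3600 // INTERVAL_SECONDS
mb_per_robot_per_day = intervals_per_day * MB_PER_INTERVAL   # 12,000 MB
mb_per_floor_per_day = mb_per_robot_per_day * ROBOTS_PER_FLOOR  # 360,000 MB
songs_equivalent = mb_per_floor_per_day // MB_PER_SONG       # ~90,000 songs

print(mb_per_robot_per_day, mb_per_floor_per_day, songs_equivalent)
```

Running this reproduces the numbers in the example: 12,000 MB per robot, 360,000 MB per floor, about 90,000 songs’ worth of data per facility, per day.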

GE has been working on what they call “Predictivity.” This is the ability for networks and machines to make predictions about data influx and outflow, so that cloud-based operations for handling and utilizing that data can be better managed. GE has built a Hadoop-based software platform for high-volume machine data management, developed with partners including Accenture and Pivotal and running on cloud services from Amazon Web Services.

The idea, says GE’s top man, is to create a common architecture for industry to use for collecting, sorting, and using the data being collected from intelligent machines, sensors, and analytics programs. As machine data continues to grow almost exponentially, resources for handling and utilizing it are becoming stretched and strained.

To bolster this work, GE has expanded its Silicon Valley offices and invested heavily in Pivotal ($105 million), an enterprise platform-as-a-service company spun off from VMware and helmed by former VMware CEO and Microsoft executive Paul Maritz. New agreements with Amazon and Accenture further build on the deal.

For GE, this is about their forte: energy management and the data that goes with it. The company services nearly every industry on the planet, in one way or another, with products, services, or advice. In today’s world, all of that means a lot of data crunching, so expanding into this field is a definite leadership move on GE’s part.

Moving forward, we can expect to see a lot more of this as globalized industry shifts toward data management on larger and larger scales. In our automotive example above, a company like General Motors or Fiat will have factories producing engines, transmissions, and cars globally and simultaneously, all day, every day, in factories from Michigan to China to Ontario to France. And that’s just automotive.

GE is on to something.


Published by

James Burchill

James is a fan of practical "what" and "how to" information and enjoys showing you how to 'convert conversations into cash' using social media, online marketing and live events.