Published in collaboration with NCMS
Digital Manufacturing Report

News & information about the fast-moving world of digital manufacturing, modeling & simulation


GE Looks to Hadoop for the Industrial Internet


With a name as big as “The Industrial Internet,” it shouldn't come as a surprise that implementing the sensor-filled, waste-reducing, machine-optimizing network will require a great deal of research and development from its incubator, General Electric. While we've already learned about the vast array of sensor technologies and operational dashboards in place at GE's Schenectady battery plant, what's been missing thus far is the powerful analytics that would turn what is currently a reactive system into a predictive tool that heads off failures on the factory floor.

This week, the industrial giant announced such a platform at the D: All Things Digital conference in San Francisco, California. The Hadoop-based big data and analytics platform will include expanded partnerships with Accenture and Pivotal, as well as a new partnership with Amazon Web Services, which will provide cloud storage. Together, they will serve as the infrastructure and analytics behind GE’s push for the Industrial Internet.

Thus far the spotlight has fallen on GE's efforts to outfit equipment with the sensors and interconnects necessary to gather and transmit key data from the factory floor and beyond. But with this latest announcement, GE said it will provide "real-time data management, analytics, and machine-to-operations connectivity in a secure, closed-loop architecture so critical global industries can move from a reactive to a predictive industrial operating model."

Specifically, the toolkit has been designed to optimize equipment for longevity, energy consumption, and throughput, and to predict when a part must be replaced in order to avoid failure. GE addressed these goals by breaking its analytics into two roles: asset health and process performance.

Brian Courtney, General Manager of GE Intelligent Platforms' Industrial Data Intelligence Software group, explained that the goal of asset health is to go beyond merely predicting equipment failure 90 minutes ahead of time to instead accurately forecast failures months in advance. For some industries this means significant reductions in downtime and a boost in productivity, but for others such as aerospace and energy, the power to predict failure in a jet engine or across the power grid could be life-saving.
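To make that concrete, the sketch below shows, in a few lines of Python, the basic shape of such an asset-health forecast: fit a trend to a degradation indicator and extrapolate to the day it would cross an alarm threshold. The vibration signal, threshold, and linear-degradation assumption are illustrative only, not drawn from GE's SmartSignal models.

```python
# A minimal sketch of an asset-health forecast: fit a trend to a synthetic
# degradation indicator (bearing vibration) and extrapolate to the point
# where it crosses an alarm threshold. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(180)                                            # six months of daily readings
vibration = 2.0 + 0.01 * days + rng.normal(0, 0.05, days.size)   # slow upward drift plus noise
ALARM_THRESHOLD = 4.5                                            # hypothetical alarm level

slope, intercept = np.polyfit(days, vibration, 1)                # linear degradation trend
if slope > 0:
    crossing_day = (ALARM_THRESHOLD - intercept) / slope
    lead_time = crossing_day - days[-1]
    print(f"Projected threshold crossing in ~{lead_time:.0f} days")
else:
    print("No upward degradation trend detected")
```

Even this toy model yields a lead time measured in weeks rather than minutes, which is the kind of horizon Courtney describes, though production models account for far more than a single linear trend.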

Meanwhile, process performance analytics will help tune equipment and processes to deliver the best possible machine output under current conditions.

Today GE offers data collection through Historian, predictive analytics for condition-based monitoring through SmartSignal, and process-level analytics through CSense. Now that GE has been gathering data across its businesses using these tools, Courtney said, the remaining question was how to tie data and analytics together in a useful way.

The answer was GE's newly released Monitoring and Diagnostics Suite, which combines those products with several new offerings: Proficy Knowledge Center and the Hadoop-based Proficy Historian HD. Together they allow for big data storage, process visibility, asset health assessment and process optimization.

Knowledge Center is a model-driven, browser-based visualization application designed for data mining on an asset, letting users view asset and process health alongside the corresponding analytics, such as advisories and the predictions currently available based on past failures.
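In spirit, that means joining per-asset health indicators with the advisories generated for the same asset. The toy join below, using an invented schema rather than GE's actual table layouts, shows the shape of that per-asset view:

```python
import pandas as pd

# Invented, minimal schema purely to illustrate the kind of per-asset view
# described above; these are not GE data structures.
health = pd.DataFrame({
    "asset_id": ["turbine_07", "turbine_08"],
    "health_score": [0.62, 0.94],          # lower = closer to predicted failure
})
advisories = pd.DataFrame({
    "asset_id": ["turbine_07"],
    "advisory": ["Bearing vibration trending toward alarm; inspect within 30 days"],
})

# One row per asset, with any open advisories attached alongside its health score.
view = health.merge(advisories, on="asset_id", how="left")
print(view)
```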

“Many large-scale manufacturers have so much data that the first thing they do is ignore it,” explained Courtney, which ultimately leads to data being overlooked that could have been used to predict equipment failures. Historian HD is expected to mitigate this problem because it offers the elasticity of the cloud, whereby manufacturers can simply add nodes to their Hadoop cluster in order to accommodate a growing data set.

This has enabled GE to expand into petascale storage, something that wasn't possible before the company turned to Hadoop and that becomes necessary when, like GE, you are processing 5 terabytes of data per day.
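Some quick arithmetic shows how fast that ingest rate reaches petascale. The replication factor and per-node capacity below are illustrative assumptions, not GE figures:

```python
# Back-of-the-envelope arithmetic for the petascale claim, assuming the
# 5 TB/day ingest rate quoted above. Replication factor and per-node
# capacity are illustrative assumptions.
TB_PER_DAY = 5
REPLICATION = 3           # default HDFS replication factor
NODE_CAPACITY_TB = 48     # hypothetical usable storage per Hadoop node

for years in (1, 2, 3):
    raw_tb = TB_PER_DAY * 365 * years
    stored_tb = raw_tb * REPLICATION
    nodes = -(-stored_tb // NODE_CAPACITY_TB)   # ceiling division
    print(f"{years} yr: {raw_tb / 1000:.1f} PB raw, "
          f"{stored_tb / 1000:.1f} PB on disk, ~{nodes} nodes")
```

At 5 terabytes a day the raw archive alone approaches two petabytes a year, which is why the ability to simply add nodes to the cluster matters.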

“We use the software ourselves in our own Monitoring & Diagnostics centers to manage trillions of dollars in asset value,” Courtney explained. “Today, in the GE Industrial Performance and Reliability Center, GE engineers monitor thousands of mission critical assets for our customers to ensure uptime, asset reliability and overall production throughput.”

Courtney said that GE had to make some changes to how Hadoop works so that it understands data collected at regular intervals, but all the Hadoop applications that sit on top of the data still work, meaning that those working with R, Pig Latin and Hive aren't out of luck.
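As a rough illustration of the kind of regular-interval analysis those tools enable, the sketch below runs an hourly roll-up and an excursion check in Python on a hypothetical one-day extract. In production the same logic would typically be expressed in Hive or Pig against the cluster, and the tag name, sample rate, and schema here are invented for the example.

```python
import numpy as np
import pandas as pd

# Hypothetical one-day extract of a regularly sampled sensor tag
# (tag name, sample rate, and values are invented for illustration).
idx = pd.date_range("2013-06-01", periods=24 * 60, freq="min")   # 1-minute intervals
readings = pd.DataFrame(
    {"value": 540 + np.random.default_rng(1).normal(0, 3, idx.size)},
    index=idx,
)

# Hourly roll-up: the sort of aggregation a Hive query would express on the cluster.
hourly = readings["value"].resample("1h").agg(["mean", "max"])

# Flag readings more than 10 degrees above the 30-minute rolling mean.
excursions = readings[readings["value"] > readings["value"].rolling("30min").mean() + 10]

print(hourly.head())
print(f"{len(excursions)} excursions flagged")
```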

Jeff Immelt, CEO of GE, described advanced analytics such as these as the foundation of GE's future. While he maintained that “GE will never become a software company,” he did say that investing in analytics “will be the only way an industrial company can guarantee that the products it sells will be successful.”
