Big Data Case Study – General Electric (GE)


Big Data Case Study: How GE is Building Big Data, Software and Analytics Capabilities for an “Industrial Internet”

General Electric Co. (GE) is known for making what it calls ‘big swings’ – large bets to grab the lead in emerging markets with enormous potential. The $145 billion manufacturer of jet engines, power plants, and locomotives believes Big Data and analytics is one of those markets. How big an opportunity GE sees in Big Data, how much and where it is investing, and how it is marshaling its expertise all hold lessons for many companies, even much smaller ones.

In 2011, the Fairfield, Connecticut-based industrial giant announced the launch of a global software center and a $1 billion investment to build software and expertise for GE’s version of Big Data analytics. “What we’ve done is centralized what I would call the real rocket science of Big Data,” says William Ruh, vice president and corporate officer of the center. “These are people with deep expertise and experience. We use them to develop the very most complex capabilities and reusable components.”

GE brought in Ruh from Cisco Systems in 2011 to lead the center, located in San Ramon, California. That’s a short drive from Silicon Valley, an emerging epicenter of Big Data startups and established technology companies. Ruh’s charter is to build a center that powers up software development and data science capabilities in GE’s Big Data domain of interest – a niche it refers to as ‘the industrial Internet’.

By this, the company means the torrent of digital data emanating from sensors and other digital devices embedded in machines such as GE’s jet engines, turbines, trains, and hospital MRI equipment. Revenue from those GE businesses totaled $94 billion in 2011, nearly two-thirds of total sales. Harnessing such data would enable GE to help customers identify maintenance problems before they occur, improve fuel efficiency, and make other operational improvements that could add up to trillions of dollars in savings. To put it succinctly, it’s about “making machines more intelligent and getting data to the right people in real time,” as Ruh explains.

The digital data that GE could collect through such sensors would be gigantic. Ruh says a typical GE gas turbine generates 500 gigabytes of data daily. With 12,000 of them in service, that “certainly becomes Big Data,” he explains. GE’s gas turbines and other utility equipment generate a quarter of the world’s electricity. But that isn’t the only sensor data the company has in mind. GE is designing its next wave of airline jet engines to eventually capture information on engine performance for every flight. This will result in GE gathering a huge amount of aircraft engine information – in fact, more in one year than in the 96-year history of its aircraft engine business.1 With this and other data it plans to collect and analyze, the word “Big” may be an understatement when GE says “Big Data.”
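The fleet-wide volume those figures imply can be checked with simple arithmetic. In the sketch below, the 500 GB/day and 12,000-unit numbers come from the article; the daily and annualized fleet totals are derived here, not GE’s own figures:

```python
# Back-of-the-envelope check of the gas turbine data volumes cited above.
GB_PER_TURBINE_PER_DAY = 500      # article figure
TURBINES_IN_SERVICE = 12_000      # article figure

daily_gb = GB_PER_TURBINE_PER_DAY * TURBINES_IN_SERVICE
daily_pb = daily_gb / 1_000_000   # 1 petabyte = 1,000,000 GB (decimal units)
yearly_pb = daily_pb * 365        # derived annualization, not from the article

print(f"Fleet output: {daily_gb:,} GB/day ≈ {daily_pb:.0f} PB/day")
print(f"Annualized: ≈ {yearly_pb:,.0f} PB/year")
```

At roughly 6 petabytes a day, the fleet would generate on the order of two exabytes a year from this one product line alone.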

Big Data Case Study: Where GE Sees the Payback

In 2012, GE CEO Jeffrey Immelt announced that the company would commit $1 billion to its analytics and software center over four years. That would put the company in the top 9% of our survey sample in Big Data and analytics investments.2

While a sizable amount, it’s a small down payment on what GE envisions as a $30 trillion opportunity by 2030. Using what it believes to be a conservative 1% savings in five sectors that buy its machinery (aviation, power, healthcare, rail, and oil and gas), a GE report3 estimates the savings from an industrial Internet for these sectors alone could be nearly $300 billion in the next 15 years. Take the aviation industry. A 1% boost in fuel efficiency would put $2 billion a year into airlines’ coffers.
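The aviation example can be sanity-checked with simple arithmetic. In the sketch below, the 1% gain and $2 billion-a-year figures come from the article; the implied industry fuel bill and the 15-year total are derived here, not GE’s own numbers:

```python
# Sanity check on the aviation fuel-efficiency example cited above.
SAVINGS_PER_YEAR_USD = 2e9   # $2B/year from a 1% efficiency gain (article figure)
EFFICIENCY_GAIN = 0.01       # 1% (article figure)
HORIZON_YEARS = 15           # the GE report's time frame

# If 1% of the fuel bill is $2B, the implied annual fuel spend is ~$200B.
implied_fuel_bill = SAVINGS_PER_YEAR_USD / EFFICIENCY_GAIN
# Held flat over the report's horizon, the gain compounds to ~$30B.
total_savings = SAVINGS_PER_YEAR_USD * HORIZON_YEARS

print(f"Implied annual fuel spend: ${implied_fuel_bill / 1e9:.0f}B")
print(f"15-year savings at $2B/yr: ${total_savings / 1e9:.0f}B")
```

A single sector’s $30 billion is consistent in scale with the report’s estimate of nearly $300 billion across all five sectors.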

A growing percentage of GE’s business is services that support its industrial products – services that help customers use GE’s machines more effectively and efficiently. Providing insights based on Big Data will be one more service offering.

“We believe this is a foundational change, in the same way that the consumer Internet has remade consumer industries in the past decade,” Ruh told us. “Who does Walmart view as its competitor? Amazon. Who does American Express see as its competitor? PayPal. In the next 10 years, the changes that we saw in the consumer Internet will happen in the industrial world.”

Big Data Case Study: Why GE Has Centralized Big Data and Analytics

Today the center has a staff of about 300, up from just two people (Ruh and his executive assistant) in late 2011. Not all of them are in San Ramon, however. Ruh has employees located around the globe – in Bangalore, New York, and Cambridge – who report into the center. The company plans to staff up to about 1,000 people at the San Ramon facility, which is between San Francisco and Silicon Valley. Their work will support the efforts of another 9,000 GE software engineers who operate in its various product businesses globally.

Among the center’s 300 staff are a number of ‘hardcore data scientists,’ as Ruh refers to them. Why centralize these people? Ruh said it came down to three factors:

  • An acute shortage of talent. “The first reason for centralization is that there is only a limited amount of talent – and I actually mean extraordinarily limited,” said Ruh, who added that his center hires fewer than 5% of job applicants. “You can find a lot of people who are subject matter experts, basically analysts. You can find a lot of people who do business intelligence. You can find programmers. But the fact is that you’re still probably not getting a whole lot out of your data other than reporting. The in-depth data science and deep analytics capabilities are held by such a small number of people.”
  • Employee retention. Data scientists at GE will need a career path if they’re to stay for long. “They need to feel coupled and placed into a leadership program where they can get promoted based on their capabilities. When you put them in the businesses, their ability to grow, get promoted, and take on increasingly bigger roles is limited in many ways by the hierarchical structure of those businesses.”
  • Reusability in technology. “We need to build high-end capabilities [in solving deep technical problems],” Ruh said. “That cannot be built by each group over and over again. The reason is you can’t find the talent, you can’t maintain it, and so on. We believe this idea of reuse is going to differentiate the winners from the losers.”

The center has begun to organize employees into disciplines such as machine learning, statistics and operations research. “They are very different approaches, and there isn’t one approach that solves every problem,” Ruh said.

The keys to success for GE software and analytics? Ruh boils it down to this: continually bringing to market a portfolio of compelling new service offerings. To be attractive, those services must help airlines, electric utilities, hospitals, and other customers tap GE’s Big Data expertise to generate big savings and other operational improvements.

“In the end, our service offerings must foundationally improve how our customers manage, operate and maintain these big machines,” Ruh concluded. “If we do that, we will be a leader in helping bring about this industrial Internet.”

 




 

  1. Jessica Leber, “General Electric Pitches an Industrial Internet,” MIT Technology Review, Nov. 28, 2012. The estimate came from a GE researcher she talked to in the San Ramon center.
  2. 9% of the companies surveyed online (and which provided enterprise data) said they spent at least $250 million on Big Data in 2012.
  3. GE research report “Industrial Internet: Pushing the Boundaries of Minds and Machines,” by Peter C. Evans and Marco Annunziata, published Nov. 26, 2012.
