This is the second Open Group conference that focuses on the topic of Big Data. This is an architecture style that is getting a great deal of attention lately. With the emergence of social media and the explosion of data coming from devices, there is a surge of opportunities for companies to monetize the data that is generated in the public domain. A great example of a tech company doing this today is Google. Google generates the vast majority of its revenue from marketing data, not its technology. Other companies want a piece of this pie.
Above is an IBM-created infographic that I think sums up the opportunities for companies.
With Big Data as such a value driver for our companies, it's important for us to understand what this new technology enables, but also to be cautious not to abuse this new architecture style.
The first two keynotes of Day 1 covered the business opportunities for Big Data and the ways to make it interoperable in the enterprise.
Big Data at NASA
Chris Gerty presented the views and uses of Big Data at NASA. What a great way to kick off the conference. At least for me, it's always good to start a conference with spaceships, distant galaxies, and the Mars rover.
Outside the pure science geek factor, this was a great presentation. Chris showed the direct impact of architecture, or more specifically information architecture, on delivering truly compelling results. I liked that Chris didn't call out EA specifically but rather talked about the value that this new architecture style enables.
NASA is on the cutting edge of technology. What a refreshing view of the government. They are doing everything from open-source solutions to democratizing information to create some really interesting crowdsourced applications. They even stood up a private cloud inside NASA before the public cloud really emerged as an option. They then evolved to a public cloud infrastructure for their Big Data activities.
The area that Chris talked about that I think has a lot of potential is the notion of context-aware solutions. This is a Gartner term that has been used for a couple of years to describe getting data from devices. NASA is looking at this to get a better understanding of their Big Data. The assertion here is that Big Data is often "context-less," and when you bring in inputs from other sources you get truly meaningful information. I believe this assertion hits the nail on the head.
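To make that idea concrete, here is a minimal sketch in Python. The field names, sensors, and values are hypothetical and purely illustrative, not anything NASA described: a raw, high-volume stream of readings carries no context on its own, and joining it with a second, smaller source of contextual metadata turns the raw numbers into records you can actually interpret.

```python
# Illustrative sketch: enriching "context-less" readings with context from another source.
# All field names and values are hypothetical.

# Raw, high-volume telemetry: just a sensor id, a timestamp, and a number.
raw_readings = [
    {"sensor_id": "S-17", "ts": "2013-01-28T09:00:00Z", "value": 412.7},
    {"sensor_id": "S-42", "ts": "2013-01-28T09:00:00Z", "value": 3.1},
]

# A second, much smaller input that carries the context the raw stream lacks.
sensor_context = {
    "S-17": {"instrument": "spectrometer", "platform": "rover", "unit": "counts/sec"},
    "S-42": {"instrument": "thermometer", "platform": "orbiter", "unit": "deg C"},
}

def add_context(readings, context):
    """Join each raw reading with its contextual metadata."""
    for r in readings:
        ctx = context.get(r["sensor_id"], {})
        # The merged record is interpretable on its own; the raw value alone was not.
        yield {**r, **ctx}

for record in add_context(raw_readings, sensor_context):
    print(record)
```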
Key Takeaways
There were three core takeaways from the session. Below I have provided a bit of commentary on those takeaways to add insights from a pure EA perspective.
Democratize Information - "we believe that oneness, collaboration and collective insights are the pathways forward to solving humanity's toughest challenges". I thought that this was a very thought-provoking statement, not only as applied to NASA but as a lesson for EA in general. I think of the quote from Aristotle, "The whole is greater than the sum of its parts."
Look for Opportunities for Big Data - be creative and experiment. In his talk we learned about all the insights you can gain in unexpected ways. I think there should most certainly be an innovation piece to EA; some EA organizations have it, but there are still a lot that don't. When creating an architecture strategy for your company, it is important that EA can get in front of the challenges facing the business while also exploring new business opportunities.
Involve Everyone - While this was a NASA-specific ask, it does apply to Big Data architectures in general. This is a lot like the first takeaway.
Bringing Order to Chaos
David Potter and Ron Schuldt talked about the work The Open Group is doing to evolve its Quantum Lifecycle Management (QLM) standard and the complementary Universal Data Element Framework (UDEF) standard.
This session complemented the Big Data architecture style, showing how these standards can be leveraged to provide a consistent method for tagging and exchanging information about anything.
Quantum Lifecycle Management (QLM)
Quantum Lifecycle Management (QLM) is a body of work started in 2010, based on the EU-funded PROMISE project from 2005, and is an information lifecycle management standard. It is a model that describes how to optimally collect and manage data in Big Data oriented solutions where data is fed from multiple different sources.
In Open Group terms QLM has the following characteristics:
Quantum Lifecycle Management is the next leap beyond Product Lifecycle Management (PLM)
Closing instance-level information loops across all phases of all kinds of lifecycles
Developing an open, trustworthy and secure information exchange for whole-of-life lifecycle management.
Enabling Boundaryless Information Flow™ to reach trillions of autonomous objects in the “Internet of Things”, making it a reality
You can find more information on QLM here:
Universal Data Element Framework (UDEF)
The UDEF is based on the concepts of International Standard 11179, and is integrated with the World-Wide Web Consortium’s Resource Description Framework (RDF). But it is less complicated than these standards. It is designed for use by the people that understand an enterprise’s business operations, rather than specialists in semantic technology.
Using a simple process, you can assign an index to any piece of data, based on the core UDEF vocabulary and imported vocabularies. This index will be the same as that assigned by other UDEF practitioners in your enterprise and in other enterprises. This makes it easy to relate new information to information that you already have stored, which can significantly reduce the cost of configuring and programming interface software.
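As a rough illustration of why a shared index helps, here is a small Python sketch. The concept identifiers below are placeholders, not actual UDEF IDs, and the field names are hypothetical; the point is simply that two systems with different local column names can be related automatically once both map their fields to the same shared vocabulary.

```python
# Illustrative sketch of indexing fields against a shared vocabulary.
# The identifiers below are placeholders, NOT real UDEF IDs.

# Each system maps its own local field names to a shared concept index.
system_a_index = {
    "cust_nm": "person.name",        # placeholder concept id
    "cust_dob": "person.birth-date",
}
system_b_index = {
    "customerFullName": "person.name",
    "dateOfBirth": "person.birth-date",
}

def build_field_mapping(source_index, target_index):
    """Relate source fields to target fields via the shared concept index."""
    by_concept = {concept: field for field, concept in target_index.items()}
    return {field: by_concept[concept]
            for field, concept in source_index.items()
            if concept in by_concept}

# Because both systems tagged their data against the same vocabulary,
# the interface mapping falls out automatically instead of being hand-coded.
print(build_field_mapping(system_a_index, system_b_index))
# {'cust_nm': 'customerFullName', 'cust_dob': 'dateOfBirth'}
```

This is the kind of saving the standard is after: the mapping between systems comes from the shared index rather than from custom interface code written for each pair of systems.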
You can find more information on UDEF here:
GE was used as a key case study for the Big Data movement. Below is an overview from Forbes.com on what was described in the session:
When it comes to big data, GE is playing catch-up to IBM. GE is counting on its expertise making industrial equipment—from gas-fired electrical turbines to locomotives—to give it an advantage over rivals focused on exclusively providing data solutions, says Ruh. “If you don’t have deep expertise in how energy is distributed or generated, if you don’t understand how a power plant runs, you’re not really going to be able to build an analytical model and do much with it,” he says. “We have deep insight into several very specific areas. And that’s where we’re staying focused.”