
Next Generation Data Management


Data awareness: the critical gap for today’s systems

Although there has been a sea of changes in the software industry over the last 30 years, there has been no major change in data management since the introduction of the relational database management system (RDBMS) in the 1970s. The world has changed drastically since then. We have orders of magnitude more data, arriving at much faster rates, from more sources. Applications that depend on this data have proliferated, reflecting the needs of the business for faster and more immediate access to information. The relationships among those applications have grown as one business process affects another, requiring the applications to share data in real time.

Modern relational databases have resolved many of the Achilles’ heels that they either introduced or suffered from in their early stages. They now provide mechanisms for high availability, clustering and fault tolerance. They can replicate data to peer databases around the world. However, a few problems remain. First, relational databases are a good way to achieve data integration but a poor way to achieve process integration (more on this later). Second, using features such as ‘triggers’, they may be able to detect ‘events’ (changes in data that some application may be interested in), but they are traditionally poor at distributing those events back out to the client tier. Third, they neither store nor present data to the client in a ‘ready-to-use’ format for most applications. Multiple layers of translation, transformation, memory mapping and allocation, network I/O and disk I/O must occur for the simplest of queries to return the simplest of responses. As our use of the RDBMS has grown over time, we have come to depend on it to share data, but it was really only designed to store data.

In an attempt to break down stovepipe systems, there has been a move to Service Oriented Architectures (SOA). SOA helps organizations reuse individual components of a business process and makes it easier to adapt their overall processes to changing business needs. SOA enables organizations to quickly build new business workflows. However, SOA still fundamentally leaves business processes as stovepipes, and it operates on the basic assumption that the components are completely independent. SOA does not address the real-time interdependencies in the data that the processes share.

In an attempt to get a comprehensive view of data, large organizations are building data warehouses and online/real-time dashboards, so that senior management can see the big picture and drill into critical details. Most dashboard and data warehouse solutions pull a copy of the operational data together (usually into a new dimensional format), leaving the original data in place. The operational applications cannot take advantage of this combined data view, and data warehousing does nothing to solve the real-time data interdependencies between applications where business processes intersect. The missing link is ‘data awareness’.

Consider as an example the way that mission planning applications (such as JTT or JMPS) depend on data from Battle Damage Assessment (BDA) systems, enemy order of battle (MIDB), situation reports, etc. Let us examine today’s process flow from the mission planner’s perspective, and how changes to the sources he works with affect the work and the mission.
  1. The mission planner(s) start work designing missions to destroy enemy targets (bridges, bunkers, SAM batteries, etc.).
  2. They pull in data from other systems (BDA, MIDB). Whether they use an SOA-based process or not has no real impact on the result, only on how tightly coupled one system is to another.
  3. If one second later there is an update to the BDA system or MIDB, the mission planner is left unaware. He continues to plan to destroy a target that may already be destroyed, or plans a mission with inadequate resources due to a change at the target location (a new SAM battery, additional enemy forces, etc.).
  4. The mission planner(s) pull in data from other systems as a final check before releasing the plan. They make adjustments to the plan and release it for execution.
  5. If one second later there is an update to the BDA system or MIDB, the mission planner is again unaware. The executor of the mission will have to deal with it at run time: rerouting to another target, hitting the wrong target, or encountering unexpected enemy resistance.

How could this be different? Enter GemFire Enterprise, the next generation in data management. GemFire combines:

  • Distributed Caching
  • Messaging & Active Event Notification
  • Active/Continuous Querying
  • Traditional Querying
  • Support for users/applications on disadvantaged or periodically disconnected networks
  • High Availability and some degree of Fault Tolerance
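
To make this concrete, here is a minimal sketch of a client tapping GemFire’s active event and continuous-query features. It assumes a recent GemFire Java client API; the locator address, region name and the status field on the cached entries are illustrative, not part of any fielded system.

    import com.gemstone.gemfire.cache.client.*;
    import com.gemstone.gemfire.cache.query.*;
    import com.gemstone.gemfire.cache.util.CqListenerAdapter;

    public class TargetWatcher {
        public static void main(String[] args) throws Exception {
            // Connect to the distributed cache via a locator; the subscription
            // channel lets server-side events reach this client.
            ClientCache cache = new ClientCacheFactory()
                .addPoolLocator("locator-host", 10334)   // illustrative address
                .setPoolSubscriptionEnabled(true)
                .create();

            // A continuous query: the server runs the OQL query once for the
            // initial result set, then pushes an event whenever a later change
            // to the /targets region adds, updates or removes a matching entry.
            CqAttributesFactory caf = new CqAttributesFactory();
            caf.addCqListener(new CqListenerAdapter() {
                @Override
                public void onEvent(CqEvent e) {
                    System.out.println("Target " + e.getKey()
                        + " changed: " + e.getNewValue());
                }
            });
            CqQuery cq = cache.getQueryService().newCq("activeTargets",
                "SELECT * FROM /targets t WHERE t.status = 'ACTIVE'",
                caf.create());
            cq.executeWithInitialResults();  // seed the view, then stream deltas
        }
    }

The same query that would have been a one-shot pull against an RDBMS becomes a standing subscription: the view is seeded once and then kept current by the cache.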

What if the mission planning process flow tapped into a data aware, next generation data management system such as GemFire?

(Assumption: no changes are made to the underlying databases, but all RDBMS changes are pushed through, or sent to, GemFire. A small change is made to the application to let it register interest and receive data updates via GemFire’s event mechanism or its Continuous Query feature.)
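
That ‘small change’ amounts to a few lines against the client region. The sketch below shows the register-interest/event path, complementing the continuous-query sketch above; the same caveats apply, and the replan() hook is hypothetical.

    import com.gemstone.gemfire.cache.EntryEvent;
    import com.gemstone.gemfire.cache.Region;
    import com.gemstone.gemfire.cache.client.*;
    import com.gemstone.gemfire.cache.util.CacheListenerAdapter;

    public class PlannerFeed {
        public static void main(String[] args) {
            ClientCache cache = new ClientCacheFactory()
                .addPoolLocator("locator-host", 10334)   // illustrative address
                .setPoolSubscriptionEnabled(true)
                .create();

            // CACHING_PROXY keeps a local copy of each entry this client sees,
            // so reads stay in-process while changes still arrive from servers.
            Region<String, Object> targets = cache
                .<String, Object>createClientRegionFactory(ClientRegionShortcut.CACHING_PROXY)
                .addCacheListener(new CacheListenerAdapter<String, Object>() {
                    @Override
                    public void afterCreate(EntryEvent<String, Object> e) { replan(e); }
                    @Override
                    public void afterUpdate(EntryEvent<String, Object> e) { replan(e); }
                })
                .create("targets");

            // Ask the servers to push every create/update/destroy on the region.
            targets.registerInterest("ALL_KEYS");
        }

        // Hypothetical hook: fold the changed entry back into the working plan.
        static void replan(EntryEvent<String, Object> e) {
            System.out.println("Replan around " + e.getKey() + " -> " + e.getNewValue());
        }
    }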

  1. The mission planner(s) start work designing missions to destroy enemy targets (bridges, bunkers, SAM batteries, etc.).
  2. They pull in data from other systems (BDA, MIDB) through GemFire.
  3. If one second later there is an update to the BDA system or MIDB, the mission planner is informed of the change immediately. He continues to plan, but can make immediate changes to the plan. If for some reason he is off the network, he will be informed of all changes and additions as soon as he is back on the network (see the durable-subscription sketch after this list).
  4. The mission planner(s) get approval and release the plan.
  5. One second later there is an update to the BDA system or MIDB. The executor of the mission (and potentially the planner and others) is immediately notified, so that appropriate changes can be made before the executor reaches the target.
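
The off-network behavior in step 3 maps to GemFire’s durable client subscriptions: the servers queue events for a named client while it is disconnected and replay them on reconnect. A minimal sketch under the same assumptions as above; the client id and 24-hour timeout are illustrative.

    import com.gemstone.gemfire.cache.InterestResultPolicy;
    import com.gemstone.gemfire.cache.Region;
    import com.gemstone.gemfire.cache.client.*;

    public class DurablePlanner {
        public static void main(String[] args) {
            // A durable client id tells the servers to keep this client's
            // event queue alive (here up to 86400 s) if the connection drops.
            ClientCache cache = new ClientCacheFactory()
                .set("durable-client-id", "planner-7")    // illustrative id
                .set("durable-client-timeout", "86400")
                .addPoolLocator("locator-host", 10334)
                .setPoolSubscriptionEnabled(true)
                .create();

            Region<String, Object> targets = cache
                .<String, Object>createClientRegionFactory(ClientRegionShortcut.CACHING_PROXY)
                .create("targets");

            // Durable interest: events are queued while disconnected and
            // replayed, in order, when the client comes back.
            targets.registerInterest("ALL_KEYS", InterestResultPolicy.KEYS_VALUES, true);

            // Signal readiness; any queued events now flow in.
            cache.readyForEvents();
        }
    }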

The interdependency between applications on data, and on changes to that data, has serious impacts on mission-critical processes. The way data management is done in enterprise applications today is over 40 years old and simply cannot provide many of the critical features needed to build today’s high-performance, cross-organization applications. It is time to consider enhancing your systems’ data management capability.