Live Business Insights with Complex Event Processing or What Is Happening Right Now in My Business

Issue 3, 2012    

The ever-increasing proliferation of mobile devices and the continued automation of core business functions are creating a massive flood of real-time data flowing through IT systems. In this real-time world, where milliseconds count, companies that can extract relevant business insights from these streams will have the competitive edge. Many companies are turning to Complex Event Processing (CEP), an innovative processing paradigm specifically designed to handle today’s onslaught of real-time data. This article provides an overview of CEP and of Software AG’s CEP product, webMethods Business Events.

Database systems and data warehouses are standard components of every IT landscape. They are ideal for running detailed analyses of large amounts of persistently stored data. However, today’s requirements increasingly exceed what database systems can handle: business-relevant data arrives faster, accumulates faster, and needs to be processed and analyzed faster than a database system can manage. This is where Complex Event Processing enters the playing field.


Complex Event Processing In a Nutshell
So what is the essence of CEP? Let us start with the “E” in CEP. An event is literally anything that happens, e.g., an airplane lands, a financial transaction takes place, a credit application is filed. An important characteristic of an event is that it carries temporal information, i.e., when the event occurred. Typically this temporal information is simply a timestamp, as in “plane LH123 landed at 2:31 pm” or “credit application AK4711 was rejected at 10:12 am”. In many situations, events continuously stream in at high frequency; we then speak of “event streams”.

The “P” refers to the processing and analysis of event streams. While the events stream in, we want to analyze them in a continuous fashion. The analysis may comprise correlating different streams, searching for patterns and trends, or computing summary statistics. We leverage the temporal information of the events by confining the analysis to a sliding temporal window. For example, you may want to continuously compute the number of planes that have landed in the last hour, or search for fraudulent credit card transactions where, within 10 minutes, more than three transactions have been triggered from different locations using the same credit card number. It is worth mentioning that CEP is tailored for short-term, intraday analysis; the window setting is typically in the range of minutes to hours. For long-term analysis, we still use our databases and data warehouses.
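To make the sliding-window idea concrete, here is a minimal sketch of the credit card pattern in plain Python. This is illustrative only, not the webMethods Business Events query language; the event shape and the `FraudDetector` name are our own assumptions.

```python
from collections import defaultdict, deque

class FraudDetector:
    """Sliding-window pattern detector for the credit card example above."""

    def __init__(self, window=10 * 60):  # window length in seconds
        self.window = window
        # Per card number: the recent (timestamp, location) pairs in the window.
        self.events = defaultdict(deque)

    def on_event(self, card, location, timestamp):
        """Consume one transaction event; return True if the pattern fires."""
        q = self.events[card]
        q.append((timestamp, location))
        # Evict events that have slid out of the 10-minute temporal window.
        while q and timestamp - q[0][0] > self.window:
            q.popleft()
        # Complex event: more than three transactions on the same card within
        # the window, coming from more than one location (a simplification of
        # "different locations").
        return len(q) > 3 and len({loc for _, loc in q}) > 1
```

Each incoming event is processed exactly once, and the window slides forward simply by evicting expired events from the front of the queue.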

Finally, “C” denotes the detection of a complex event that is the outcome of the previous analysis. We call such an analysis result complex because it is derived from several simple events such as credit card transactions. These complex events are continuously computed and continuously published. Thus, the huge number of incoming events is condensed to the information relevant for your business. You can think of this approach as a kind of fishing net: events stream in, and only those of interest are caught and analyzed.

CEP is a strong value-added tool that enables your business to:

  • React faster to customers and partners
  • Anticipate opportunities and threats
  • Identify patterns, trends, and exceptions
  • Correlate and analyze events spread across different systems
  • Continuously monitor KPIs and SLAs


Behind The Scenes of CEP
Let us now lift the lid on a Complex Event Processing engine. To do so, let us first recapitulate the main processing paradigm of database systems. A database system receives data as input and stores it persistently on disk, ideally in conjunction with smart indexing technologies. If we want to gain insights from the data, we pose an ad-hoc query that traverses the data and delivers the results we are looking for. So the data is persistent and the query is transient; we call this approach ‘store-and-analyze’. Complex Event Processing inverts that principle.

With Complex Event Processing, a continuous query is registered once and stays in the system. For each new event streaming in, the query is evaluated incrementally, and whenever a result is computed, it is published directly to subsequent consumers. Only those events that are required for query evaluation are kept in main memory; they are generally not stored on disk. For example, if the user wants to compute an average over a five-minute window, only the events of the last five minutes are maintained in main memory. Here the events are transient and the query is persistent. As this processing approach runs completely in main memory, a CEP system can deal with hundreds of thousands of events per second.
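The incremental evaluation described above can be sketched in plain Python. Again, this is an illustration of the principle, not the actual engine: a continuous five-minute average that keeps only the windowed events in main memory and updates a running sum per event instead of recomputing it from scratch.

```python
from collections import deque

class SlidingAverage:
    """Continuous query: average of event values over a sliding time window."""

    def __init__(self, window=5 * 60):  # window length in seconds
        self.window = window
        self.events = deque()  # (timestamp, value) pairs inside the window
        self.total = 0.0       # incrementally maintained running sum

    def on_event(self, timestamp, value):
        """Consume one event and publish the current windowed average."""
        self.events.append((timestamp, value))
        self.total += value
        # Expire events older than the window; adjust the sum incrementally.
        while self.events and timestamp - self.events[0][0] > self.window:
            _, old_value = self.events.popleft()
            self.total -= old_value
        return self.total / len(self.events)
```

Note that a result is produced on every incoming event, which is exactly the continuous-publication behavior described above.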
 

CEP with webMethods Business Events
webMethods Business Events, the CEP platform of Software AG, offers sophisticated CEP functionality which is seamlessly embedded into the webMethods Suite.
 

EDA and CEP
In the previous two issues of TECHniques (techcommunity.softwareag.com/techniques), Matt Green presented an overview of Event-Driven Architecture (EDA). This architecture is a holistic approach for Software AG’s complete webMethods suite that strives for the event-enablement of each component. EDA prepares the ground for a flexible and scalable environment that makes the integration of heterogeneous components possible. With each EDA participant speaking the event language, we get a backbone over which a multitude of different events can be processed. If EDA is the backbone, then CEP is the brain that extracts the relevant knowledge from the events flowing through it. CEP is a natural EDA participant that supports the event consumer/event producer pattern: the CEP engine subscribes to an event stream on the Event Bus, analyzes the stream, and publishes the result stream back to the Event Bus. Consumers of the results can update dashboards (as shown in Figure 1), feed reports, or automatically trigger follow-up actions and processes.

Core Components
The architecture of webMethods Business Events comprises the following core components:

  • Event Bus: The bus is the transport layer for the events. A native event input/output channel is also provided and is optimized for ultra-low latency processing.
     
  • Integration Server: The Integration Server (IS) is the runtime environment for the Event Server. New IS services allow events to be sent to and received from the Event Bus.
     
  • Event Server: The Event Server, which hosts the CEP engine, is the central component for processing and analyzing event streams. The continuous queries analyzing the streams are expressed in SQL equipped with corresponding temporal extensions. The query language has a well-defined underlying semantics that ensures deterministic and reproducible results.
     
  • Designer: The Designer supports the easy creation and management of CEP projects and their corresponding features such as Event Types or queries.
     
  • Visualization: ARIS MashZone is used for visualizing the analysis results in appealing live dashboards. Different visual components like tables, XY plots and bar charts allow for an intuitive presentation of your results as shown in Figure 2.

Conclusion
If you have to deal with an ever-increasing flood of events and want to derive live business insights from your event streams, take a look at webMethods Business Events.

Learn more about webMethods Business Events at techcommunity.softwareag.com/webmethods/products/business-events

Join the Software AG Tech Community today to access technical resources, download code samples, or participate in discussions about complex event processing and more.