An enterprise tool for Capital Markets
Issue 4, 2013
Since coming onto the scene in 2003, Complex Event Processing (CEP) has quickly become an enterprise tool suitable for organizations seeking to take action in real-time based on insights across large volumes of fast-moving data from multiple sources. Learn if a CEP engine is right for your business.
CEP: the enterprise tool
The term “Complex Event Processing” was first coined by Apama, creator of the world’s first real-time event processing engine, which grew out of a research project at Cambridge University. CEP became commercially available in 2003, when Apama signed its first customer. Since then, CEP has moved from an edge tool for algorithmic trading to an indispensable platform. No longer a niche product, it is now used across capital markets to power event-driven, real-time applications in the front and middle office.
Some popular uses of CEP with capital market applications in the front office include:
- Algorithmic trading
- Smart order routers
- Order management
- Matching engines
- Pre-trade risk
- Client analytics
- Algorithm back testing
Popular uses of CEP for the middle office include:
- Pre-trade and post-trade risk management
- Fraud detection
- Rogue trader detection
- Rogue algorithm detection
- Market or trade surveillance
- Anti-Money Laundering (AML)
CEP is useful in applications that make decisions and take action based on analysis of large volumes of fast-moving data. There may be multiple streams and types of data. Volume, velocity, variety and speed of correlation are all parts of the equation. Essentially, CEP is perfectly suited to handle and act on big data in motion across the enterprise.
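As a hypothetical sketch of that idea, the fragment below watches a stream of price ticks and fires an alert when a sharp move occurs inside a sliding time window. The class name, threshold and data are illustrative only; a real CEP engine such as Apama would express this kind of pattern in its own event language rather than Python.

```python
from collections import deque

class SpikeDetector:
    """Toy CEP-style pattern: flag any symbol whose price moves more than
    `threshold_pct` percent within a sliding time window (illustrative only)."""

    def __init__(self, threshold_pct=1.0, window_secs=5.0):
        self.threshold = threshold_pct
        self.window = window_secs
        self.ticks = {}  # symbol -> deque of (timestamp, price)

    def on_tick(self, symbol, price, ts):
        q = self.ticks.setdefault(symbol, deque())
        # Evict ticks that have fallen out of the sliding window
        while q and ts - q[0][0] > self.window:
            q.popleft()
        q.append((ts, price))
        lo = min(p for _, p in q)
        hi = max(p for _, p in q)
        # Fire an "action" (here, just return an alert) on a big move
        if lo > 0 and (hi - lo) / lo * 100 >= self.threshold:
            return ("ALERT", symbol, lo, hi)
        return None

d = SpikeDetector(threshold_pct=1.0, window_secs=5.0)
d.on_tick("EURUSD", 1.3000, ts=0.0)            # no alert: only one tick
small = d.on_tick("EURUSD", 1.3005, ts=1.0)    # ~0.04% move: no alert
alert = d.on_tick("EURUSD", 1.3150, ts=2.0)    # >1% move inside the window
```

The point of the sketch is the shape of the problem: state is kept per stream, every incoming event is correlated against recent history, and the reaction happens in-line with the data rather than in a batch job afterwards.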
Today, Apama is the market-leading real-time analytics platform with more than 150 deployments worldwide in use cases as diverse as algorithmic trading, risk & compliance, FX e-commerce, fraud detection, real-time marketing and more.
Apama addresses four key industry drivers: 1) the need to extract value from big data in motion; 2) the need to be proactive, not reactive; 3) the need to massively scale real-time applications; and 4) the need for location-based and context-aware applications.
In addressing these drivers, Apama offers:
- Exceptional performance at scale, in terms of events per second, the number of operations performed on those events and the complexity of those operations (it’s not all about events per second!)
- The agility of a flexible platform to rapidly develop and evolve IT solutions in the face of constant change
- Rapid time to value and reduced total cost of ownership through use of one technology platform across multiple business lines and operations (do more with less!)
An Apama opportunity typically addresses many, if not all, of the following:
- Real-time big data where milliseconds (and microseconds) matter
- Operations are performed on multiple streams and types of data
- Operations are updated frequently as businesses develop and evolve
- A real-time (re)action is required and the window of opportunity is small
Apama in the real world
Presagium LLC, a New York-based quantitative fund for which computational performance is critical, recently discussed the trading system performance improvements it has achieved with Apama. Here is a brief excerpt of that discussion, provided by Greg Frank of Presagium on the Cap Markets blog.
Our strategies rely on execution of complex quantitative analysis for portfolio rebalancing and not pure speed alone. We have been using Apama and its out-of-the-box integration with Matlab for some time now to develop and deploy in-house strategies.
We have always found the Capital Markets Foundation (CMF) of Apama a useful library of services to accelerate the development and deployment of new strategies. However, we see the latest version of Apama and CMF as going further to 1) help overcome a fundamental threading challenge with Matlab and 2) offer options to increase performance by migrating some code directly to Apama, thereby reducing calls to external components.
Matlab, a well-known and heavily used tool among quants in the financial industry, is often used for prototyping algorithms that rely on complex calculations. By combining Apama's strategy development environment with its ready-made Matlab integration, we can use Matlab directly in production without having to migrate those functions into another technology, a significant advantage in reducing the lead time for new or updated algos [algorithms].
However, Matlab on its own provides only synchronous calls to its library, meaning the application thread in the calling process (Apama, in this case) is blocked until the Matlab function completes. This could cause market data handling within Apama to fall behind while Matlab finished its calculations, and slippage could occur on final trade execution. The latest release of the CMF gets around this by allowing asynchronous Matlab calls from multiple Apama application threads, so the calling application is not blocked. Our Apama application can keep processing market data and other activities while Matlab is working, and we avoid analyzing or trading on potentially stale quotes. This feature has delivered one of the biggest performance improvements to our system to date.
The other thing we are excited about, also related to performance, is Apama's recent release incorporating the Low Level Virtual Machine (LLVM) compiler engine. Apama’s Event Processing Language (EPL) now compiles and runs as native machine code optimized for the exact hardware it executes on. Software AG tells us that as a result, EPL code is likely to execute at the same or faster speeds than equivalent C++ or Java implementations and significantly faster than a call to an external component. Since the majority of our real-time application code is in Apama and we are now confident that it can churn complex calculations on par with analytics libraries, we are now considering migrating some of these custom functions into EPL. This migration not only offers a performance boost but also eases maintenance through standardizing on a common technology platform.
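The asynchronous-call pattern Presagium describes can be sketched in a few lines: a slow analytic is handed to a worker pool, a callback fires when it completes, and the main event loop keeps consuming market data in the meantime. All names here are illustrative stand-ins, not the actual Apama/CMF API; the real mechanism dispatches Matlab calls from Apama application threads.

```python
import concurrent.futures
import time

pool = concurrent.futures.ThreadPoolExecutor(max_workers=4)

def slow_analytic(prices):
    """Stand-in for a blocking Matlab calculation."""
    time.sleep(0.05)
    return sum(prices) / len(prices)

def on_result(future):
    # Callback runs on completion; the main loop was never blocked
    print(f"rebalance signal, avg={future.result():.2f}")

future = pool.submit(slow_analytic, [101.0, 99.0, 100.0])
future.add_done_callback(on_result)

# The main "event loop" keeps handling market data while the analytic runs
ticks_processed = 0
while not future.done():
    ticks_processed += 1  # stands in for handling the next market-data event

pool.shutdown(wait=True)
```

With the synchronous variant, every tick arriving during the 50 ms calculation would queue up unprocessed; with the asynchronous one, the loop keeps running and the result is folded in when it arrives.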
Using CEP for big data
Software AG had a grander vision in mind when it added Apama to its portfolio. The combination of CEP, an in-memory database and low-latency messaging offers a complete solution to extracting maximum value from fast-moving big data.
For example, CEP can easily make decisions and kick off actions based on the data inside the event being processed. But what if we need to enrich that event with data from a database while keeping performance high and latency low?
A call to a traditional database could render the solution useless as the window of opportunity to act may well have passed. Integration with Terracotta can, for example, augment real-time trading data with historical trends to give a better measure of risk, or augment an ATM transaction with known customer location data to give a better measure of likelihood of fraud—before the ATM withdrawal is approved!
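The enrichment step in the ATM example can be sketched as a lookup against reference data held in memory. The dictionary below is a hypothetical stand-in for the cache (the article's example uses Terracotta), and all field names and data are invented for illustration.

```python
# Preloaded reference data, held in memory (stand-in for a distributed cache)
customer_home_city = {
    "C123": "London",
    "C456": "New York",
}

def score_withdrawal(event):
    """Enrich an ATM event with cached customer data before approval."""
    # An in-memory lookup costs microseconds; a round-trip to a disk-based
    # database could miss the approval window entirely
    home = customer_home_city.get(event["customer_id"])
    suspicious = home is not None and event["atm_city"] != home
    return {**event, "home_city": home, "suspicious": suspicious}

enriched = score_withdrawal(
    {"customer_id": "C123", "atm_city": "Lagos", "amount": 900})
```

The design point is that the enrichment data is already co-resident with the event stream, so the decision (approve or flag) can be made before the withdrawal completes rather than reconciled afterwards.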
Be sure you consider Apama when you look into CEP for your enterprise. It rapidly correlates, aggregates and detects patterns across large volumes of fast-moving data from multiple sources, so you can take the right action in real-time.