(Level: BEGINNER)
You might not have wanted to read the deeply detailed docs from the previous days, and even if you did, it can be good to revisit the fundamentals.
In 2016 the Forrester Wave™ on Big Data Streaming Analytics described the concept of Streaming Analytics as follows:
“Software that can filter, aggregate, enrich, and analyze a high throughput of data from multiple, disparate live data sources and in any data format to identify simple and complex patterns to provide applications with context to detect opportune situations, automate immediate actions, and dynamically adapt.”
Apama has been a market leader since its inception in 2000. A single-sentence description of the fundamental Apama technology is:
“Apama is an in-memory system for Complex Event Processing (CEP) and Streaming Analytics based on a specialist Domain-Specific Language (DSL) and a high-performance language runtime.”
Let’s make that concept concrete. The core technology is a real-time event processing engine that we call the “correlator” (short for “event correlation engine”), and our DSL is the Event Processing Language (EPL for short).
When we talk about getting data in and out of the correlator, the verbs we use are “inject” when talking about EPL being added to the correlator, and “send” and “receive” when talking about events flowing through the correlator.
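To make those verbs concrete, here is a minimal EPL sketch (the event and monitor names here are illustrative, not from the product). The monitor file would be injected into the correlator, Temperature events would be sent in, and any resulting Alert events could be received by a client subscribed to the output channel:

```
// Illustrative event definitions
event Temperature {
    string sensorId;
    float value;
}

event Alert {
    string sensorId;
}

// This monitor is "injected" into the correlator; Temperature events
// are "sent" in, and Alert events can be "received" by subscribers
// of the "alerts" channel.
monitor TemperatureWatcher {
    action onload() {
        // Fire for every Temperature event whose value exceeds 100.0
        on all Temperature(value > 100.0) as t {
            send Alert(t.sensorId) to "alerts";
        }
    }
}
```

Don’t worry about the details of the listener syntax yet; the point is simply how inject, send, and receive map onto EPL, events in, and events out.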
Fun historical fact: in the very early days the correlator was simply called the engine. This will make sense when we talk about some of the CLI tooling later.
This is Day #4 (bonus weekend post) of a short series of brief tips, tricks, hints and reminders of information relating to the Apama Streaming Analytics platform, both from Software AG as well as from the community.
This series of articles will be published Monday-Friday only, with the occasional weekend bonus.