Real World Streaming Analytics

Streaming analytics has been a hot term in the business world for the past few years, as companies seek to harness, and add value from, the inexorably growing volume of data they generate and receive in today's digital, information-driven world. Critically, we are moving to a world where real-time analysis matters ever more, versus conventional static analysis, for maintaining a competitive edge. Delivering near-immediate insight is now the expectation.

In many ways, it's an analogy for modern life: continuously trying to make sense of a non-stop barrage of information designed to assault our senses. Just as our brains take that as a challenge, filtering out the noise, processing the salient details from multiple sources and making decisions every millisecond, we can do the same with our modern tools.

Apama's Community Edition allows SMEs and individuals to take control of streaming data too, and to make judgements and decisions that add value. By using streaming APIs, whether publicly available ones or those developed for a bespoke purpose of your own, the tools Community Edition provides allow you to quickly develop your own analytics application.

There are many good articles and posts online already describing the big-picture view of streaming analytics, so instead, let's look at some examples of the types of application we can build on the technology.


In a high-volume business, analytics could be used to drive promotions and management decisions in real time. If you watch 'The Apprentice' in the UK, you'll have seen Alan Sugar set a task in which the contestants purchase wholesale items, sell them and potentially reinvest the profits in purchasing the more popular items again for resale. Sugar, rather vividly, calls this strategy 'smell what sells'.

Using a streaming analytics application, one could automate such a strategy for an online store: analyse what's currently popular, then instantly generate and push out promotions, capitalise on upselling opportunities with bundles, and signal stock management so that supply and demand are balanced efficiently on the fly.
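As a minimal sketch of the 'smell what sells' idea, the following tracks sales over a sliding time window and flags items that are suddenly selling fast. The class name, window size and threshold are invented for illustration; a real application would feed this from live order events and hand the result to a promotions system.

```python
from collections import deque, Counter

class SmellWhatSells:
    """Hypothetical sketch: flag trending items over a sliding window.

    window_secs and threshold are illustrative parameters, not part of
    any real store API.
    """

    def __init__(self, window_secs=3600, threshold=10):
        self.window_secs = window_secs
        self.threshold = threshold
        self.sales = deque()  # (timestamp, sku) pairs, oldest first

    def record_sale(self, timestamp, sku):
        """Record one sale and return the SKUs currently trending."""
        self.sales.append((timestamp, sku))
        # Evict sales that have fallen out of the time window.
        while self.sales and self.sales[0][0] <= timestamp - self.window_secs:
            self.sales.popleft()
        counts = Counter(s for _, s in self.sales)
        # Promote anything that crossed the threshold inside the window.
        return [s for s, n in counts.items() if n >= self.threshold]
```

Because old sales age out of the window, a promotion fires only while an item is genuinely hot, not on lifetime totals.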


We can utilise the proliferation of content produced by social media users in many ways. Companies could monitor opinion and sentiment around their products using the many available APIs, such as those Twitter offers. In many ways, the only limit to what is possible through social media analysis is the imagination. Based on what's trending, one use could be to build a sentiment engine, examining data on live events and utilising 'on the ground' reports to analyse what people feel. You could even automate news reporting to some degree, by seeing what people are talking about and combining the processed information into articles or value-added posts.
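To make the sentiment-engine idea concrete, here is a toy sketch that scores posts against a tiny hard-coded lexicon and folds them into a running net-sentiment figure. The word lists are invented for the example; a real engine would use a proper NLP library and a live social-media feed rather than this.

```python
# Toy lexicon, invented for illustration only.
POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "awful", "broken"}

def score_post(text):
    """Return +1 per positive word and -1 per negative word in the post."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def rolling_sentiment(posts):
    """Fold a stream of posts into a running net-sentiment total."""
    total = 0
    for post in posts:
        total += score_post(post)
        yield total
```

The generator mirrors the streaming shape of the problem: each incoming post updates the figure immediately, rather than waiting for a batch job.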


The Internet of Things is intertwined with this world. An ever-growing number of embedded devices now provide us with statistics and specifics, intended to be consumed intelligently. How we process this data to provide targeted understanding is the challenge, and one that can be solved through intelligently engineered streaming analytics applications.

One simple use case, amongst the multitude of 'smart home' scenarios that have been written about extensively, is in health. For instance, some diabetics currently use a continuous glucose monitor (CGM), a device which provides a readout of their current glucose level and can also work in conjunction with a pump to regulate insulin. Taking this a few steps further, a number of other devices could (as technology allows) be hooked up to a smart hub that monitors a variety of physiological factors and suggests responses as appropriate.
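A hub like that reduces to classifying each reading and alerting on state changes. The sketch below does this for a stream of glucose readings; the thresholds and values are invented purely for the example and are not medical guidance.

```python
# Illustrative thresholds only; not medical guidance.
LOW_MMOL, HIGH_MMOL = 4.0, 10.0

def classify_reading(mmol_per_litre):
    """Map one CGM reading onto a simple alert level."""
    if mmol_per_litre < LOW_MMOL:
        return "low"
    if mmol_per_litre > HIGH_MMOL:
        return "high"
    return "ok"

def alerts(readings):
    """Emit an alert only when the classification changes between readings."""
    previous = None
    for value in readings:
        state = classify_reading(value)
        if state != previous and state != "ok":
            yield (value, state)
        previous = state
```

Alerting only on transitions, rather than on every reading, is the key streaming pattern here: it keeps the downstream consumer from being flooded while a condition persists.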


One obvious use is in technologically driven sport, the pinnacle of which is Formula 1 motor racing. Interestingly, typical analysis appears to be focused mainly on managing the systems on an individual car via its sensor data. This, whilst of critical importance for reliability and optimising performance, could be just the start of the story. Where this could head is into multi-layered race strategy. In the past year, F1 has opened up team radio completely to all teams, so streaming data could include audio from every team. By utilising multiple streams, a strategy 'war' could take place using vocal analysis. The team that can best analyse its competitors' broadcasts gives itself a competitive edge when it comes to pit-stop strategy, or when to use a competitor's tyre condition to its advantage.

But it's not just traditionally tech-focused sports that will benefit. Other competitive team sports could easily utilise this technology, taking a similar first step to where F1 is today by monitoring the 'machine': the human body. Monitoring players' health and fitness in real time could also inform tactical decisions, allowing pragmatic judgements rather than letting emotion drive opinion. A midfield general may tell his coach he is unwilling to be substituted, intent on carrying on through sheer will, while the data shows that fatigue is clouding his mind.

Such decisions could be the difference between glory and despondency.


For the day trader, harnessing streaming APIs can also provide opportunities that weren't open to the mainstream just a few years ago. Trading between crypto-currencies is one such opportunity that can easily utilise such data, with fees dwarfed by those of conventional equity trading, whose costs make automated strategies more prohibitive for the individual. Making your own algorithmic trading engine is now a realistic option.
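As a flavour of what such an engine might do, here is a minimal moving-average crossover sketch over a price stream. The window lengths are arbitrary, no real exchange API is shown, and this is not trading advice; it only illustrates turning a stream of prices into a stream of signals.

```python
from collections import deque

def crossover_signals(prices, fast=3, slow=6):
    """Yield 'buy' when the fast moving average crosses above the slow one,
    'sell' when it crosses below, and None otherwise."""
    fast_w, slow_w = deque(maxlen=fast), deque(maxlen=slow)
    prev_diff = None
    for price in prices:
        fast_w.append(price)
        slow_w.append(price)
        if len(slow_w) < slow:
            # Not enough history for the slow average yet.
            yield None
            continue
        diff = sum(fast_w) / fast - sum(slow_w) / slow
        if prev_diff is not None and prev_diff <= 0 < diff:
            yield "buy"
        elif prev_diff is not None and prev_diff >= 0 > diff:
            yield "sell"
        else:
            yield None
        prev_diff = diff
```

Signalling on the *change of sign* of the difference, not its current value, is what makes this a crossover strategy rather than a simple threshold.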


The use cases mentioned are just the beginning: a minute fraction of the possibilities this technology enables. An important point is that whilst the stories around streaming analytics focus on large, high-volume data sets analysed at very high frequencies, the same principles can be applied at much lower rates to extract data that is valuable and critical for decision making.

Indeed, this is analogous to Apama itself and the value it offers. Whilst Apama is well known for its ability to handle data with incredible throughput, EPL, the Event Processing Language in which Apama applications are written, is in itself a very natural way to express such strategies, regardless of the throughput requirements. With the recent addition of the connectivity plug-in framework, connecting to external sources has never been easier.

Thus, as well as scaling up as needed for demanding workloads, Apama can be scaled down to your own needs, and being able to easily translate your requirements into a language intended for the purpose makes you more productive.

So go ahead and download the Community Edition, come up with something amazing, and let us know what you've done with it!