Issue 1, 2016
Explore how customers use big data analytics to audit Mainframe applications to optimize performance, protect sensitive information and prevent fraud.
Big data analytics
It’s time to think differently about how you approach the opportunities and risks of doing business today. Data now flows at high velocity from a vast variety of sources, and it carries valuable insights that must be detected and acted upon immediately.
Big data analytics opens new opportunities to greatly increase the value of your big iron’s transactional data. Most Adabas & Natural systems on the Mainframe or Linux®, UNIX® and Windows® (LUW) already contain high-value data and conduct mission-critical transaction processes. By leveraging big data analytics, you can monitor fast-moving operational data from multiple sources, detect patterns and take action immediately to optimize performance, protect sensitive information and prevent fraud.
Figure 1: Optimize performance, protect sensitive information and prevent fraud with big data analytics.
Big data analytics correlates, aggregates, filters and queries large volumes of fast-moving data from multiple sources to make intelligent decisions with real-time visualization. Through streaming analytics, you can enrich real-time events, detect patterns and derive context to improve decision-making. Combine all that with an in-memory data management architecture that provides scalability and high availability with extremely low latency and you have the tools you need to take advantage of all that big iron and big data have to offer.
Let’s explore how two Software AG customers are using big data analytics to audit their Mainframe applications, increasing the value of their core data and providing actionable intelligence from data that is always changing.
Driving innovation in government services
Managing the systems of a large government organization is no easy task. The IT department of the largest city in the U.S., a long-standing customer of Software AG, knows it must continuously enhance system performance in order to offer better services to its clients: a vast network of 120 agencies, boards and offices serving more than 8 million residents, 300,000 employees and 230,000 businesses every day, plus 50 million visitors each year.
Tasked with driving innovation, this department is responsible for:
- Modernizing government technology
- Increasing digital literacy opportunities among its citizens
- Facilitating a more transparent and open government
- Creating innovative partnerships with today’s leaders in technology to improve IT infrastructure, service delivery and civic engagement
To successfully execute against its responsibilities, this IT department must:
- Monitor applications and infrastructure to ensure optimal performance
- Deliver continuous availability of its systems
- Guarantee security of its systems
- Address challenging audit requirements
To meet these challenges, the IT department implemented a big data analytics solution that combines Software AG’s Digital Business Platform with current software monitoring features. The solution enables the IT department to visually monitor a complex multi-platform environment that spans across multiple agencies from a single location.
How it works
Unlike traditional performance monitoring that only provides insights into a single database at a time, big data analytics captures event data from multiple data sources in real time. These streams of events are analyzed against unique, multi-dimensional filtering mechanisms that quickly detect sought-after patterns defined by the IT department. The events are visualized on an interactive dashboard that provides real-time insights, triggering appropriate responses and action.
Figure 2: Big data analytics capabilities.
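The multi-dimensional filtering described above can be sketched in a few lines. The sketch below is illustrative only; the event fields, filter names and thresholds are assumptions, not Software AG's actual API:

```python
from dataclasses import dataclass

# Hypothetical event record; field names are illustrative.
@dataclass
class Event:
    source: str      # e.g. "db-A", "broker-1"
    kind: str        # e.g. "command", "error"
    value: float

# A multi-dimensional filter: every predicate must match for the
# event to count as a sought-after pattern.
def make_filter(**predicates):
    def matches(event):
        return all(fn(getattr(event, name)) for name, fn in predicates.items())
    return matches

# Example pattern: a database source issuing an excessive command burst.
excessive_commands = make_filter(
    kind=lambda k: k == "command",
    value=lambda v: v > 1000,        # threshold chosen for illustration
)

stream = [
    Event("db-A", "command", 250),
    Event("db-B", "command", 4200),  # matches the pattern
    Event("db-A", "error", 1),
]
alerts = [e for e in stream if excessive_commands(e)]
```

In a production deployment the matched events would feed the dashboard and alerting layer rather than a Python list, but the principle is the same: predicates defined by the IT department run continuously against events from every source.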
To identify potential problems with database or event service bus performance, these KPIs are monitored and displayed using dashboard alerts:
- Excessive commands issued to the database
- Database activity and usage trends
- Database resource health
- Unusual server activity
- Transactions and messages on servers for potential runaway jobs
- Server active or inactive status
The application monitoring tool measures the response time of the distributed applications and identifies error situations. Each call to a selected service by a client application is monitored and measured for the overall service response times, the network transport times, the broker process and wait times, the Remote Procedure Call (RPC) server processing times, and the time spent for database calls.
Each Software AG enterprise product involved in the call appends its measured time(s) to the service call. When the call returns to the client, the client RPC runtime provides the event data to an application monitoring data collector. The information is then combined into application monitoring reports that are generated every 60 seconds and automatically consumed by the streaming analytics component (Apama) of the Digital Business Platform, which displays the results and raises alerts.
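The aggregation step can be sketched as follows. This is a minimal illustration, assuming four timing segments per call; the segment names and report schema are hypothetical, not the collector's actual format:

```python
from collections import defaultdict

# Hypothetical per-call timing segments (milliseconds), mirroring the
# measurements named in the text.
SEGMENTS = ("network_transport", "broker_wait", "rpc_server", "database")

def build_report(calls):
    """Aggregate per-call segment times into one monitoring report."""
    totals = defaultdict(float)
    for call in calls:
        for seg in SEGMENTS:
            totals[seg] += call[seg]
    n = len(calls)
    report = {seg: totals[seg] / n for seg in SEGMENTS}  # mean per segment
    report["overall_response"] = sum(report[seg] for seg in SEGMENTS)
    return report

# Two calls observed during one 60-second window.
window = [
    {"network_transport": 4.0, "broker_wait": 1.0, "rpc_server": 10.0, "database": 5.0},
    {"network_transport": 6.0, "broker_wait": 3.0, "rpc_server": 14.0, "database": 7.0},
]
report = build_report(window)  # mean segment times plus overall response
```

Breaking the overall response time into segments this way is what lets the dashboard point to the slow hop (network, broker, RPC server or database) rather than merely reporting that a service is slow.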
Software AG’s Apama Streaming Analytics Platform monitors the rapidly moving event streams, detects and analyzes important events and patterns of events, and immediately acts on events of interest. It combines event processing, messaging, in-memory data management and visualization to help the department put real-time data in context to make intelligent decisions.
Event processing differs from traditional applications: rather than continuously executing a sequence of instructions, an event-based system listens for relevant events and responds to each asynchronous signal as soon as it happens. In this way, the response is as close to real time as possible.
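The event-driven model above can be reduced to a registry of handlers that run only when a matching event arrives. This is a bare sketch of the pattern, not Apama's EPL; the event names and handler shapes are invented for illustration:

```python
# Minimal event-driven dispatcher: handlers are registered per event
# type and execute only when a matching event is emitted, rather than
# as a fixed sequence of instructions.
class EventBus:
    def __init__(self):
        self._handlers = {}

    def on(self, event_type, handler):
        self._handlers.setdefault(event_type, []).append(handler)

    def emit(self, event_type, payload):
        # Nothing runs until an event actually arrives.
        for handler in self._handlers.get(event_type, []):
            handler(payload)

bus = EventBus()
alerts = []
bus.on("db.slow_response", lambda p: alerts.append(f"ALERT: {p}"))

bus.emit("db.slow_response", "response time 950 ms on db-A")
bus.emit("db.heartbeat", "ok")   # no handler registered: ignored
```

A real streaming platform adds correlation, windowing and persistence on top of this loop, but the inversion of control is the essential difference from batch-style processing.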
With big data analytics in place, measurements of the predefined KPIs are visualized on a dashboard, and SMS messages and emails are sent to quickly alert the IT department to performance issues.
Figure 3: Big data analytics identifies threats so action can be taken in real time.
Through active 24/7 monitoring and email alerts, the IT department can proactively resolve problems before end users even notice them. In other cases, the IT department has advance notice of an impending failure, so recovery steps can be taken more quickly.
Visualizing the performance of all the system components through a graphical dashboard increases the efficiency of identifying performance issues. The graphical dashboard leverages drop-down menus to monitor all components from a single location, eliminating the need to log on, check the health and log off of multiple environments. This greatly reduces the likelihood of missing an environment health check and frees up to 10 man-hours per week.
Executing against new and changing business rules
Another state agency in the U.S. also turned to event analytics to tackle fraud and address changing regulatory requirements in health and human services. This agency has long relied on two Mainframe applications that manage child support and public assistance. These applications are very stable but every year business rules change, new business rules are added and integration with different applications is required. Rather than buy or develop new applications to address these needs, the department turned to Software AG’s big data analytics.
Big data analytics gives this state agency a mechanism to modernize its applications and meet the regulatory and user demands for a fraction of the cost to replace them. Since all the data needed already exists, the solution streams the applicable data to Software AG’s Apama Streaming Analytics Platform where business users are provided modern dashboards with graphical data representations to productively perform their job.
The state agency is tasked with identifying unauthorized access to child support case data in near real-time and reporting statistical and financial information about their state-administered Child Support Enforcement (CSE) program to the U.S. Federal Government’s Department of Health and Human Services (HHS). This information is used by the Secretary of HHS to comply with sections 409, 452(a) and (g), 458, and 469 of the Social Security Act. This act requires the Secretary to establish standards for an effective CSE program, to establish minimum organization and staffing requirements, and to make an annual report to the Congress on program activities.
KPIs trigger fraud alerts
Apama Streaming Analytics uses Internet of Things (IoT) queries to continuously analyze streaming CSE data and provide predictive data in real time. The state agency uses this data to monitor unauthorized access to case data as well as user and case access thresholds.
To identify fraud, the following KPIs were identified and measured:
- Unauthorized access by user
- Unauthorized access by case
- Latest unauthorized access violations
- Most accesses by case number
- Most accesses by user
- Detailed report by month
- Worker case load
- Child support balance
If a particular case was viewed 20 times in the past hour, which is not normal access activity, the access would be flagged as potential fraud and prompt a quick investigation.
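The threshold check in this example amounts to counting views per case over a sliding one-hour window. A minimal sketch, assuming per-view timestamps in seconds; the class and method names are hypothetical:

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 3600   # one hour
THRESHOLD = 20          # views per case per hour, per the example

class CaseAccessMonitor:
    """Flags a case viewed THRESHOLD or more times within a
    sliding one-hour window. Illustrative sketch only."""
    def __init__(self):
        self._views = defaultdict(deque)  # case_id -> view timestamps

    def record_view(self, case_id, ts):
        window = self._views[case_id]
        window.append(ts)
        # Drop views that fell out of the one-hour window.
        while window and ts - window[0] > WINDOW_SECONDS:
            window.popleft()
        return len(window) >= THRESHOLD   # True -> flag for investigation

monitor = CaseAccessMonitor()
flagged = False
for i in range(20):                       # 20 views, one per minute
    flagged = monitor.record_view("case-42", ts=i * 60)

# The same 20 views spread over more than two hours do not trigger.
spread = CaseAccessMonitor()
slow_flag = False
for i in range(20):
    slow_flag = spread.record_view("case-7", ts=i * 400)
```

The sliding window is what distinguishes a genuine burst of access from the same volume of legitimate activity spread over a working day.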
Big data analytics helps this state agency proactively identify cases of fraud and collect accurate, predictive information based on current and historic CSE data.
Figure 4: Identifying unauthorized access to case data with big data analytics.
Using business intelligence to provide insights into data thresholds enables staff to act on violations proactively. Identifying fraudulent access in real time will improve relations with citizens and reduce dissatisfaction caused by slow responses.
Complying with Federal 156 reporting requirements using accurate, predictive data will help the state more quickly receive the money it needs from the Federal CSE budget to fund the personnel and resources required for CSE enforcement. The information submitted to HHS is also used to evaluate state performance in running its CSE program and determine appropriate incentives.
What if you could stop fraud before it happens? Monitor all of your data streams through automation—real-time incoming alerts from smart meters, database events and log data, inconsistencies between consumption and billing, changes in consumption patterns compared to historical levels and processes associated with investigating questionable service and security levels. Then, by integrating these multiple streams in a way that allows real-time comparison and benchmarking, you provide your employees the tools to identify fraud sooner, protecting your infrastructure as well as your revenue.
With Software AG’s big data analytics and Mainframe integration technologies, you now have the ability to make business decisions on real-time data from multiple sources including your transactional data in Adabas & Natural applications. Through interactive, self-service dashboards you can view real-time insights into critical aspects of the business using your mobile device, your laptop or any other platform you choose. And more importantly, you can automate intelligent actions to take place when certain criteria or thresholds occur.
Embrace big data today. As one of the four forces reshaping expectations and possibilities, alongside mobile, social and cloud, big data is an evolution whose opportunities you cannot ignore.