
Democratizing Data for System Health

DataLab@Maveric is a hands-on, scenario-based learning initiative for honing real-world domain and technology skills via sandbox experiments that often result in pragmatic innovations.

By the end of 2020, it is estimated that 1.7 MB of data will be created every second for every person on the planet. That is a staggering 2.5 quintillion bytes of data each day. Take a moment to wrap your mind around that number: a quintillion is a thousand raised to the power of six (18 zeros). In fact, if that many pennies were laid out flat, they would cover the earth five times.

In today’s ‘data-is-the-new-oil’ economy, the pressures on data and technology teams globally arise from several critical demands:

  • Real-time event response (business events demand immediate attention),
  • Data distribution (many users need to address many use cases across many systems),
  • Asynchronous communication (capture data as it is created, but allow applications to consume it at their own pace),
  • Parallel consumption (multiple parties need copies of the same data for different uses), and
  • Service modularity (a microservices architecture ensures that one service does not depend on another)

Information pattern

At Maveric DataLab, as part of an experiment with Apache Kafka, the team conducted deep dives into how data can be shared, re-used and secured without being restricted. What followed was a startling insight and an opportunity. Eventually, after multiple trials and rounds of feedback, the Maveric DataTech team arrived at a unique way to truly democratize data.

Democratizing system health information means producing a continual assessment of applications, ensuring an uninterrupted supply of services to users, and managing application performance at optimal service levels.

In any secure enterprise application, access control is handled by IT, and raw information is difficult to obtain except for power users who receive the data streams and are able to comprehend them. That information is then used to create customized dashboards that provide different views for different user sets.

For the experiment to yield efficient results, an open, scalable, and extensible architecture is needed to address challenges in the information patterns. The Maveric DataTech team’s proof of concept uses Kafka and the Apache big data stack to capture and process high volumes of data from disparate systems quickly and in real time.

After calibrating and deploying the solution, the team reported increased value realization across business events.

Maveric DataLabs Solution Description

The team developed an information sharing platform called ASAP (Active System Analyzer and Predictor). It collates system and application health data from multiple application components. The collated metrics are then made available through the pub-sub model on Kafka, and content-based filtering enables contextual analysis of the data (a minimal sketch of this flow follows the component list below).

The components of ASAP are:

  • Corpus, which collates system health information from different environments using several protocols
  • Provenance, built on a Kafka cluster that maintains messages classified into topics of contextual importance
  • Ambit, which consumes messages from Provenance and creates dashboards to visualize the system metrics
  • Sphere, a system for contextual analysis of the derived metrics, tuned to address various monitoring and alerting scenarios
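
A minimal sketch of this pub-sub flow, in the spirit of Corpus publishing to a Provenance topic and Ambit consuming it, might look like the following. The broker address, topic name, metric fields and threshold are illustrative placeholders, not the actual ASAP configuration.

```python
# Illustrative sketch only: broker, topic and payload fields are hypothetical,
# not the actual ASAP configuration.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"            # assumed local Kafka broker
TOPIC = "system-health.cpu"          # hypothetical topic of contextual importance

# Corpus-like producer: publish a health metric as a JSON message
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda m: json.dumps(m).encode("utf-8"),
)
producer.send(TOPIC, {"host": "app-server-01", "cpu_pct": 87.5, "ts": 1718000000})
producer.flush()

# Ambit-like consumer: read metrics from the topic to feed a dashboard or alert
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)
for record in consumer:
    metric = record.value
    if metric["cpu_pct"] > 80:       # simple threshold standing in for Sphere's analysis
        print(f"ALERT: high CPU on {metric['host']}: {metric['cpu_pct']}%")
```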

For detailed information, please get in touch with us at data@maveric-systems.com


Digital Banking Will Be The Future Of Banking Post Corona Pandemic

In the last few years, banking as an industry has seen a massive move towards digitization. Traditional banks are challenged by new-age, digital-only banks that rely on replacing the traditional banking experience with a hyper-personalised, digital-first approach. In addition, banks need to boost their Return on Equity and bring down their Cost to Income ratio in order to stay competitive. Banks have also been facing threats from new entrants such as Google, Amazon and other technology companies looking to enter this space. COVID-19 has accelerated some of these trends, such as changed customer behaviour and the adoption of newer tools and technologies by banks.

Surveys have observed a rapid increase in customer reluctance to visit branches and a greater willingness to try newer tools to meet banking needs. Banks have been closing branches globally at an unprecedented scale: Citigroup closed about 100 branches, and JP Morgan, the largest bank in the US, closed about 1,000 branches in the immediate aftermath of the pandemic.

Customers now expect banks to maximize digital interactions and come up with digital alternatives for their day-to-day banking needs as they are now more open to trying out a new app than they were before the COVID-19 pandemic.

Digital Banking – Way Ahead

In the post-COVID world, digital banks will need to replace many of a customer’s existing transactions and interactions with the bank through improved use of technology.

Customer Experience
To achieve this, they will need to build a robust, scalable technology platform focusing on the following components:

Omni Channel

Banks need to ensure that their channels are streamlined to deliver seamless and superior customer journeys across channels. Traditionally, banks have had siloed channels with customized workflows and support. This approach is inherently inefficient and leads to broken customer journeys, staff and customer dissatisfaction, and increased costs. To achieve a true omni-channel experience, banks have to re-engineer their platforms to be digital-first. Workflows, customer journeys and experiences should be orchestrated through a central hub and then distributed to individual channels.

Mobile banking will be at the heart of omni-channel banking soon. An added advantage of mobile banking is that it can offer near real-time communication avenues to banks. Also, access and authentication can be handled through a mobile’s in-built security mechanism. Banks should focus on generating omni-channel experiences that are mobile friendly and can be repurposed across other channels.

Modular Banking

Customers expect increased dynamism at the front end. However, legacy bank systems are monolithic in nature, leading to delays in implementing changes, increased time to market for new products, and unpredictable outcomes.

To address this, banks must focus on decoupling existing monoliths to begin with. However, this alone will not be enough for banks to compete with digital-only banks. Banks will need to adopt a platform that is “digital-first” at its core. This means breaking functionality down into smaller components that can be combined to alter processes and products as needed.

Having a truly “digitally advanced” platform will allow P&L owners within the bank to design, and technology and engineering functions to develop and deliver, new products and services rapidly.

Open Banking

Traditionally, banks were not required to share data with competitors or other service providers. Some banks used their data to improve their services and products, but there was no obligation to share this data with third parties or competitors. Open Banking and PSD2 have changed this in Europe, and it is now likely that regulators worldwide will follow their European counterparts and require banks to give third parties and competitors access to customer data, based on the customer’s consent.

Open Banking will create opportunities for banks that are open to alliances with Fintechs and other third-party products, thus offering customers an end-to-end experience. Banks will need to open up their APIs and tap into third party capabilities to dramatically improve customer experience and build deep, lasting relationships with their customers.

Intelligence Driven

Traditional approaches to customer service, cross-selling and recommending products to customers have relied on a “one-size-fits-all” approach. With greater scale, wider financial penetration and reduced in-person customer interaction, there is a clear opportunity for banks to use data to personalize customer experiences, recommendations, and services.

By integrating different kinds of customer data (demographic, transaction, interaction, behaviour, application usage and so on), banks can create unique experiences for each customer by leveraging technologies such as cognitive computing, machine learning and natural language processing.
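
As a rough illustration of how such signals might be combined, the sketch below joins hypothetical demographic, transaction and usage features and clusters customers into segments with scikit-learn. The column names, sample values and segment count are assumptions for illustration, not an actual bank model.

```python
# Illustrative sketch only: column names, sample data and cluster count are assumptions.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical customer features combining demographics, transactions and app usage
customers = pd.DataFrame({
    "age":           [23, 45, 31, 60, 38],
    "monthly_spend": [820, 2400, 1500, 600, 3100],
    "app_sessions":  [42, 5, 20, 2, 55],
    "branch_visits": [0, 3, 1, 6, 0],
})

# Scale features so no single attribute dominates the distance metric
scaled = StandardScaler().fit_transform(customers)

# Group customers into segments that can each receive tailored offers
customers["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(customers)
```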

In the post-COVID world, customers will need to feel unique and cared for even when they interact with banks over digital channels.

Digital Banking experience will be a game changer for enhancing customer satisfaction post-COVID. Banks need to take a holistic view and invest across all pillars of digital banking to retain existing customers and acquire new ones.

(The author is the Vice President at Maveric DataTech, based in India with an experience of 18 years in Management Consulting, Data Analytics and Digital Transformation. He is currently responsible for the Advanced Analytics practice at Maveric DataTech)

This was originally published on Outlook India website and is being reproduced here.


How Maveric Systems is using AI to find clients

While the onset of the Covid-19 outbreak has transformed how organizations operate, accelerating digital transformation is something every corporate leader now believes in.

In a conversation with ETCIO, Muraleedhar Ramapai, Executive Director (Data) at engineering services firm Maveric Systems, busts some myths about traditional ways of working, leveraging data analytics and much more.

Ramapai believes that the good-old theory of human work motivation and management, which assumes the typical worker has little ambition, avoids responsibility, and is individual-goal oriented, is simply not true.

The senior management at Maveric had early discussions on possible productivity snags and remedial counter-measures. To their surprise, the leadership found that when goals are clear and projects do not require elaborate white-boarding, productivity actually increased while working from home.

In terms of capacity-building approaches, Ramapai feels that organizations will not be in a hurry to build, say, a 200-member team under one roof.

“Irrespective of how great an impact Agile methodology has had over the Industry, I feel, it has done a disservice to one of the most admirable achievements of human collaboration – the ‘Open Source movement’. Moreover, across organizations, future workflow examinations will include rethinking the importance given to face to face collaboration. Pre-lock down quite a few companies practiced 4:1 rhythm (work from office: work from home). I will not be surprised if we see that ratio inverted post lockdown,” he maintained.

Fueling growth with data analytics

It is imperative for any enterprise to acquire clients for growth. To pursue this, Maveric Systems is now leveraging advanced analytics and data engineering to derive the nuances of the voice of its clients’ customers.

“We listen keenly for insights, we understand the customer’s pain points, and we spot features and offers which give our clients the competitive edge. Let me describe this in a little detail,” said Ramapai.

Before Maveric approaches its clients, it works out the many parts of this equation, via advanced analytics.

At the outset, the company employs a three-way analytic assurance framework. It starts by using automation to collate bank customers’ historical feedback from qualified public-domain and social media sources and to separate out key themes using NLP and machine learning.

“The key dissatisfiers and delighters are validated through our detailed engineering analysis of the channel technologies. First-hand experience of the channels is scientifically analysed to complete the third leg of analytics,” he added.
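
A minimal sketch of the first leg, separating key themes out of collected customer feedback, could look like the following. The sample comments, vectorizer settings and topic count are illustrative assumptions rather than Maveric's production pipeline.

```python
# Illustrative sketch only: the feedback snippets and topic count are invented for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

feedback = [
    "The mobile app keeps crashing when I try to pay my credit card bill",
    "Branch staff were helpful but the queue took over an hour",
    "Love the new app interface, transfers are much faster now",
    "Could not reset my online banking password, support never called back",
]

# Turn free-text feedback into TF-IDF features, dropping common English stop words
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(feedback)

# Factorize into a small number of latent themes (dissatisfiers / delighters)
nmf = NMF(n_components=2, random_state=0)
nmf.fit(tfidf)

terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(nmf.components_):
    top_terms = [terms[j] for j in weights.argsort()[-5:][::-1]]
    print(f"Theme {i}: {', '.join(top_terms)}")
```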

According to Ramapai, the Chennai-based Maveric Systems’ core belief is that data analytics should ultimately be tangibly usable and converted into actionable decisions.

AI for finding suitable candidates

Continually finding appropriate talent and making sense of the changing workforce trends is high on Maveric’s agenda. The company is experimenting with ML and AI to locate appropriate candidates for various roles.

Initial screening is being done by machines and data. Resumes are scanned, and relevant technical assessments are then served over the cloud for candidates to take and submit for evaluation. However, this initiative is currently in its infancy.

“Joining the dots for a robust and fool-proof talent recruitment program will happen as we replace the ‘all-powerful human-selection-mindset’ with objective decision-making models on AI, ML platforms. The expectation is that algorithms will enable us to complete role fitment through situational analysis,” added Ramapai.
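
One simple way such machine-led screening could work is sketched below, scoring resume text against a role description with TF-IDF cosine similarity. The role text, resume snippets and shortlisting cut-off are hypothetical, not Maveric's actual criteria.

```python
# Illustrative sketch only: the role description, resume snippets and cut-off are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

role = "Data engineer with Kafka, Spark and cloud data pipeline experience"
resumes = [
    "Built streaming pipelines on Kafka and Spark, deployed on AWS",
    "Five years of retail branch banking and customer service",
]

# Vectorize the role description together with the resumes
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([role] + resumes)

# Rank resumes by similarity to the role; only the closest ones move to assessment
scores = cosine_similarity(matrix[0:1], matrix[1:]).flatten()
for resume, score in zip(resumes, scores):
    decision = "shortlist" if score > 0.2 else "reject"
    print(f"{score:.2f}  {decision}: {resume[:50]}")
```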

Cloud Computing

According to Ramapai, all of the company’s computing and storage needs are federated and distributed. The systems are built to scale up by up to 25% of their capacity at any time, as requirements demand. Maveric Systems embraced the cloud as a natural evolution strategy well before COVID took over.

Talking about application modernization plans for data centres, Ramapai said, “Luckily for us, we have been in a rather modernized (low-legacy) position. Most of our corporate applications (like code repositories etc.) are ‘SAAS-ified’ and already hosted on the cloud. Some labs were local, for which remote access was provided. We took an internal decision that future capacity enhancements (be it data-storage or compute) for R&D purposes, will happen only on the cloud.”

This was originally published on Economic Times website and is being reproduced here.


A culture of accurate data leads to precise decisions

Whether we realize it or not, our lives run on data.

Every time we buy something, pick entertainment options and vacation spots, make friends on social media, recommend products and services, find dating partners, apply for jobs, or locate houses to move into, we are deciding. And these decisions are influenced by the innumerable data algorithms and recommendation engines that silently trawl our online personas, historical decisions, past behaviours, and social preferences to churn out our likely future choices.

In 2006, Clive Humby, the British mathematician and data science entrepreneur, coined the phrase “Data is the new oil”. He elaborated by saying, “Like oil, data is valuable, but if unrefined it cannot really be used. Oil has to be changed into gas, plastic, chemicals, etc. to create a valuable entity that drives profitable activity; so, data must be broken down and analysed for it to have value.”

Arguably, Clive himself would not have known until much later the towering impact of his prediction. Let us illustrate that effect with just two statistics:

  • New information generated per second for every human being is 1.7 megabytes. Considering there are over 7.7 billion people on the planet, this amounts to new information equivalent to more than 25,000 hour-long videos. Per second!
  • Today, nearly 70% of leading enterprise companies have a chief data officer (CDO), according to a study from New Vantage Partners. That’s up significantly from 2012 when only 12% did.

Everyone wants to talk about the insights and value they can derive from data. But what about Bad Data? And the price it exacts?

Bad data is inaccurate data, missing data, wrong or inappropriate data, non-conforming or duplicate data. Bad data is costly.

In 2016, IBM estimated the yearly cost of poor-quality data to the US economy at $3.1 trillion. Gartner put the average cost of poor data at $9.7 million per business per year. These are stunning figures.

So given the primacy of data in decision making and the heavy penalties of bad data (across a company’s financial resources, efficiency, productivity and credibility), let us consider what makes a data culture.

Simply put, a data driven culture moves data to the center of decision making.

It treats data as a main resource (as opposed to gut or instinct) for leveraging insights across all departments.

The best data-driven companies embed a cultural framework that rivets together the following nuts and bolts: easier data access, clear governance around data usage and definitive quality standards, improved data literacy, and finally, apt technologies to prepare and analyse data.

Next, let us examine a few ideas that mature data cultures use to achieve the clarity of purpose that enhances their effectiveness and speeds their analytical efforts to fruition.

These data culture practices are what eventually lead to precise decisions.

  • Data culture is not about cool experiments. The fundamental objective in collecting, analysing and applying data is to make better decisions. Period. Volumes in data lakes by themselves mean little if the central focus is not on solving the business problem. Always.
  • Data culture is both top driven and bottom swelled. True and repeated commitment from the board and the C-suite is non-negotiable, but it is in democratizing data that the richer gains lie. Like a staple diet, data should be missed if not consumed every day. Seeding projects, instigating new conversations, prompting innovation by applying data to challenges, imbibing data-belief, celebrating data-victories and removing data bottlenecks are all critical in sustaining a data culture.
  • Early acceptance of risk. Mature data culture organizations understand the imminent risks that come from getting analytics wrong. They accept the support and remedial costs. Punitive measures in these companies are replaced by alert systems. Higher levels of sophistication are wired into their planning, processes and practices. Ground-level structures are fortified to use data with assurance and freedom.
  • ‘Culture connectors’ are critical. To bridge the world of data scientists and that of on-ground users, a few high-credibility data ambassadors are enlisted. These change agents are catalysts-cum-coaches who help others embrace the data culture.
  • Rewiring the talent organization. It is usual practice to deliberate on touchpoints that interlace roles. Constructive data-driven engagement practices are used to bind disparate organizational jobs and functions.

In sum, today’s businesses accelerate through precise decisions when on-demand data is engineered for accuracy. This marriage of conscious contextualization and comprehensive competencies works when it is backed by intersecting domain and technology expertise.

Being able to do this, day in and day out, is what creates a robust data culture.

This was originally published on MEA-Finance.com website and is being reproduced here.


Data Labs at Maveric

Research to Results by redefining the intersections between people and technology

The Data Technology Innovation Lab is the Maveric way to solve today’s existential problems by reimagining the customer’s tomorrow-value paradigm.

Banks that capture the zeitgeist of the 2020s will be the ones driving adoption of maturing digital technology that is more commoditized and accessible, while also harnessing scientific advancements that leverage the industry’s ecosystem.

Experiments at the Maveric Data Laboratory merge our deep belief in client reliability with precocious talent across applied data sciences. As institutions scan the banking ecosystem to either seize the approaching technology shift or negotiate the veiled disruptive threat, Maveric Data Labs prepares you with a potent techno-domain edge to convert disruptions into competitive advantages faster than before.

The steps involved in Maveric-led digital transformations are neither incremental nor about finding the next ‘silver bullet’. Rather, as a contextualized plunge between blue-sky thinking and applied technologies, the experiments at the Data Innovation Lab build solutions that are agnostic to open source or commercially licensed technologies.

The modus operandi for running experiments here is straightforward.

Guided by a set of concrete business benefits, banks engage with Maveric Data Labs. The engagement can spark in any number of ways, but it needs a problem statement to be articulated at the outset in specific terms: through an intended business impact, a big-picture challenge to be transcended, a tangible human experience to be co-created, or even an agreement on bottom-line metrics as a key measure of the experiment’s success.

The problem statement is then parsed into functional and non-functional elements. Thereafter, related experiments are set up, run and packaged as proofs of concept. The presented PoCs come with appropriate choices of solution architecture and recommendations for underlying technologies (open source and/or commercially licensed).

At Maveric Data Lab, innovation missions typically work across a 2-to-8-week window on experiments that aim to put technology back into specific business situations.

A 4-to-8-member agile team of SMEs, domain experts, and technology specialists works on naturally scalable, cloud-based solutions across synthetic data (domain models and statistics) and public data (Indian market, social sites, and open data).

A few of the open projects at DataLabs@Maveric include:

  • Live streaming of market data
  • Automated valuation model for real estate
  • Ecommerce – price comparison website
  • Hotel Search & Price Comparison website
  • Loan processing solution
  • Credit Card Fraud Detection system
  • Targeted marketing based on card transactions
  • Fraud Detection in cross border financial transactions
  • Market based Analysis platform
  • Centralized Data Hub for a Bank

If you are keen to explore the B-I-G question that accelerates the next step in Research & Development across data transformations, we at Data@Maveric-systems.com are listening!

About Maveric Data Technology Practice

Backed by domain, driven by technology, and validated at each step, we are committed to accelerating your business through precise decisions using on-demand data engineered for accuracy.

 
