Big data: transforming data into information
In order to succeed in today's fast-paced business environment, leaders need timely and actionable data to make strategic decisions. Financial institutions hold vast amounts of customer data but have traditionally struggled to convert it into actionable information. Advancements in technology enable real-time reporting, easy-to-use dashboards and customer analytics.
As a result, it makes sense that leading financial institutions would look to the latest tools and techniques to further leverage their wealth of institutional data. From customer data and back-office systems to channels and sales, institutions are producing an ever-increasing mountain of data. The question is how to make full use of this information. The answer may lie in the concept of big data.
What is big data?
There are a variety of definitions for big data in the industry today, ranging from a focus on predictive analytics, artificial intelligence, machine learning and psychometric modeling to the provision of real-time analytics. Big data encompasses the ability to leverage all of an institution's data and can be defined as:
Big data is a term for data sets that are so large or complex that traditional data processing application software is inadequate to deal with them. Challenges include capture, storage, analysis, data curation, search, sharing, transfer, visualization, querying, updating and information privacy.
What drives institutions to venture into the world of big data? It's when they can no longer use traditional methods to fully leverage their data. This is usually due to one or more of the three Vs.
Volume refers to the amount of data produced by an institution. Financial institutions produce a wealth of data from all parts of the institution. This includes a significant amount from transaction processing, debit cards, loan applications, wealth management, finance, etc.
Variety refers to the different types of data an institution produces, including both structured and unstructured data. Financial institutions have a wide range of data at their disposal for use in decision-making, planning and operations, including data from the branches, online banking, mobile banking, the tellers and customer service staff, to name but a few.
Velocity refers to the speed at which data is produced, as well as the rate at which it is possible to consume it for business purposes. Financial institutions have been using data in a batch manner for many years in areas such as core and ancillary applications. Today, there is a need for real-time processing and data decisioning.
It's not necessarily any one of the above that causes the need for big data capabilities, but rather the complexity that is generated as a result. When data becomes too complex for traditional business intelligence or analytics tools and technologies to process it in a timely manner, big data capabilities are potentially the answer.
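The velocity distinction above can be sketched in a few lines. This is an illustrative toy only: the function names and amounts are hypothetical, and real streaming platforms are far more involved, but it shows the difference between waiting for a complete batch and consuming records as they arrive.

```python
# Illustrative sketch only: a batch aggregate versus a running
# (streaming) aggregate, showing the "velocity" idea of consuming
# data as it arrives rather than waiting for an end-of-day batch.
# All names and amounts are hypothetical.

def batch_average(transactions):
    """Batch style: wait for the full data set, then compute once."""
    return sum(transactions) / len(transactions)

def streaming_average(transactions):
    """Streaming style: update the aggregate as each record arrives."""
    count, total = 0, 0.0
    for amount in transactions:
        count += 1
        total += amount
        yield total / count  # an up-to-the-moment figure after every record

amounts = [120.0, 80.0, 100.0]
print(batch_average(amounts))            # one answer, only at the end: 100.0
print(list(streaming_average(amounts)))  # an answer after each record
```

The batch version answers once, after everything has landed; the streaming version has a usable answer after every record, which is what real-time decisioning requires.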
Use of big data
Most financial institutions can improve, or modernize, nearly every aspect of their day-to-day activities, yet few have come close to leveraging the wealth of data available to them. Big data offers the ability to do so, with several potential applications, including but not limited to the areas below.
Financial institutions are complex organizations with a number of channels, products and processes. Understanding the myriad dependencies and their impacts is often beyond what can be expected from traditional systems, or even from human understanding and intuition. Big data is designed to take data from disparate systems and processes and provide insights not normally available through traditional means. The applications of these insights into operations are endless. They can be used for:
- Implementing a higher degree of automation
- Optimizing transaction processing
- Improving forecasting
- Improving customer service
- Tailoring products to anticipated customer needs, based on predictive models
As customers and competitors evolve, understanding the real needs of your customers helps limit missteps, better leverage capital investments and increase return on investment. Applications can include:
- Using social media to drive product improvements
- Enhancing customer service by using predictive models to determine optimal service
- Using customer data to identify potential next-generation products
- Using internal and external data (such as social media and customer feedback) to predict customer consumption trends
- Driving the identification of potential new markets and the probability of success in each
- Improving segmentation to tailor product offerings to individual customers
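The segmentation idea in the last bullet can be made concrete with a small sketch. The field names, thresholds and segment labels below are illustrative assumptions, not a prescribed model; real institutions would fit these cut-offs from their own data.

```python
# Hypothetical sketch of customer segmentation by recency, frequency
# and balance (a simplified RFM-style scheme). Thresholds and labels
# are assumptions for illustration only.

customers = [
    {"id": "A", "days_since_last_txn": 5,  "txns_per_month": 20, "avg_balance": 9000},
    {"id": "B", "days_since_last_txn": 90, "txns_per_month": 1,  "avg_balance": 500},
    {"id": "C", "days_since_last_txn": 12, "txns_per_month": 8,  "avg_balance": 3000},
]

def segment(customer):
    """Assign a coarse segment from simple, assumed thresholds."""
    if customer["days_since_last_txn"] <= 14 and customer["txns_per_month"] >= 10:
        return "engaged"   # recent and frequent: candidates for premium offers
    if customer["days_since_last_txn"] > 60:
        return "at-risk"   # dormant: candidates for re-engagement campaigns
    return "stable"

segments = {c["id"]: segment(c) for c in customers}
print(segments)  # {'A': 'engaged', 'B': 'at-risk', 'C': 'stable'}
```

Each segment can then be mapped to a tailored offering, which is the point of the bullet above: the same raw data, bucketed differently, drives different product decisions.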
Big data can be used to drive improvements in the overall reporting process. Applications can include:
- Automating regulatory reviews
- Creating automated approval processes
- Improving real-time security, fraud and loss reporting
These are just a few of the countless ways big data can help your institution leverage its data to drive operations.
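As one concrete illustration of the real-time fraud reporting mentioned above, consider a minimal outlier rule: flag any transaction far above a customer's running average. The rule, the multiplier and the sample amounts are illustrative assumptions only; production fraud systems use far richer models.

```python
# A minimal sketch of real-time fraud flagging: report any transaction
# greater than a multiple of the customer's running mean so far.
# The multiplier and data are hypothetical.

def flag_outliers(amounts, multiplier=3.0):
    """Yield (index, amount) for amounts above multiplier * running mean."""
    total = 0.0
    for i, amount in enumerate(amounts):
        if i > 0 and amount > multiplier * (total / i):
            yield (i, amount)
        total += amount

history = [40.0, 55.0, 60.0, 500.0, 45.0]
print(list(flag_outliers(history)))  # [(3, 500.0)] - the 500.0 payment stands out
```

Because the check runs as each transaction arrives, the alert is available in near real time rather than in a next-day batch report.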
Utilizing big data
So we have the data, but how do we transform it into information? Corporate performance management (CPM) is an umbrella term for software applications that automate a business's budgeting, planning and forecasting processes, and enable companies to define, analyze and execute strategic goals relative to finances, operations, business modeling, analysis and key performance indicators. CPM can be used to:
- Decrease time spent on data collection and reconciliation, and increase time available for analysis
- Execute the institution’s strategy and objectives, and improve performance
- Articulate a clear link between valuation and shareholders' expectations and the drivers that influence profitability
- Create a fact-based decision-making culture by linking metrics, planning and decision-making
- Set targets and drive desired behaviors at each layer of the institution
- Develop and test alternative scenarios to facilitate capital planning and resource allocation
- Provide a cost-effective platform for effective analytical processes that are responsive to the institution's needs
The path to revolutionizing data analytics includes:
- Define expectations and goals to execute strategies
- Uncover and examine key elements of your business
- Develop pragmatic suggestions
- Deploy new processes and systems
Obstacles to the adoption of big data
Why haven't more financial institutions ventured into big data?
It's a paradigm shift
Although the concept of big data has been around for several years, most institutions have not adopted it as part of their core capabilities. It's a new way of thinking about how to use institutional data, and it usually requires new and cutting-edge technology. It can often result in re-engineering business processes and the way in which people perform their jobs. In a nutshell, it can be a major paradigm shift for an institution. Proper education, planning and training can make the transition into big data easier and less threatening.
Determining the best way to use big data
Being on the cutting edge of new uses of data and the latest application of technology can drive a competitive advantage for institutions, but it can also have the opposite effect. While big data capabilities offer tremendous value, care must be taken to ensure an institution knows why it's using them. Having a strategy, plan and defined success criteria for big data is an important first step and allows for an effective implementation.
Breaking down data silos
Financial institutions have a wealth of data from numerous systems and internal and external sources. Unfortunately, this data is often managed in silos, and departments can be protective of their data and its use. Educating people on what big data is, how the data is used and the value it produces helps those silos fade away. An institution-wide view of data, and of how it is managed, also facilitates the adoption of big data.
Mistrust of the 'black box'
Because of the way big data processing works, there is not always transparency into how results were generated. The complexity created by the volume, variety and velocity of the data often calls for different algorithms and statistical techniques than were used in the past. People need to be educated on how the data is used and processed, and shown the validity of the results. Once trust in big data is established, its use and adoption will increase.
Cost of adoption
Many equate big data with big cost. While full adoption and use of big data capabilities can require a significant investment, it needs to be looked at more holistically within the total cost of ownership equation. Big data can result in more streamlined operations, reduced labor consumption, improved quality control, and more efficient sales and customer channels. All of these can result in increased revenues and reduced costs.
Having a strategy, plan and understanding of how big data will be used will facilitate implementing it in the most efficient manner. Factor in that big data can provide near real-time insights that traditionally might take weeks or months to achieve, and the benefits should more than outweigh the costs.
Additionally, venturing into big data does not necessarily mean a huge upfront cost. Conducting a focused prototype allows you to dip your toe into the big data world to see if it's right for your institution.
In an ever-increasing global competitive landscape, financial institutions need to use all the tools at their disposal to create a competitive advantage. One of the most important assets institutions have is their data. Most institutions are only using a fraction of their data and, even then, do not use it to its full potential. Big data offers the ability to evolve the institution in almost every area.
Developing a strategy around the use of big data is the first step. Understanding how and where it will be used allows for a more efficient and effective implementation. Furthermore, getting started does not require a large upfront investment. Conducting a prototype to demonstrate the advantages is a great way to generate support for the use of big data capabilities. Doing nothing puts institutions at risk of continuing with suboptimal operations and even losing market share.