ACCELERATE INSIGHT FOR FINANCIAL SERVICES:
REDUCE RISK. STOP FRAUD. INCREASE PROFITS.
Version 1
October 10, 2010
www.paraccel.com
© 2010 ParAccel, Inc. All rights reserved. ParAccel product names are trademarks of ParAccel, Inc. Other product names are trademarks of their respective owners.
TABLE OF CONTENTS
INTRODUCTION
MANAGING RISK IN AN UNCERTAIN MARKET
THE CORE ANALYTIC CHALLENGES
PARACCEL AT WORK IN FINANCIAL SERVICES
THREE PROVEN USE CASES FOR FINANCIAL SERVICES
FIS: SLAM THE DOOR ON CREDIT CARD FRAUD
A THREE-DIMENSIONAL APPROACH TO PERFORMANCE
A HIGH PERFORMANCE ANALYTIC FOUNDATION
DEPLOYMENTS THAT MAXIMIZE PERFORMANCE AND VALUE
SUMMARY
INTRODUCTION
Financial services organizations face immense hurdles in maintaining profitability, building competitive advantage, and meeting service levels for customers and in-house staff. All of these can be severely impacted by a widening variety of risk factors, including regulatory risk, criminal risk, and macroeconomic risk. Increased demands from customers, regulators, and shareholders are also driving financial services businesses to seek new ways to simultaneously sharpen investing prowess, minimize risk and fraud, and improve compliance and customer service at unprecedented rates.
For example, the recent global economic crisis has resulted in further regulatory pressure on financial institutions, such as investment and commercial banks. The Basel III Accord has introduced both increased capital requirements and countercyclical capital buffer requirements. The latter is especially onerous for banks as they try to predict how much capital must be kept in reserve across all stages of an economic cycle. Separately, the growth of electronic banking and commerce has driven a corresponding increase in the frequency of fraud. Meeting these increased risks and demands requires deep, timely analysis of data that accumulates relentlessly. This accumulation has begun to push many institutions past the performance limits of their existing data warehousing platforms, especially those that were not designed for complex analytics. Whether they are mortgage banks, investment banks, credit card providers, or other institutions within the industry, financial services organizations must perform "what-if" scenario analysis, identify risks, and detect fraud patterns. The advanced analytic complexity required often makes such analysis slow and painful, if not impossible, which in turn can result in sizeable losses.
ParAccel provides the core analytical performance to meet the complexity, agility, and scale requirements of high-performance analytics. ParAccel works with existing analysis tools to deliver the requisite performance for today's most demanding analytical challenges. It combines industry-leading performance, powerful analytic functionality, and the ability to begin deep analysis immediately, without "prepping" the data or setting up workarounds to improve response times.
MANAGING RISK IN AN UNCERTAIN MARKET
While each sector of the financial services industry faces somewhat different data analysis problems, several challenges are relatively common across the industry.
Regulatory Risk: Regulators demand stress testing to gauge the impact of negative events, such as unexpected geopolitical actions or natural disasters, on a financial organization's health. Traditional approaches have been shown to underestimate the frequency of high-risk events. To accurately predict the impact these events can have on an institution, one needs to be able to analyze the impact at all levels of granularity and aggregate it to the enterprise level. Stress testing provides an inadequate measure of risk when accurate information takes a long time to produce, or when only small sets of data are analyzed because it is otherwise too difficult to run deep, complex analyses on massive data sets.
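The granularity requirement above can be made concrete with a small sketch. Assuming a hypothetical table of instrument-level stressed losses tagged with a trading desk (the desk names, instruments, and figures below are invented), rolling exposure up to the enterprise level is a simple group-and-sum; the hard part in practice is doing it quickly over billions of rows:

```python
# Sketch: roll instrument-level stress-test losses up to desk and
# enterprise level. Desks, instruments, and figures are invented.
from collections import defaultdict

positions = [
    # (desk, instrument, stressed_loss_musd)
    ("rates",  "UST10Y", 1.2),
    ("rates",  "UST2Y",  0.4),
    ("credit", "CDX.IG", 2.7),
    ("credit", "ABX.HE", 5.1),
]

def roll_up(rows):
    """Aggregate stressed losses per desk, then to the firm level."""
    by_desk = defaultdict(float)
    for desk, _instrument, loss in rows:
        by_desk[desk] += loss
    firm_total = sum(by_desk.values())
    return dict(by_desk), firm_total

by_desk, total = roll_up(positions)
print(by_desk)   # per-desk exposure
print(total)     # enterprise-level exposure
```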
Whether they are testing for dynamic, volatile events that require quick reaction to a crisis – such as a natural disaster or a flash crash – or are legally obligated to respond to regulators, financial services businesses need precise, timely answers to their questions. Stress testing does far more than simply keep regulators at bay: it helps executives make better strategic decisions, and assures that their future plans will drive optimal profits with acceptable exposure to multiple sources of risk.
Corporate/Shareholder Risk: Shareholders, and corporations that rely on financial institutions for their operational and investment capital, no longer accept claims of liquidity and financial health at face value. For shareholders this can have a material and immediate impact: the Basel III Accord states that banks that cannot meet regulatory requirements cannot pay dividends. Shareholders therefore have elevated expectations that financial institutions will avoid unduly risky investments that could crater profits and impact dividends.
Consumer Risk: Individual customers expect that financial institutions will transparently protect them from fraud. Banks and credit card providers need to detect fraudulent transactions without interfering with consumers' ability to carry on their day-to-day banking and other financial activities. Without precise answers based on actual data, financial organizations are more likely to generate false positives for fraud activity, wasting consumers' time and creating negative perceptions about service. Worse, if fraudulent transactions are not caught in time, they can permanently impact consumers' credit scores – and the banks' ability to stay in business. For example, studies show that in 2007, bank losses totaled about $2.9 billion, with the largest share of losses associated with credit cards, followed by checks, debit cards, and ACH payments. The high cost of preventing fraud and complying with regulatory and network security standards cost an additional $3.1 billion per year. During the same time frame, merchant fraud losses totaled about $15.6 billion per year, while merchant spending to prevent fraud reached $5 billion. Add it all up and the figures are staggering: $26.6 billion per year.1
Competitive Risk: Traditional approaches to analytics do not allow for open, iterative analysis. This degrades the quality and timeliness required to build and maintain competitive advantage. Understanding and accessing data has evolved from a unique differentiator to an absolute competitive necessity. The ability to get answers into the hands of customers and traders before the competition has even finished loading data provides a distinct source of advantage that is somewhat akin to legalized "insider trading." For financial institutions, the ability to deliver accurate insights in seconds and minutes provides an immediate and quantifiable boost to both profit and revenue. Over time, this also enables them to build more lucrative, preferential relationships with customers.
1. Federal Reserve Bank of Kansas City, "Can Smart Cards Reduce Payments Fraud and Identity Theft?", September 2008.
THE CORE ANALYTIC CHALLENGES
Across all sectors of the industry, financial services firms share three common analytic challenges.
Managing and processing extremely large volumes of data – The analytical complexity and size of the data stores that financial services businesses must manage can become major roadblocks to addressing the regulatory, shareholder, and competitive concerns detailed above. For example, in 1993, the New York Stock Exchange recorded less than one million quotes/orders per day. By 2008, that number had grown to more than one billion quotes/orders every single day. Adding in the sheer volume of customer transactions, the continuous stream of real-time tick data, and the amount of data that must be saved for regulatory compliance can make it especially difficult for financial institutions to keep up with the information and analytic workload.

Data growth is already impacting the ability of institutions to maintain the performance of their data warehouses at current levels. Gartner's 2010 survey of data warehouse customers revealed that almost 70% of customers had already hit some sort of performance limit.

Running highly complex, open, and iterative analyses – Traditional analytic platforms were largely designed to let users ask standard queries on "prepped" data. They are simply not equipped to handle the complex analytics and timeliness required to make sense of that data. Traditional analytics approaches are falling even further behind in their ability both to scale their analysis to match the growth in data volumes and to run increasingly sophisticated analyses. Costly customization is often required to get around inherent performance limitations. These laborious workarounds often inflate the initial resource requirements and introduce time lags before the core analytics processing can even begin. They also restrict the types of analyses that can be done quickly, potentially overlooking key data points.
Accessing – and acting on – time-sensitive data before it loses value – Ironically, while an increasing amount of data must be retained for regulatory compliance, that same data also represents a key source of value depreciation. Financial institutions need faster data access and analysis to retain their business advantage. They must be able to ask their questions, their way, and get the answers they need – fast. The same level of insight is far more valuable in seconds than in minutes or hours. This enables firms to gain and retain even more value from core underlying data before it depreciates.

PARACCEL AT WORK IN FINANCIAL SERVICES

One of the world's premier financial institutions is using ParAccel to sharpen its trading edge. Before implementing PADB, the organization was purchasing mortgage data, loading it into Microsoft SQL Server, and then running reports from the database to determine the most lucrative trading strategies for its investment portfolio. However, analysis of the data was taking almost a full workweek to run – a significant disadvantage for a business whose profits depended on real-time investment decisions. In addition, the financial institution could not pursue ad hoc analysis, so it was forced to rely on a limited number of standard queries. After choosing ParAccel, the time frame for queries was reduced from four days to less than ten minutes.

PADB is at the heart of their success. They now know more about mortgage bonds than any of their competitors. Their traders make more profitable decisions before others have even finished loading and analyzing the data. Customers can get access to innovative (and custom) analysis days before it is available from anyone else, making the bank the preferred destination for trading mortgage bonds.

ParAccel is also helping financial services businesses stay ahead of stress-testing requirements. For example, a top-five investment bank needed to forecast risk exposures in the daily books of its traders, for both executives and regulators, using scenario analysis and stress testing. The bank had to model its analysis using highly complex scenarios based on fast-changing, dynamic industry trends to determine how best to manage exposure to risk. However, its existing relational database management systems could not perform these tests at the speeds required for true risk analysis.

For more than a year, the bank searched in vain for a high-performance analytics platform that could meet the business-critical requirements necessary to match increasingly stringent regulatory oversight. Finally, after thoroughly validating the advanced performance of PADB, it chose ParAccel as its best-of-breed platform for analytics databases. ParAccel gave the bank the ability to perform scenario analyses and stress testing all the way from individual instruments and trading-desk exposure up to the global institution level. Now the bank can confidently report on risk across all parts of the firm, regardless of the level of data aggregation. PADB's high-speed parallel loader allowed the bank to analyze the most current data available in conjunction with historical data. Not having to preprocess the data meant that analysts could immediately iterate across a large number of scenarios, and then ask deductive questions that helped them analyze the end result of any given scenario.

FIDELITY INFORMATION SERVICES: SLAM THE DOOR ON CREDIT CARD FRAUD

FIS is one of the world's top-ranked technology providers to the banking industry. Its customers rely on FIS for deep, complex analyses of credit card transactions in order to decrease fraud losses while minimizing the impact on valid cardholders. FIS's data volume is expected to increase by 50 percent in 2011, reinforcing its need for an analytics solution that can scale performance both predictably and efficiently.

However, the existing data warehouse environment at FIS was only suitable for predictable, static data queries, whereas FIS's fraud analytics required the ability to create ad hoc queries and receive answers in near-real time. Fraud activity moves fast and fraud patterns change quickly, which means activity must be detected almost as soon as it starts in order to minimize losses. FIS's existing database was taking hours to respond to queries – far too long for the fraud-detection needs of credit card providers.

To meet the speed and complexity requirements of its customers, FIS needed a solution that could handle rapidly growing data volumes and process highly complex queries almost immediately: one that offered massively parallel processing (MPP), extremely high data access speeds, and complex querying ability. ParAccel was one of only two vendors that FIS chose to evaluate in depth, and FIS eventually chose ParAccel after determining that PADB offered the best scalability and performance under high data loads.

ParAccel enables FIS to stay one step ahead of the criminals. With PADB, analytic queries now take less than one minute, giving analysts the ability to identify fraud more accurately, even when the patterns of criminal behavior shift rapidly. Fast access to this data, and the ability to analyze it in complex ways, helps save FIS's banking customers millions of dollars and gives the company a distinct competitive advantage in its market.
THREE PROVEN USE CASES FOR FINANCIAL SERVICES
The following are examples of how three firms were able to quickly unlock the value inherent in their data through rapid, iterative, complex analytics.
Trading Analytics: Financial intelligence through ad hoc, iterative analysis
Traders and investors in bond and mortgage trading rely heavily on skilled analysts at trading desks to provide them with fast, relevant answers to their questions about investment vehicles and their associated risk profiles. In these markets, seconds and minutes matter – the answers must arrive in time to make a difference. Key investment and allocation decisions need to be based on the most current financial intelligence available (including third-party sources) to ensure that the risk and return profiles of the securities best match the needs of any given portfolio. Financial services organizations that lack analytic agility are much less likely to earn the trust or meet the high expectations of their clients. This can impact not only short-term profitability, but also long-term survival. Financial institutions fully recognize these challenges, yet many allow their teams' performance to be hindered by legacy analytics solutions that do not support rapid and complex analysis across vast and ever-growing data stores. Analysts working at trading desks need the support of a high-performance analytics solution that offers the flexibility for creative, iterative lines of questioning as well as the core analytical power required for complex analysis.
Fraud Analytics: Identify patterns of fraudulent credit card usage

Identifying potentially fraudulent credit card activity requires the ability to correlate reported alerts or exceptions in real time to provide a consolidated view of fraudulent card activity. By correlating these alerts in real time, banks can conduct further analysis to determine whether they represent isolated incidents or ongoing patterns of fraud. In other words, any one of the individual alerts might not appear significant, but when analyzed together they can become significant enough to require action. This advanced set of analytic capabilities – which far exceeds the basic rules-engine capabilities of many point solutions and typically involves vast amounts of real-time data from multiple sources – needs to be in place to identify fraudulent activity more accurately. Advanced analytics include techniques that can detect outlier patterns, identify complex sequences, discover and analyze links, and perform complex event processing.
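As an illustrative sketch of the correlation idea – with invented alert data, scores, and thresholds, not any vendor's actual logic – individually weak alerts on the same card, taken together within a short window, can cross a threshold that no single alert would:

```python
# Sketch: correlate individually weak fraud alerts per card within a
# time window. Alert types, scores, and threshold are invented.
from collections import defaultdict

alerts = [
    # (card_id, timestamp_min, alert_type, score)
    ("card-1", 0,  "odd_merchant_category", 0.2),
    ("card-1", 3,  "geo_velocity",          0.4),
    ("card-1", 5,  "amount_outlier",        0.5),
    ("card-2", 10, "amount_outlier",        0.3),
]

WINDOW_MIN = 15   # correlate alerts within a 15-minute window
THRESHOLD  = 1.0  # combined score that triggers action

def correlate(alert_stream):
    """Group alerts by card; flag cards whose combined score within
    the window exceeds the threshold even if no single alert does."""
    by_card = defaultdict(list)
    for card, ts, kind, score in alert_stream:
        by_card[card].append((ts, score))
    flagged = {}
    for card, items in by_card.items():
        start = min(ts for ts, _ in items)
        total = sum(s for ts, s in items if ts - start <= WINDOW_MIN)
        if total >= THRESHOLD:
            flagged[card] = round(total, 2)
    return flagged

print(correlate(alerts))  # only card-1 crosses the threshold
```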
Fraudulent activity is gaining pace as criminals try to stay one step ahead. The ability to stay on top of rapidly shifting patterns of fraudulent transactions enables institutions to dramatically limit their downside risk without impacting their customers.
Risk Analytics: Multidimensional approach to integrated risk management
Risk comes in many forms: operational, financial, market, credit, liquidity, and more. Financial institutions are not only exposed to a wide variety of risks, but are also required to produce proof of institutional viability under a range of scenarios. The open-ended nature of this analysis can immobilize traditional analytical platforms, leading to a lack of clarity regarding the degree of risk (and the associated return).
Under the rules of the Basel III Accord, institutions that are unable to provide accurate reports of their risk exposures will be barred from paying dividends to their shareholders. Financial institutions therefore need to be able to analyze risk at any level of granularity: from a global institutional perspective down to individual instruments, trading desks, or national/legal entities. The analysis requires an open-ended approach that enables companies to scale their analysis across a wide variety of high-risk events without being confined to a limited set of parameters. Whether they are testing for dynamic, volatile events that require quick reaction to a crisis – such as a natural disaster or a stock market decline (flash crash) – or are legally obligated to update or respond to regulators, financial services businesses need precise, rapid answers. Stress testing does far more than simply keep regulators at bay: it helps executives make better strategic decisions, and determines whether their future plans will drive optimal profits with acceptable exposure to multiple sources of risk.
Figure 1: Stress Test
A THREE-DIMENSIONAL APPROACH TO PERFORMANCE
Laborious workarounds and tuned hardware address only one dimension of performance – speed – providing a useful but limited answer to the new generation of analytical challenges. Advanced analytics requires a three-dimensional approach, incorporating speed, analytical complexity, and an unrestricted framework for analysis. Institutions that take a three-dimensional approach to analysis are able to unlock the maximum value from underlying data, using a two-way, "conversational" approach to their toughest questions.
First, speed is the starting point for scaling performance – both for limiting downside risk and for zeroing in on the best choices for key investments. Identifying fraudulent activity before it happens can prevent millions of dollars in losses; accurately pricing a buy or sell opportunity 30 seconds faster than the competition can mean the difference between posting a profit and recording a loss.
Second, another aspect of analytic performance is the ability to follow an open line of questioning that is not limited by the structure of the data. Analyses must be able to include more than a limited subset of analytic parameters – the analytical choices need to be driven by analysts, not the underlying database. Deep, complex analytics can quickly overload traditional databases built to process transactional data. Such databases are not optimized for today's analytic requirements and must rely on cumbersome workarounds and/or finely tuned hardware. The former involve laborious database tuning; the latter involves the purchase of expensive proprietary hardware. Both stopgaps are clumsy, static approaches that strive to paper over the lack of core analytical performance.
The third aspect is the ability to scale analytical complexity. This is separate from the ability to process larger sets of data or to include additional parameters in the analysis. Analytic complexity enables firms to expand the range of analysis to incorporate increasingly advanced statistical techniques and algorithms. These can range from cross-industry requirements to requirements unique to financial services, and can be further expanded to include proprietary algorithms and techniques unique to a specific institution or trading desk.
Figure 2: Performance across Both Complexity and Speed
A HIGH PERFORMANCE ANALYTIC FOUNDATION
The ParAccel Analytic Database (PADB) takes a dramatically different approach to performance. Unlike traditional relational databases, which have been built and tuned primarily to manage transactional workloads, PADB has been architected from the ground up to deliver across all three dimensions of performance. It allows users to get past the heavy-lifting grunt work and start on the high-value "detective work."
Built from the ground up to deliver on the promise of speed. Unlike other next-generation alternatives, PADB provides a true end-to-end columnar approach that eliminates bottlenecks all the way down the stack. A column-based approach dramatically reduces the amount of data that needs to be retrieved from physical disk, especially when compared to row-based relational databases such as Oracle, SQL Server, or DB2. Built-in column-oriented data compression automatically compresses the data in each column based on its characteristics. ParAccel builds on this columnar orientation with a custom communication protocol that provides 10x faster communication between nodes than standard TCP/IP. PADB also (re)defines queries on the fly, compiles them to reduce the workload on the processors, and leverages the patent-pending OMNE optimizer to ensure that data is retrieved at the fastest speeds possible.
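The I/O saving behind a columnar layout can be illustrated with a toy example (invented schema and data; real columnar engines add compression and vectorized execution on top): a query that touches one column of a wide table only needs to scan that column's values, not every field of every row.

```python
# Toy illustration of why columnar layout reads less data.
# Schema and data are invented for illustration.

rows = [
    {"txn_id": i, "amount": 100.0 + i, "merchant": f"m{i % 7}",
     "country": "US", "channel": "card"}
    for i in range(1000)
]

# Row store: computing SUM(amount) scans every field of every row.
row_cells_touched = sum(len(r) for r in rows)   # 5 fields x 1000 rows

# Column store: the same query touches only the 'amount' column,
# which was laid out contiguously at load time.
amount_column = [r["amount"] for r in rows]
col_cells_touched = len(amount_column)          # 1000 values

total = sum(amount_column)
print(row_cells_touched, col_cells_touched)     # 5000 vs 1000 cells
```

The same ratio drives the disk-I/O reduction: a five-column row store reads five times the data this query needs, and the gap widens with wider tables.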
PADB enables analysts to ask any question, in any way, at any time. Its "load and go" approach eliminates the need to "prep" the data first. Other databases, even some next-generation databases, rely on costly workarounds such as materialized views, which are used when certain standard queries are known in advance; in those cases the results are precalculated to provide faster response. Besides requiring additional storage, this approach constrains analysis to a limited subset of variables, and any requirement to include additional variables or data from different sources introduces a costly lag while the data is recalculated. The OMNE optimizer also minimizes the need for advanced SQL skills. It can use a cost-based or rule-based approach to ensure that any SQL query, no matter how complex or convoluted, is completed in the fastest way possible. This ensures that power users don't spend countless hours fine-tuning queries and can instead spend more time on analysis.
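The materialized-view trade-off can be sketched with a toy example (invented table and figures): the precomputed answer is fast only for the grouping it was built for, and any new dimension forces a rescan of the base data.

```python
# Toy sketch of the materialized-view trade-off. Data is invented.
from collections import defaultdict

trades = [
    ("desk-a", "US", 10.0),
    ("desk-a", "UK",  5.0),
    ("desk-b", "US",  7.0),
]

# "Materialized view": totals by desk, precomputed once at load time.
mv_by_desk = defaultdict(float)
for desk, _country, notional in trades:
    mv_by_desk[desk] += notional

# Known-in-advance query: instant lookup against the precomputed view.
print(mv_by_desk["desk-a"])

# New ad hoc dimension (country): the view cannot answer it, so the
# base data must be rescanned - the costly lag described above.
by_country = defaultdict(float)
for _desk, country, notional in trades:
    by_country[country] += notional
print(by_country["US"])
```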
The ParAccel Analytic Database can scale along with your data and analysis. PADB has a shared-nothing (MPP) architecture (shown below) that provides near-linear scalability to match growing analytic requirements. New compute nodes can be added modularly to ensure that complex queries against vast amounts of data are executed within a reasonable timeframe. A column-based approach also dramatically cuts the amount of time needed to retrieve individual sets of data from the database, providing orders-of-magnitude better performance, especially on large data sets. With traditional analytic approaches, large data sets require painful database tuning and data prepping before any analysis can even be executed. This not only delays time-to-analysis, but also results in bloated storage requirements, where 1 TB of "raw" data might actually require 3 TB of storage capacity.
Figure 3: PADB Shared Nothing Architecture
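The shared-nothing pattern behind Figure 3 can be sketched in a few lines (toy code with an invented node count and data; PADB's actual execution engine is far more sophisticated): rows are hash-partitioned across nodes, each node aggregates only its own slice, and the small per-node partials are merged, which is why adding nodes scales near-linearly.

```python
# Toy sketch of shared-nothing parallel aggregation: hash-partition
# rows across nodes, aggregate locally, then merge partial results.
# Node count and data are invented for illustration.

N_NODES = 4
transactions = [("acct-%d" % (i % 10), float(i)) for i in range(100)]

# 1. Distribute: each row lands on exactly one node (hash of its key).
nodes = [[] for _ in range(N_NODES)]
for key, amount in transactions:
    nodes[hash(key) % N_NODES].append((key, amount))

# 2. Local aggregation: each node sums only its own partition
#    (in a real cluster these run in parallel on separate machines).
partials = [sum(a for _, a in part) for part in nodes]

# 3. Merge: combine the small per-node partials on a leader node.
grand_total = sum(partials)
print(grand_total)  # equals the serial sum over all rows
```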