The financial services industry has long been at the vanguard of innovative data analytics, leveraging large volumes of data (e.g. credit and debit card transactions, tick data) to pinpoint trading opportunities, better understand customers and manage fraud more efficiently. However, things are different today. The sector is experiencing an acute spike in new data types and sources, creating exploding volumes of multi-structured big data that cannot be stored simply and cost-effectively in conventional data management systems. Perhaps more importantly, these increasingly voluminous, varied and complex data stores resist insightful processing and analysis with conventional tools.
Beyond maximizing profits and improving business processes, financial services firms face an additional regulatory incentive to drive insightful data analytics. Since the Great Recession of 2007-2009, regulators have obligated financial services firms to provide greater transparency and stronger risk mitigation to allay concerns about the safety of capital. Hence, one of the biggest IT pain points financial services firms face today is taming this big data conundrum, that is, processing and analyzing ballooning volumes of multi-structured data at varying speeds, in order to comply with a bevy of more rigorous regulations such as Dodd-Frank (including its OTC derivatives rules) and Basel III.
As a result, according to the CEB TowerGroup “Capital Markets: Top 10 Technology Initiatives for 2012” report, analysts expect 2012 to be the year when “[capital markets] firms begin to employ big-data techniques to address problems in risk management, regulatory compliance, and portfolio analytics.” A 2010 Gartner data management survey of banks and investment services firms further underscores the potential of big data analytics in the sector: only 33% of respondents gave the quality of their data for supporting business intelligence and management decision making a high rating, suggesting substantial room for investment in analytics infrastructure.
Thus, realizing the promise of big data analytics, and staying ahead of market conditions, regulators and competitors, will require a technological paradigm shift and new architectural approaches. One such approach is to build out massively parallel grids for big data analytics using platforms that support open, iterative, interactive and complex analytics on vast and diverse datasets. The ParAccel Analytic Platform and Dell’s portfolio of server, storage, and networking solutions together provide such a platform. Deployed as part of an optimized data center architecture, it delivers a powerful and flexible analytic infrastructure that maximizes business and IT agility, minimizes TCO, and delivers ROI at scale, all while fundamentally transforming the way customers do business.
Today, Dell and ParAccel are hosting a conversation on the growing salience of big data analytics within financial services and on the latest technology advancements enabling it. The event will feature speakers from Dell, ParAccel, Oktay Technology, Tabb Group, STAC, LiquidNet and Executive Networks. I'll be sure to share highlights from the conversations in a future blog post.