For decades, asset managers and their analysts and portfolio managers tended to sort themselves into one of two camps: fundamental and quantitative. Over the last few years, however, a new analytical approach to evaluating securities and other financial assets has gained traction. Known as the “quantamental” approach, it is already redefining how hedge funds and other asset managers manage their portfolios.
Bridging the fundamental/quantitative gap
As the name implies, quantamental investors and analysts combine fundamental and quantitative techniques, bringing the best of both worlds into one approach.
Fundamental analysts focus on bottom-up analysis of securities or companies, seeking the intrinsic value of a particular asset using inputs ranging from macroeconomic conditions and industry performance to management quality and financial statements. Quantitative analysts, or “quants,” focus on highly technical evaluation of assets based on market data, such as price and volume. Fundamental factors like earnings, expenses, liabilities, and other balance sheet items rarely, if ever, come into play here.
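In practice, blending the two camps can be as simple as combining a fundamental signal and a quantitative signal into one ranking. The sketch below is purely illustrative; the tickers, weights, and metrics are invented for the example, not drawn from any real strategy or dataset.

```python
# Illustrative quantamental ranking: blend a fundamental signal
# (valuation, via inverse P/E) with a quantitative signal (price
# momentum). All tickers and numbers below are hypothetical.

def quantamental_score(pe_ratio, momentum, value_weight=0.5, mom_weight=0.5):
    """Higher score = more attractive under this toy blend."""
    value_signal = 1.0 / pe_ratio  # fundamental: cheaper is better
    return value_weight * value_signal + mom_weight * momentum

# Hypothetical universe: (ticker, trailing P/E, 12-month price momentum)
universe = [("AAA", 12.0, 0.08), ("BBB", 25.0, 0.20), ("CCC", 18.0, -0.05)]

# Rank the universe from most to least attractive
ranked = sorted(universe,
                key=lambda row: quantamental_score(row[1], row[2]),
                reverse=True)
```

A real quantamental model would use many more factors and data-driven weights, but the core idea is the same: fundamental and market-based signals feed one unified score.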
The quantamental paradigm was born largely of necessity. The sheer quantity of data available to investors today makes it possible to gain greater insight into the performance of a particular portfolio than ever before. To stay ahead of the curve, and in some respects just to keep up with it, hedge funds and asset managers will need to find ways to harness and use all of this data.
That’s easier said than done. There are two key problems facing shops that want to adopt quantamental methodologies:
- Gathering and harnessing reliable, up-to-the-minute data is an expensive and difficult endeavor. It gets even more challenging when the type of data needed is very specific to a particular situation and not readily available.
- Much of the data available today is unstructured and requires considerable time and programming expertise to get into a usable form. Many shops don’t have the resources to do this in-house, and those that do typically have portfolio managers ask quants or programmers to cleanse and organize the data. That can consume dozens of hours and drastically slow down building models and deriving insights from the findings, especially for problems with very niche parameters.
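To make the second problem concrete, here is a minimal sketch of the kind of cleansing work involved: normalizing inconsistent vendor figures into usable numbers. The field formats are invented for illustration; real data-cleansing pipelines handle far messier inputs.

```python
# Toy example of data cleansing: vendors often report the same figure
# in different formats ("$1,234.5M", "2.1B", " 987,000 "), which must
# be normalized before any model can use it.

def parse_money(raw):
    """Turn strings like '$1,234.5M' or '2.1B' into a float in dollars."""
    multipliers = {"K": 1e3, "M": 1e6, "B": 1e9}
    cleaned = raw.strip().lstrip("$").replace(",", "")
    suffix = cleaned[-1].upper()
    scale = multipliers.get(suffix, 1.0)
    if suffix in multipliers:
        cleaned = cleaned[:-1]  # drop the unit suffix before parsing
    return float(cleaned) * scale

raw_records = ["$1,234.5M", "2.1B", " 987,000 "]
revenues = [parse_money(r) for r in raw_records]
```

Multiply this by dozens of fields, hundreds of vendors, and shifting formats, and it becomes clear why unstructured data consumes so much analyst and programmer time.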
Finding tools to facilitate quantamental investing and analytics
Building out the infrastructure and hiring people with the expertise to support the quantamental approach can take well over a year and be cost-prohibitive. Financial institutions’ desire to use these techniques, however, has led to the development of platform and software solutions that make it quick and easy to harness and use data.
The Elsen nPlatform, for example, was designed as a platform on which applications that address these obstacles to quantamental research and analytics can be built, letting financial institutions quickly assemble applications that solve their specific problems.
Three key components of the nPlatform are Data Store, Processing Engine, and Intelligence Engine, which directly address the challenges posed by the quantamental approach.
- Data Store comes fully integrated with thousands of premium financial datasets from world-class data providers, all of which have been pre-cleansed and normalized so users can start pulling data and using it to perform analytics instantly. All of this data is stored in secure cloud databases and also allows financial institutions to “bring their own data.” Finding, cleaning, and accessing reliable, up-to-the-minute data is simple.
- Processing Engine is constructed on a massively parallel processing architecture, and is purpose-built for high-performance computing (HPC) to dramatically accelerate workflows. Again, using top-tier cloud infrastructure makes it possible to access incredible computing power - making a typical workflow up to 30-50x faster - without incurring the costs of building an in-house solution. Additionally, Processing Engine is flexible enough for both power users who want to directly access the engine and non-technical users who can rely on intuitive interfaces built on top of the engine.
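The speedup from a massively parallel architecture comes from fanning independent work units, such as parameter sweeps across many candidate strategies, out to many workers at once. The sketch below shows that pattern with a deliberately toy "backtest" on a local worker pool; it is not the nPlatform's actual engine or API, which distributes work across cloud infrastructure rather than local threads.

```python
# Fan-out pattern behind parallel backtesting: run many independent
# parameter sets concurrently instead of one at a time. The "backtest"
# here is a deterministic toy stand-in, invented for illustration.
from concurrent.futures import ThreadPoolExecutor

def toy_backtest(lookback):
    """Pretend strategy evaluation: return a synthetic score per parameter."""
    returns = [((day * lookback) % 7 - 3) / 100 for day in range(250)]
    mean = sum(returns) / len(returns)
    var = sum((r - mean) ** 2 for r in returns) / len(returns)
    return mean / (var ** 0.5 + 1e-9)  # Sharpe-like ratio, guarded against 0

lookbacks = list(range(5, 60, 5))  # 11 parameter sets to sweep

# Each backtest is independent, so they can all run in parallel;
# map() returns results in the same order as the inputs.
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(toy_backtest, lookbacks))

best_score, best_lookback = max(zip(scores, lookbacks))
```

A production engine would replace the thread pool with distributed processes or cluster nodes, which is where order-of-magnitude speedups on real workloads come from.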
- Intelligence Engine is an interface built on an algorithmic approach to data analysis, making it easy to uncover otherwise hidden insights. It allows applications built on the platform to present clear data visualizations and lets users automate key workflows. The reduction in time spent on manual processes allows users to run far more tests and strategies than before.
The emergence of solutions like the nPlatform is one signal of the growing quantamental trend. As these tools proliferate in the marketplace, financial institutions that adopt the approach are likely to come out ahead: making data easy to harness and use reduces time-to-insight and drives a more efficient research and analytics process.