For more than 25 years, exchange-traded funds (ETFs) have transformed the global investment landscape. Born of the presumed efficiency of US stock markets and of advances in computing, these low-cost investment vehicles now feature in many institutional portfolios, and investors use them well beyond their original US equity focus. These liquid beta instruments aim to replicate the performance of selected international stock market indices, commodity benchmarks and fixed income universes. Given the continued growth in assets allocated to ETFs, particularly ESG-oriented ones, this blog takes a look at some of the analytical implications for the asset allocation and risk management processes of institutional investors.
For asset managers, hedge funds and fund-of-funds specialists investing in ETFs, sourcing detailed daily data is a significant operational burden. This often leads investment professionals to limit the scope of their data to top-level ETF figures. Beyond net asset values, these figures may include aggregate exposures such as sector, country and rating weights, beta, duration and historical volatility. This reliance on “top-down” data has implications for investment modeling choices and depth of analysis. ESG attributes and fund-level statistics can be added to support useful descriptive analyses and risk measurement frameworks. From an investment point of view, the main advantage of this simplified modeling approach lies in its consistency: index allocation decisions and ETF strategy implementation are well aligned. However, limitations quickly appear. Depending on the regulatory environment, the lack of look-through information can translate into higher capital charges. For example, insurance companies operating under the European Solvency II regulation can benefit from increased data transparency: in the absence of granular position detail, their holdings in G7 equity ETFs may attract the standard “Type 2” equity shock, whereas more granular data could qualify the same holdings for the less costly “Type 1” equity stress.
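To make the top-down approach concrete, the following sketch represents each ETF only by published aggregate statistics and estimates scenario P&L to first order. All names, positions and shock sizes are illustrative assumptions, not data from the original article:

```python
# Minimal sketch of "top-down" ETF modeling: each fund is represented only by
# aggregate statistics (illustrative names and numbers, not real data).
from dataclasses import dataclass

@dataclass
class EtfAggregate:
    name: str
    market_value: float   # position market value
    equity_beta: float    # beta to a broad equity index
    duration: float       # modified duration (0 for pure equity funds)

def shock_pnl(etf: EtfAggregate, equity_shock: float, rate_shift_bp: float) -> float:
    """First-order P&L estimate: beta times equity move, minus duration times yield shift."""
    equity_pnl = etf.market_value * etf.equity_beta * equity_shock
    rate_pnl = -etf.market_value * etf.duration * rate_shift_bp / 10_000
    return equity_pnl + rate_pnl

portfolio = [
    EtfAggregate("Global Equity ETF", 1_000_000, equity_beta=0.95, duration=0.0),
    EtfAggregate("Aggregate Bond ETF", 2_000_000, equity_beta=0.05, duration=6.5),
]
total = sum(shock_pnl(e, equity_shock=-0.20, rate_shift_bp=100) for e in portfolio)
print(f"Estimated P&L under a -20% equity / +100bp rates scenario: {total:,.0f}")
```

The limitation the article describes is visible here: with only fund-level beta and duration, there is no way to see which issuers, sectors or rating buckets drive the loss.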
During the run-up to Solvency II before 2016, European insurance companies reviewed their data flows and the operational governance around them. They also evaluated suitable options for modeling risk-bearing assets. Driven by actuarial considerations, many firms decided to use curve-fitting techniques to build proxy models of the assets and liabilities across their organizations. In this context, proxy instruments represent a good compromise between data provisioning, the relevance of economic sensitivities and the computational cost of large-scale risk simulations. Defined as polynomials of varying degrees, these proxy instruments capture the exposures of external funds to market risk factors. These factors can include equities, interest rates, credit, commodities, inflation and more. An advantage of applying flexible curve-fitting techniques to ETFs is that users can compute aggregate sensitivities and run advanced scenario analysis on the resulting proxies. For example, users can check the sensitivity of a bond ETF to widening credit spreads, shifts in government yield curves, or changes in the inflation curve. Unlike the single-instrument approach described earlier, these passive fund proxies are compatible with complex Value-at-Risk attribution schemes: risk managers can attribute ETF-proxy simulation results to equity, government bond, corporate credit, inflation, foreign exchange, commodity and volatility factor blocks. Predictive power comes at a price, however; curve-fitting processes require careful calibration, and proxies can quickly show their limitations when factor simulations move far outside the range of the calibration data.
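A polynomial proxy of this kind can be sketched in a few lines: fit a degree-2 polynomial of ETF P&L in two market risk factors by least squares, then evaluate it under a stress scenario. The factor choices, coefficients and data here are synthetic assumptions for illustration, not the calibration method of any specific vendor:

```python
# Illustrative curve-fitting of a proxy instrument: a degree-2 polynomial in two
# risk factors (equity return, credit spread change) fitted to simulated ETF P&L.
import numpy as np

rng = np.random.default_rng(0)
n = 500
equity = rng.normal(0.0, 0.05, n)    # equity factor returns
spread = rng.normal(0.0, 0.002, n)   # credit spread changes
# "True" ETF P&L with some curvature plus noise (stand-in for full revaluation)
pnl = 0.8 * equity - 40.0 * spread - 2.0 * equity**2 + rng.normal(0, 1e-4, n)

# Design matrix for a quadratic polynomial proxy
X = np.column_stack([np.ones(n), equity, spread, equity**2, spread**2, equity * spread])
coef, *_ = np.linalg.lstsq(X, pnl, rcond=None)

def proxy(e: float, s: float) -> float:
    """Evaluate the fitted polynomial proxy at a scenario (e, s)."""
    return float(coef @ np.array([1.0, e, s, e**2, s**2, e * s]))

# Scenario analysis on the proxy: e.g. -10% equity, +50bp spread widening
print(f"Proxy P&L under stress: {proxy(-0.10, 0.005):.4f}")
```

The calibration caveat from the text applies directly: this proxy is fitted on factor moves of roughly ±15% equity and ±60bp spread, so evaluating it far outside that range extrapolates the polynomial and can produce unreliable numbers.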
The real picture
From a risk management perspective, the real picture of a fund combining long and short stocks, futures, fixed income positions and ETFs across asset classes can only be obtained using a “look-through” approach. This means modeling individual ETF positions at the most granular level within a consistent analytical framework, which requires pricing libraries, simulation capabilities and fund-of-funds aggregation. For most investment organizations, the functional benefits of this approach can be expected to far outweigh the marginal operational data costs. It first allows front- and middle-office stakeholders to confirm net and gross exposures to single names across their entire fund range.
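The single-name check described above can be sketched as a simple look-through aggregation: drill into each fund's constituent positions and roll net and gross exposure up by issuer. The holdings below are invented for illustration:

```python
# Minimal look-through sketch: aggregate net and gross exposures to single
# names across several funds with overlapping holdings (illustrative data).
from collections import defaultdict

# fund -> list of (issuer, signed market value); negatives are short positions
holdings = {
    "Fund A": [("ACME Corp", 150_000.0), ("Beta Inc", 80_000.0)],
    "Fund B": [("ACME Corp", -60_000.0), ("Gamma Ltd", 40_000.0)],
}

net = defaultdict(float)
gross = defaultdict(float)
for positions in holdings.values():
    for issuer, mv in positions:
        net[issuer] += mv        # longs and shorts offset
        gross[issuer] += abs(mv) # total economic footprint

for issuer in sorted(net):
    print(f"{issuer}: net={net[issuer]:,.0f}  gross={gross[issuer]:,.0f}")
```

Note how the long position in Fund A and the short in Fund B partially offset in the net figure while both add to the gross figure, which is exactly the distinction front- and middle-office users need to confirm.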
Fig. 1 – Monte Carlo simulations on an example of a US equity ETF (single instrument vs “look-through”)
As shown in the chart above, ETF look-through data combined with advanced simulations can help investment and risk management teams study the tail risk of the return distribution and diversification effects. Armed with this information, teams can implement well-targeted derivative overlays where needed. Asset allocators can also use look-through data to assess equity style tilts in their ETF investments and better manage unintended drift toward popular factors such as size, value, growth and momentum.
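The tail-risk statistics such a simulation produces can be sketched as follows: draw a large number of simulated portfolio returns and compute Value-at-Risk and expected shortfall at the 99% level. The fat-tailed Student-t return distribution and its parameters are assumptions chosen for illustration, not calibrated to any real ETF:

```python
# Sketch of tail-risk statistics from Monte Carlo simulated portfolio returns:
# 99% Value-at-Risk and expected shortfall, on a synthetic fat-tailed sample.
import numpy as np

rng = np.random.default_rng(42)
returns = 0.01 * rng.standard_t(df=4, size=100_000)  # fat-tailed daily returns

losses = -returns
var_99 = np.quantile(losses, 0.99)         # loss exceeded on 1% of days
es_99 = losses[losses >= var_99].mean()    # average loss beyond VaR (CVaR)
print(f"99% VaR: {var_99:.4f}, 99% ES: {es_99:.4f}")
```

In a look-through setting, the `returns` vector would come from revaluing every constituent position under each simulated scenario, so the same two statistics can then be attributed back to individual names and factor blocks.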
Financial news around the world reminds risk and investment management professionals that aggregate exposure information may be sufficient on most business days. When it comes to risk management, however, granular information matters every day. In today’s cost-conscious business environment, is it time for more financial institutions to assess how data granularity decisions might impact their investment and risk management processes?
To learn more about SS&C Algorithmics’ financial risk management solutions, please visit our product page.