Translating Market Intelligence into Alpha – Nuant CEO on the DAS: London panel
November 1, 2022
A core advantage of Nuant’s digital asset portfolio management solution is the ability to tap into rich market intelligence and gain insights from real-time market data and on-chain data metrics & analytics. However, access to data is not an end unto itself – effective portfolio management requires contextualising data into market intelligence and subsequently translating and curating these insights into alpha.
Nuant CEO Rachid Ajaja recently attended DAS: London, one of Europe’s flagship events for institutional digital asset professionals, where he participated in a panel discussion titled “Translating Market Intelligence into Alpha.” The video is available in full below. This article summarizes the main points that came up during the conversation.
On the panel with Rachid were Ambre Soubiran, CEO at Kaiko; Alain Kunz, Head of European Sales at GSR; and Laura Vidiella, VP of Business Development and Strategy at Ledger Prime. The panel was moderated by Sam Martin of Blockworks.
The role of on-chain data
The first question put to the panel concerned the nature and importance of on-chain data. Ambre Soubiran opened the discussion with an explanation of how elements of market data have moved on-chain thanks to the advent of DeFi.
Rachid explained that when institutions first began entering the digital asset space around 2018, their arrival created demand for institutional-grade market data. However, since the advent of DeFi around 2020, these traditional data providers no longer capture a full picture of the market, because DeFi transactions happen on-chain. It has therefore become necessary to consider on-chain as well as off-chain data when making market-related decisions.
What kind of institutional clients use on-chain data, and how? Rachid explained there are two main categories: traditional portfolio managers who have entered digital assets, and crypto-native funds. He outlined how the traditional category is not necessarily interested in DeFi, but that on-chain data is nevertheless a consideration in their decision making. For example, a large transfer of funds between on-chain wallets could have an impact on price. However, if a data provider only consolidates inputs from centralized exchanges, such movements are neither captured nor explained.
Unsurprisingly, there is substantial interest in on-chain data from crypto-native funds, which tend to be more involved with the decentralized finance space. Laura Vidiella elaborated that Ledger Prime used on-chain data to verify the health of a protocol or assess the liquidity of assets using a holistic on-chain/off-chain approach.
Raising the quality
The conversation moved on to the overall quality of on-chain data and how it can be digested and presented in a format that can be read and understood by both crypto-native and non-crypto-native audiences.
Here, Ambre Soubiran explained that obtaining real-time, high-quality data at scale remains an ongoing technical challenge for the industry. Rachid concurred that there is a significant challenge in presenting the right amount of data with the necessary depth of insight in a way that is easily digestible by a cross-section of industry players who may have varying needs and levels of understanding regarding on-chain data.
Ambre Soubiran also highlighted the role of on-chain data in identifying upcoming liquidity challenges, as decentralized exchanges with lower liquidity tend to be affected first in a liquidity crisis. However, once again, drawing these insights requires an overview of price and market data in combination with on-chain transactions.
Alain Kunz and Rachid raised the point that the DeFi space had a huge opportunity to leverage on-chain data to be able to backtest and stress-test strategies in the same way as analysts in traditional finance do using market data. Having the opportunity to perform this kind of testing on DeFi protocols would allow for more accurate risk assessment and encourage more responsible development in the sector.
Transparency – a double-edged sword
The moderator then posed a question regarding the potential risks of having so much on-chain data available publicly and whether this could create any risk for professionals moving funds in such a visible way. Laura Vidiella acknowledged that it could become troublesome if an address were identified as belonging to an individual or entity, as any on-chain movements can raise questions.
However, Rachid pointed out that many of the recent high-profile collapses seen in crypto, including Celsius and Three Arrows Capital, could have been avoided if the operators had disclosed their on-chain addresses. Alain Kunz drew the conversation back to the point that the ability to forecast these kinds of events still depends on the availability and quality of data and, more importantly, the ability to interpret the data.
Laura concurred, highlighting the irony that even though blockchain offers unparalleled visibility into transactions and events, the sheer volume of data and extent of analysis needed to interpret it creates a degree of opacity.
The case for standardization
Sam Martin queried the panel on their thoughts on the lack of standardization when analyzing and presenting data. Rachid acknowledged the challenge here, and illustrated the difficulty involved in getting a single accurate representation of an asset's price in a portfolio when the real price is fluctuating across different venues. Even a small lag can result in a loss of alpha. This leaves a gap in the institutional space: helping decision makers understand the opportunities and limitations of any given set of data and how they can use it.
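To make the aggregation problem concrete, one common approach to deriving a single consolidated price from multiple venues is a volume-weighted average. The sketch below is purely illustrative of that general technique, not a description of Nuant's or any panelist's actual methodology:

```python
def consolidated_price(quotes: list[tuple[float, float]]) -> float:
    """Volume-weighted average price (VWAP) across trading venues.

    quotes: list of (price, volume) pairs, one per venue, sampled
    at the same instant. Venues with more volume pull the
    consolidated price toward their quote.
    """
    total_volume = sum(volume for _, volume in quotes)
    if total_volume == 0:
        raise ValueError("no volume reported across venues")
    return sum(price * volume for price, volume in quotes) / total_volume


# Two venues quoting slightly different prices:
# the higher-volume venue dominates the consolidated figure.
vwap = consolidated_price([(100.0, 2.0), (102.0, 1.0)])  # ≈ 100.67
```

Even with a scheme like this, the lag Rachid mentions remains: if the per-venue quotes are stale or sampled at different times, the consolidated figure misrepresents the real market.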
Ambre Soubiran elaborated that these considerations took on even more weight when considering that much activity can be automated. However, automation hinges on whether the underlying data is reliable and accurate.
Rachid extrapolated this point further with reference to an example in decentralized finance. Liquidity providers on decentralized exchanges, used by many crypto-native funds, can suffer from a phenomenon known as impermanent loss, which is somewhat comparable to the adverse selection risk faced by traditional market makers.
Currently, there is no standardized way to aggregate impermanent loss across multiple wallets or apply stochastic modelling that might indicate the future risk of impermanent loss. Ultimately, effective portfolio management will depend on this kind of feature being available in a standardized way.
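For context on the phenomenon itself: in a standard 50/50 constant-product pool, impermanent loss has a well-known closed form that depends only on how the price has moved relative to the moment of deposit. The function below is a minimal sketch of that textbook formula (the interface is illustrative and was not discussed on the panel); the harder, unsolved problem the panel describes is aggregating this across many wallets and pools and modelling it forward:

```python
import math


def impermanent_loss(price_ratio: float) -> float:
    """Impermanent loss for a 50/50 constant-product AMM pool.

    price_ratio: current price of the volatile asset divided by its
    price at the time of deposit.
    Returns the loss as a (negative) fraction of the value the LP
    would have had by simply holding both assets.
    """
    if price_ratio <= 0:
        raise ValueError("price_ratio must be positive")
    return 2 * math.sqrt(price_ratio) / (1 + price_ratio) - 1


# A 4x price move costs the LP 20% versus holding:
loss = impermanent_loss(4.0)  # → -0.2
```

Note the symmetry: a move to one quarter of the deposit price produces the same loss as a 4x move, and a ratio of 1 (no price change) produces zero loss, which is why fees earned over time can fully offset it if the price round-trips.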
Furthermore, as Rachid pointed out, it is important to be able to recognize the overall balance of a DeFi portfolio, as the benefits accrued over time in fees can provide a critical offset against impermanent loss. However, timing the withdrawal of liquidity is key, which once again illustrates the need for effective modelling and backtesting.
Finding the right tools
The moderator asked the panel whether they had any preferred tools for aggregating digital asset data. The Ledger Prime and GSR representatives both confirmed that their firms were building internal tools, while Rachid highlighted the gap that Nuant aimed to address for digital asset portfolio managers. He explained how, after speaking to many institutional participants, the Nuant team became aware that a common approach was for analysts and portfolio managers to simply feed data from a variety of analytics services such as Messari or Glassnode into a spreadsheet. As such, the idea of Nuant as a comprehensive solution for digital asset portfolio management, with an integrated and curated data & analytics platform presented in a unified dashboard, was born.
Fragmentation vs. Abstraction
The moderator asked the panel about the fragmentation across the space and whether it was likely to continue. The consensus was that the panel did expect fragmentation to continue; however, abstracting away the complexity from end users is a more realistic and achievable goal than attempting to reduce fragmentation or force standardization. Ambre Soubiran also pointed out that operators in the blockchain data sector face a further challenge in keeping pace with the volume of new data and information coming out of new Layer 1 and Layer 2 platforms, as well as the applications that run on them.
Rachid elaborated on the role that Nuant’s proprietary query language could play in helping to create a standardized means of accessing and querying blockchain data across different protocols. Alain Kunz questioned whether there was a possibility that one data provider would come to dominate the market through their ability to query and present data, in a similar way to how Google came to dominate web search and, as such, vast quantities of data.
Rachid countered that blockchain data was still decentralized and public, and therefore, there was little risk of any company having the same degree of control over blockchain data as Google has with web search data. However, he acknowledged that there could be a need for providers to offer transparency into how the data is translated into a particular insight or analytic, perhaps in the form of cryptographic proof.
Ambre Soubiran raised the point of regulation, highlighting that in the traditional markets, data providers are regulated thanks to the role they play in presenting information and, thus, driving financial decisions.
The final question to the panel concerned the prevailing perception in traditional finance that digital assets are purely speculative, with no inherent value. The panel consensus was that the price of tokens is not necessarily an indicator of the underlying value in the digital asset space. Price volatility aside, the overall direction of the sector is towards growth, development, and adoption, which is where the long-term value lies. Overall, there is increasing recognition of this view among those hailing from traditional finance.
You can watch this video and the other presentations from DAS: London here.