
News
Announcing Chorus One's Integration with Paladin: Reshaping MEV on Solana
Chorus One is rolling out Paladin on one of our Solana validators, bringing better MEV rewards to our delegators.
September 10, 2024
5 min read
Introduction: The MEV landscape on Solana

Maximal Extractable Value (MEV) is a critical force on blockchains, particularly on networks like Ethereum and Solana. With sub-second block times and high throughput, Solana faces unique challenges and opportunities in the MEV space. Unlike Ethereum's block-building marketplace model, Solana's mempool-less architecture has led to a different MEV extraction dynamic, characterized by high-speed competition and potential network congestion.

Solana's unique features, including Gulf Stream for mempool-less transaction forwarding, have enabled remarkable speed and efficiency. However, these same features have also created an MEV landscape that requires innovative approaches.

Current trends in Solana's MEV approach

The current methods of MEV extraction on Solana have several drawbacks. Searchers competing on latency often flood the network with duplicate transactions to ensure MEV capture, leading to periods of intense congestion and degraded transaction processing for all users.

The winner-takes-all nature of Solana MEV opportunities results in a high rate of failed transactions. These failed transactions still consume compute resources and network bandwidth. Studies have shown that up to 75% of transactions interacting with DEX aggregators can fail during periods of high activity.

Moreover, the concentration of MEV capture among a few players threatens network decentralization as these entities accumulate more resources and influence. On Ethereum, the use of external searchers and block builders has led to private order flow deals and extreme centralization: a single builder creates over 50% of Ethereum blocks, two builders are responsible for 95%, and four entities build 99% of all blocks.

Paladin: A new approach to tackling bad MEV on Solana

Paladin introduces a solution to address these issues. It consists of two main components:

  1. An open-source MEV bot, and
  2. A token to capture and distribute MEV rewards among validators and stakers.

The Paladin Bot

The Paladin bot is a high-speed, open-source arb bot that runs locally on validators. It works only when the validator is the leader and is integrated with the Jito client. By running directly on the validator, it captures all riskless and straightforward MEV opportunities (e.g., atomic arbitrage, CeFi/DeFi arbitrage) faster than searchers, without needing to outsource these opportunities and pay around 20% of the MEV to external entities. Any unsupported or more advanced MEV strategies that the Paladin bot doesn’t recognize can still be captured by the Jito auction, making it a net positive for the ecosystem.

The bot listens to state updates from Solana’s Geyser interface, enabling real-time opportunity detection. Validators can choose which tokens and protocols to interact with, allowing more conservative validators to alleviate legal concerns about interacting directly with tokens they deem securities.
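
To make the core idea concrete, here is a minimal sketch of how a validator-local bot might detect an atomic arbitrage between two constant-product pools after a state update. This illustrates the concept only and is not Paladin's actual implementation; the pool structures, fee value, and all numbers are assumptions.

```python
# Toy detection of an atomic arbitrage between two constant-product (x*y = k)
# pools. Illustrative only: Paladin's real bot consumes Geyser account updates
# and builds Solana transactions; everything here is a simplified assumption.

def spot_price(base_reserve: float, quote_reserve: float) -> float:
    """Marginal price of the base token, in quote tokens."""
    return quote_reserve / base_reserve

def arb_profit(pool_a, pool_b, quote_in: float, fee: float = 0.003) -> float:
    """Profit (in quote tokens) of buying base on pool_a, selling on pool_b."""
    ax, ay = pool_a  # (base reserve, quote reserve)
    bx, by = pool_b
    # Swap quote -> base on pool A (constant product, with LP fee).
    eff_in = quote_in * (1 - fee)
    base_out = ax - (ax * ay) / (ay + eff_in)
    # Swap base -> quote on pool B.
    eff_base = base_out * (1 - fee)
    quote_out = by - (bx * by) / (bx + eff_base)
    return quote_out - quote_in

def on_state_update(pool_a, pool_b) -> None:
    """Invoked on every relevant account update while we are the leader."""
    if spot_price(*pool_a) < spot_price(*pool_b):
        profit = arb_profit(pool_a, pool_b, quote_in=100.0)
        if profit > 0:
            print(f"atomic arb found, est. profit {profit:.4f} quote tokens")

on_state_update(pool_a=(1_000.0, 50_000.0), pool_b=(1_000.0, 52_000.0))
```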

The PAL Token

The PAL token is designed to align the incentives of validators and users and create a robust MEV extraction mechanism. With the entire supply of one billion airdropped at launch, PAL is distributed among validators, their stakers, Solana builders, the team, and a development fund.

(Figure: PAL token distribution. Source: Paladin)

PAL can be staked by validators and their delegators, with rewards proportional to their SOL stake. The token has a unique MEV distribution mechanism: 10% of captured MEV is funneled to PAL token holders, with 97.5% of that going back to validators and their stakers. A majority of staked PAL can vote to slash the staked PAL of validators who engage in dishonest actions, such as running closed-source modifications of Paladin instead of adhering to the "just run Paladin" principle.
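
As a worked example of the split described above, using the stated 10% and 97.5% parameters (the stake figures are invented for illustration):

```python
# Worked example of the PAL reward split: 10% of captured MEV flows to the
# PAL tranche, and 97.5% of that tranche goes to validators and their stakers
# pro rata to SOL stake. Stake figures below are illustrative assumptions.

def distribute_mev(captured_mev_sol: float, sol_stake: dict) -> dict:
    pal_tranche = captured_mev_sol * 0.10       # 10% of captured MEV
    to_validators = pal_tranche * 0.975         # 97.5% of the PAL tranche
    total_stake = sum(sol_stake.values())
    return {v: to_validators * s / total_stake for v, s in sol_stake.items()}

rewards = distribute_mev(1_000.0, {"validator_a": 600_000, "validator_b": 400_000})
print(rewards)  # approx. {'validator_a': 58.5, 'validator_b': 39.0} (in SOL)
```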

How Paladin Works: A Technical Deep Dive

Paladin's Key Principles and Dynamics

Paladin's design creates dynamics that contribute to its sustainability. The "Pack of Wolves" dynamic incentivizes validators to "run with the pack" by honestly running Paladin. Going against the pack risks slashing and loss of rewards. This creates a self-reinforcing system of honest behavior.

As more validators run Paladin, a flywheel effect is created. More MEV is funneled to PAL holders, increasing the value of PAL and further incentivizing participation. This alignment of long-term interests incentivizes validators to behave honestly rather than pursue short-term gains through harmful practices like frontrunning.

Moreover, by allowing all validators to participate in MEV extraction, Paladin prevents centralization while still allowing searchers to implement more specialized strategies. The bot's open-source nature and transparent reward distribution create a fairer MEV landscape, benefiting the entire Solana ecosystem.

Chorus One's Integration with Paladin

At Chorus One, we recognize Paladin's transformative potential. We've taken the proactive step of integrating Paladin into one of our Solana validators: the Chorus One Palidator.

Breaking Bots - our proof-of-concept to capture MEV on Solana

If you have been following Chorus One, you'll know we have a deep interest in MEV. Almost two years ago, we open-sourced our proof-of-concept, ‘Breaking Bots’, to capture MEV on Solana efficiently and ethically. Paladin’s proposition is similar in spirit but takes a different approach with the PAL token, which was not part of our proof-of-concept.

Conclusion: Shaping a Better Future for Solana

The integration of Paladin with our validator is a significant step in addressing the challenges of MEV on Solana. We invite Solana stakers to join us in this effort by delegating to our Palidator. Let’s move towards a model that benefits all participants rather than a select few.

As the MEV landscape evolves, Chorus One is committed to exploring and implementing solutions that benefit our delegators and the wider Solana community.

Additional resources on Solana by Chorus One:

Blog articles

https://chorus.one/articles/metrics-that-matter

https://chorus.one/articles/solana-mev-client-an-alternative-way-to-capture-mev-on-solana

https://chorus.one/articles/solana-validator-economics

https://chorus.one/articles/analyzing-mev-instances-on-solana-part-3

https://chorus.one/articles/analyzing-mev-instances-on-solana-part-2

https://chorus.one/articles/analyzing-mev-instances-on-solana-part-1

Podcasts

Solana's Next Big Moves: From Memecoins to Staking—What's Coming Next?

Exploring Marinade V2 and the state of Solana Staking

About Chorus One

Chorus One is one of the largest institutional staking providers globally, operating infrastructure for over 60 Proof-of-Stake (PoS) networks, including Ethereum, Cosmos, Solana, Avalanche, Near, and others. Since 2018, we have been at the forefront of the PoS industry, offering easy-to-use, enterprise-grade staking solutions, conducting industry-leading research, and investing in innovative protocols through Chorus One Ventures. As an ISO 27001 certified provider, Chorus One also offers slashing and double-signing insurance to its institutional clients. For more information, visit chorus.one or follow us on LinkedIn, X (formerly Twitter), and Telegram.

Networks
Nillion: Redefining Data Privacy in the Age of AI
A deep-dive into current challenges surrounding private data exchange and how Nillion addresses these issues
September 9, 2024
5 min read

The rapid expansion of AI-driven applications and platforms in 2024 has revolutionized everything from email composition to the rise of virtual influencers. AI has permeated countless aspects of our daily lives, offering unprecedented convenience and capabilities. However, with this explosive growth comes an increasingly urgent question: How can we enjoy the benefits of AI without compromising our privacy? This concern extends beyond AI to other domains where sensitive data exchange is critical, such as healthcare, identity verification, and trading. While privacy is often viewed as an impediment to these use cases, Nillion posits that it can actually be an enabler. In this article, we'll delve into the current challenges surrounding private data exchange, how Nillion addresses these issues, and explore the potential it unlocks.

The Value of Data and the Privacy Paradox

Privacy in blockchain technology is not a novel concept. Over the years, several protocols have emerged, offering solutions like private transactions and obfuscation of user identities. However, privacy extends far beyond financial transactions. It could be argued that privacy has the potential to unlock a multitude of non-financial use cases—if only we could compute on private data without compromising its confidentiality. Feeding private data into generative AI platforms or allowing them to train on user-generated content raises significant privacy concerns.

Data Categories and Privacy Concerns

Every day, we unknowingly share fragments of our data through various channels. This data can be categorized into three broad types:

  • Public Data: Instagram posts, blogs, tweets, Google reviews, Reddit comments, real estate listings.
  • Partially Private Data: Blockchain transactions, deleted tweets, search history, advertising cookies.
  • Private Data: Transaction data, text messages, voicemails, medical records, personal photos, location data.

The publicly shared data has fueled the growth of social media and the internet, generating billions of dollars in economic value and creating jobs. Companies have capitalized on this data to improve algorithms and enhance targeted advertising, leading to a concentration of data within a few powerful entities, as evidenced by scandals like Cambridge Analytica. Users, often unaware of the implications, continue to feed these data monopolies, further entrenching their dominance. With the rise of AI wearables, the potential for privacy invasion only increases.

As awareness of the importance of privacy grows, it becomes clear that while people are generally comfortable with their data being used, they want its contents to remain confidential. This desire for privacy presents a significant challenge: how can we allow services to use data without revealing the underlying information? Traditional encryption methods require decryption before computation, which introduces security vulnerabilities and increases the risk of data misuse.

Another critical issue is the concentration of sensitive data. Ideally, high-value data should be decentralized to avoid central points of failure, but sharing data across multiple parties or nodes raises concerns about efficiency and consistent security standards.

This is where Nillion comes in. While blockchains have decentralized transactions, Nillion seeks to decentralize high-value data itself.

What is Nillion?

Nillion is a secure computation network designed to decentralize trust for high-value data. It addresses privacy challenges by leveraging Privacy-Enhancing Technologies (PETs), particularly Multi-Party Computation (MPC). These PETs enable users to securely store high-value data on Nillion's peer-to-peer network of nodes and allow computations to be executed on the masked data itself. This approach eliminates the need to decrypt data prior to computation, thereby enhancing the security of sensitive information.
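
To illustrate the MPC principle of computing on masked data, consider additive secret sharing: two secrets can be added node-by-node without any single node ever seeing either input. This is a drastically simplified sketch of the general technique; Nillion's actual protocols are far more sophisticated.

```python
# Minimal sketch of additive secret sharing over a prime field. Each node
# holds a random-looking share; sums can be computed share-wise, and no
# single node learns either secret. Illustrates the MPC principle only.
import secrets

P = 2**61 - 1  # a Mersenne prime field modulus

def share(secret: int, n_nodes: int) -> list[int]:
    """Split a secret into n additive shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_nodes - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % P

a_shares = share(42, n_nodes=3)
b_shares = share(100, n_nodes=3)
# Each node adds its own two shares locally -- no node sees 42 or 100.
sum_shares = [(a + b) % P for a, b in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 142
```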

The Nillion network enables computations on hidden data, unlocking new possibilities across various sectors. Early adopters in the Nillion community are already building tools for private predictive AI, secure storage and compute solutions for healthcare, password management, and trading data. Developers can create applications and services that utilize PETs like MPC to perform blind computations on private user data without revealing it to the network or other users.

The Nillion Network operates through two interdependent layers:

  • Coordination Layer: Governed by the NilChain, a Cosmos-based network that coordinates payments for storage operations and blind computations performed on the network.
  • Orchestration Layer: Powered by Petnet, this layer harnesses PETs like MPC to protect data at rest and enable blind computations on that data.

When decentralized applications (dApps) or other blockchain networks require privacy-enhanced data (e.g., blind computations), they must pay in $NIL, the network's native token. The Coordination Layer's nodes manage the payments between the dApp and the Petnet, while infrastructure providers on the Petnet are rewarded in $NIL for securely storing data and performing computations.
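
The payment flow can be pictured roughly as follows. This is a hypothetical sketch of the token flow just described, not Nillion's API; the equal fee split and all names are assumptions.

```python
# Hypothetical sketch of the $NIL payment flow: a dApp pays for a blind
# computation, the Coordination Layer routes the payment, and Petnet cluster
# providers are rewarded. The equal split and all names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Cluster:
    providers: list[str]
    balances: dict[str, float] = field(default_factory=dict)

    def reward(self, fee_nil: float) -> None:
        per_node = fee_nil / len(self.providers)  # equal split (assumption)
        for p in self.providers:
            self.balances[p] = self.balances.get(p, 0.0) + per_node

def request_blind_compute(cluster: Cluster, fee_nil: float) -> None:
    # Coordination Layer: verify the $NIL payment, then route work and reward.
    cluster.reward(fee_nil)

cluster = Cluster(providers=["node-1", "node-2", "node-3"])
request_blind_compute(cluster, fee_nil=9.0)
print(cluster.balances)  # {'node-1': 3.0, 'node-2': 3.0, 'node-3': 3.0}
```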

The Coordination Layer functions as a Cosmos chain, with infrastructure providers staking $NIL to secure the network, just like in other Cosmos-based chains. This dual-layer architecture ensures that Nillion can scale effectively while maintaining robust security and privacy standards.

Clustering on the Petnet

At the heart of Nillion's architecture is the concept of clustering. Each cluster consists of a variable number of nodes tailored to meet specific security, cost, and performance requirements. Unlike traditional blockchains, Nillion's compute network does not rely on a global shared state, allowing it to scale both vertically and horizontally. As demand for storage or compute power increases, clusters can scale up their infrastructure or new clusters of nodes can be added.

Clusters can be specialized to handle different types of requests, such as provisioning large amounts of storage for secrets or utilizing specific hardware to accelerate particular computations. This flexibility enables the Nillion network to adapt to various use cases and workloads.

The Role of $NIL

$NIL is the governance and staking token of the Nillion network, playing a crucial role in securing and managing the network. Its primary functions include:

  1. Securing the Coordination Layer: Staking $NIL accrues voting power, which is used to secure the network and determine the active set of validators through a Delegated Proof of Stake mechanism.
  2. Managing Network Resources: Users pay $NIL tokens to access the Coordination Layer or request blind computations, enabling efficient resource management.
  3. Economics of Petnet Clusters: Infrastructure providers earn $NIL for facilitating blind computations and securely storing data.
  4. Network Governance: $NIL holders can stake their tokens to vote on on-chain proposals within the Coordination Layer or delegate their voting power to others.
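
As a rough illustration of the staking mechanics in point 1, here is a generic Delegated Proof of Stake sketch; the parameters are invented and not Nillion's actual chain configuration.

```python
# Generic DPoS sketch: voting power is own stake plus delegations, and the
# active set is the top-N validators by voting power. Numbers are examples.

def voting_power(own_stake: float, delegations: list[float]) -> float:
    return own_stake + sum(delegations)

def active_set(powers: dict[str, float], max_validators: int) -> list[str]:
    ranked = sorted(powers, key=powers.get, reverse=True)
    return ranked[:max_validators]

powers = {
    "val-a": voting_power(1_000, [5_000, 2_000]),  # 8,000
    "val-b": voting_power(500, [9_000]),           # 9,500
    "val-c": voting_power(300, [100]),             # 400
}
print(active_set(powers, max_validators=2))  # ['val-b', 'val-a']
```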

Use Cases for Nillion

Nillion's advanced data privacy capabilities open up a wide range of potential use cases, both within and beyond the crypto space:

  • Private Order Books: A privacy-enhanced order book could mitigate the effects of Maximal Extractable Value (MEV) and reduce front-running in DeFi.
  • Governance: Decentralized Autonomous Organizations (DAOs) and delegators could benefit from provable privacy for their votes.
  • Messaging: On-chain messaging, particularly in decentralized social media, could be a significant use case with Nillion's privacy features.
  • Decentralized Storage: Storing sensitive documents or information in a centralized entity carries risks. Nillion's decentralized infrastructure with complete encryption could transform how such data is managed.
  • Medical Data: Privacy-enhanced infrastructure could streamline the storage, transfer, and usage of medical data, ensuring confidentiality.
  • Advertising: Advertisers currently exploit user data for behavioral trends without compensating the data providers. Nillion's privacy solutions could create a more equitable model.

Testnet and Future Prospects

Nillion is currently in the testnet phase, having recently completed its incentivized Genesis Sprint. The network is now running the Catalyst Convergence phase, which aims to seamlessly integrate the Petnet with the Coordination Layer. Nillion also recently announced a partnership with the leading Layer 2 network Arbitrum. The tie-up will enable apps on Nillion to tap into Ethereum’s security for settlement and bring Nillion’s privacy-preserving data processing and storage to Arbitrum dApps.

Staking $NIL with Chorus One

Chorus One is actively collaborating with Nillion and will support $NIL staking when the network launches its mainnet. For those interested in learning more or participating in the network, reach out to us for further information.

Core Research
A primer on proposer preconfirms
We explore what preconfirmations are, why they matter, and how they’re set to transform the blockchain landscape.
September 9, 2024
5 min read

In the blockchain industry, where the balance between decentralization and efficiency often teeters on a knife's edge, innovations that address these challenges are paramount. Among these innovations, preconfirmations stand out as a powerful tool designed to enhance transaction speed, security, and reliability. Here, we’ll delve into what preconfirmations (henceforth referred to as “preconfirms”) are, why they matter, and how they’re set to transform the blockchain landscape.

Preconfirms are not a new concept.

The idea of providing a credible heads-up or confirmation that a transaction has occurred is deeply ingrained in our daily lives. Whether it's receiving an order confirmation from Amazon, verifying a credit card payment, or processing transactions in blockchain networks, this concept is familiar and widely used. In the blockchain world, centralized sequencers like those in Arbitrum function similarly, offering guarantees that your transaction will be included in the block.

However, these guarantees are not without limitations. True finality is only achieved when the transaction is settled on Ethereum. The reliance on centralized sequencers in Layer 2 (L2) networks, which are responsible for verifying, ordering, and batching transactions before they are committed to the main blockchain (Layer 1), presents significant challenges. They can become single points of failure, leading to increased risks of transaction censorship and bottlenecks in the process.

This is where preconfirms come into play. Preconfirms were introduced to address these challenges, providing a more secure and efficient way to ensure transaction integrity in decentralized networks.

Builders, Sequencers, Proposers: Who’s Who

Before jumping into the preconfirms trenches, let’s start by clarifying some key terms that will appear throughout this article (and are essential to the broader topic).

Builders: In the context of Ethereum and PBS, builders are responsible for selecting and ordering transactions in a block. This is a specialized role aimed at creating blocks with the highest value for the proposer, and block building is highly concentrated among a few entities. Blocks are submitted to relays, which act as mediators between builders and proposers.

Proposers: The role of the proposer is to validate the contents of the most valuable block submitted by the block builders, and to propose this block to the network to be included as the new head of the blockchain. In this landscape, proposers are the validators in the Proof-of-Stake consensus protocol, and they are rewarded for proposing blocks (the winning builder also pays a fee to the proposer).

Sequencers: Sequencers are akin to air traffic controllers, particularly within Layer 2 Rollup networks. They are responsible for coordinating and ordering transactions between the Rollup and the Layer 1 chain (such as Ethereum) for final settlement. Because they have exclusive rights to the ordering of transactions, they also benefit from transaction fees and MEV. Usually, they have ZK or optimistic security guarantees.
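
Putting these roles together, the PBS pipeline can be caricatured in a few lines. This is a deliberately simplified sketch; the real protocol involves signed, blinded headers and relay attestations, and all bid values below are invented.

```python
# Toy sketch of the PBS roles defined above: builders submit (block, bid)
# pairs to relays, and the proposer takes the highest-bidding block.
# Simplified; the real flow uses signed headers and blinded blocks.

def relay_select(bids: dict[str, float]) -> tuple[str, float]:
    """Relay forwards its most valuable block to the proposer."""
    builder = max(bids, key=bids.get)
    return builder, bids[builder]

def proposer_choose(relays: list[dict[str, float]]) -> tuple[str, float]:
    """Proposer takes the best offer across all relays it is connected to."""
    return max((relay_select(r) for r in relays), key=lambda x: x[1])

relay_1 = {"builder-a": 0.41, "builder-b": 0.38}  # bids in ETH (invented)
relay_2 = {"builder-c": 0.44}
print(proposer_choose([relay_1, relay_2]))        # ('builder-c', 0.44)
```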

The solution: Preconfirmations

Now that we’ve set the stage, let’s dive into the concept of preconfirms.

At their core, preconfirms can provide two guarantees:

  • Inclusion Guarantees: Assurance that a transaction will be included in the next block.
  • Execution Guarantees: Assurance that a transaction will successfully execute, especially in competitive environments where multiple users are vying for the same resources, such as in trading scenarios.
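
In practice, a preconfirmation is a signed, accountable promise. The sketch below shows the rough shape of such a commitment for the inclusion guarantee; the field names and the HMAC stand-in for a real BLS/ECDSA signature scheme are assumptions for illustration.

```python
# Minimal sketch of an inclusion preconfirmation: a signed promise that a
# transaction hash will appear in a given slot. HMAC stands in for a real
# signature scheme; all field names are illustrative assumptions.
import hashlib, hmac, json

PRECONFER_KEY = b"validator-secret"  # stand-in for a real signing key

def issue_preconf(tx_hash: str, slot: int, guarantee: str = "inclusion") -> dict:
    payload = {"tx_hash": tx_hash, "slot": slot, "guarantee": guarantee}
    msg = json.dumps(payload, sort_keys=True).encode()
    payload["sig"] = hmac.new(PRECONFER_KEY, msg, hashlib.sha256).hexdigest()
    return payload

def verify_preconf(preconf: dict) -> bool:
    body = {k: v for k, v in preconf.items() if k != "sig"}
    msg = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(PRECONFER_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, preconf["sig"])

p = issue_preconf("0xabc123", slot=9_500_000)
assert verify_preconf(p)  # the user can hold the preconfer to this promise
```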

These two guarantees matter, particularly for:

Speed: Traditional block confirmations can take several seconds, whereas preconfirms can provide a credible assurance much faster. This speed is particularly beneficial for "based rollups" that batch user transactions and commit them to Ethereum, resulting in faster transaction confirmations. @taikoxyz and @Spire_Labs are teams building based rollups.

Censorship Resistance: A proposer can request the inclusion of a transaction that some builders might not want to include.

Trading Use Cases: Traders may preconfirm transactions if it allows them to execute ahead of competitors.

Preconfirmations on Ethereum: A Closer Look

Now, zooming in on Ethereum.

(Figure: the overall Proposer-Builder Separation and transaction pipeline on Ethereum.)

Within the Ethereum network, preconfirms can be implemented in several distinct scenarios, depending on the specific needs of the network:

  1. Builder-issued Preconfirms

Builder preconfirms suit the trading use case best. These offer low-latency guarantees and are effective in networks where a small number of builders dominate block-building. Builders can opt into proposer support, which enhances the strength of the guarantee.

However, since there are only a few dominant builders, successfully onboarding these players is key.

  2. Proposer-issued Preconfirms

Proposers provide stronger inclusion guarantees than builders because they have the final say on which transactions are included in the block. This method is particularly useful for "based rollups," where Layer 1 validators act as sequencers.

Yet maintaining strong guarantees is a key challenge for proposer preconfirms.

The question of which solution will ultimately win remains uncertain, as multiple factors will play a crucial role in determining the outcome. We can speculate on the success of builder opt-ins for builder preconfirms, the growing traction of based rollups, and the effectiveness of proposer declaration implementations. The balance between user demand for inclusion versus execution guarantees will also be pivotal. Furthermore, the introduction of multiple concurrent proposers on the Ethereum roadmap could significantly impact the direction of transaction confirmation solutions. Ultimately, the interplay of these elements will shape the future landscape of blockchain transaction processing.

Commit-Boost

Commit-Boost is an MEV-Boost-like sidecar for preconfirms.

Commit-Boost facilitates communication between builders and proposers, enhancing the preconfirmation process. It’s designed to replace the existing MEV-Boost infrastructure, addressing performance issues and extending its capabilities to include preconfirms.

Currently in testnet, Commit-Boost is being developed as non-venture-backed, neutral software for Ethereum, with the ambition of fully integrating preconfirms into its framework. Chorus One is currently running Commit-Boost on testnet.

Recap - The preconfirmation design space
  1. Who chooses which transactions to preconfirm.
    • This could be the builder, the proposer, or a sophisticated third party (“a gateway”) chosen by the proposer.
  2. Where in the block the preconfirmed transactions are included.
    • Granular control over placement can be interesting for traders even without execution preconfs.
  3. Whether only inclusion or additionally execution is guaranteed.
    • Without an execution guarantee, an included transaction could still fail, e.g. if it tries to trade on an opportunity that has disappeared.
  4. How much and what kind of collateral the builder or proposer puts up.
    • Preconfers must be disincentivized from reneging on their promised preconfs for these to be credible.
    • E.g. this could be a Symbiotic or EigenLayer service, and proposed collateral requirements range from 1 ETH to 1000 ETH.
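
The credibility condition in point 4 reduces to a simple inequality: reneging must cost more than it can earn. A minimal sketch, with invented numbers:

```python
# Sketch of the credibility condition: a preconf promise is only credible if
# the slashable collateral exceeds what the preconfer could gain by breaking
# it. Values are invented; proposals range from 1 ETH to 1000 ETH.

def preconf_is_credible(collateral_eth: float, renege_gain_eth: float) -> bool:
    """Reneging must cost (via slashing) more than it can earn."""
    return collateral_eth > renege_gain_eth

print(preconf_is_credible(collateral_eth=1.0, renege_gain_eth=2.5))     # False
print(preconf_is_credible(collateral_eth=1000.0, renege_gain_eth=2.5))  # True
```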

Final Word

Chorus One has been deeply involved with preconfirms from the very beginning, pioneering some of the first-ever preconfirms using Bolt during the ZuBerlin and Helder testnets. We’re fully immersed in optimizing the Proposer-Builder Separation (PBS) pipeline and are excited about the major developments currently unfolding in this space. Stay tuned for an upcoming special episode of the Chorus One Podcast, where we’ll dive more into this topic.

If you’re interested in learning more, feel free to reach out to us at research@chorus.one.


Core Research
An introduction to oracle extractable value (OEV)
This is a joint research article written by Chorus One and Superscrypt, explaining OEV, and how it can be best captured.
August 30, 2024
5 min read

This is a joint research article written by Chorus One and Superscrypt

Blockchain transactions are public and viewable even before they get written to the block. This has led to maximal extractable value (‘MEV’): actors frontrunning and backrunning visible transactions to extract profit for themselves.

The MEV space is constantly evolving as competition intensifies and new avenues to extract value keep emerging. In this article we explore one such avenue, Oracle Extractable Value, where MEV can be extracted even before transactions hit the mempool.

This is particularly relevant for borrowing & lending protocols which rely on data feeds from oracles to make decisions on whether to liquidate positions or not. Read on to find out more.

Introduction

Value is in a constant state of being created, destroyed, won, or lost in any financialized system, and blockchains are no exception. User transactions are not isolated from their surroundings, but instead embedded within complex interactions that determine their final payoff.

Not all transaction costs are as explicit as gas fees. Fundamentally, the total value that can be captured from a transaction includes the payoffs of trades preceding or succeeding it. These can be benign in nature (for example, an arbitrage transaction bringing prices back in line with the market) or impose hidden taxes, as in the case of frontrunning. Overall, maximal extractable value (or “MEV”) is the value that can be captured by strategically including and ordering transactions such that the aggregate block value is maximized.
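
That definition can be made concrete with a toy brute-force search over orderings. The position-dependent payoffs below are invented solely to show that the ordering itself carries value:

```python
# Brute-force illustration of the MEV definition: enumerate all orderings of
# a small transaction set and keep the one with the highest aggregate value.
# Payoffs are toy numbers keyed by (transaction, slot in block).
from itertools import permutations

payoff = {
    ("user_swap", 0): 0.0, ("user_swap", 1): 0.0, ("user_swap", 2): 0.0,
    ("frontrun",  0): 0.2, ("frontrun",  1): 0.0, ("frontrun",  2): 0.0,
    ("backrun",   0): 0.0, ("backrun",   1): 0.1, ("backrun",   2): 0.3,
}

def block_value(ordering) -> float:
    return sum(payoff[(tx, slot)] for slot, tx in enumerate(ordering))

best = max(permutations(["user_swap", "frontrun", "backrun"]), key=block_value)
print(best, block_value(best))  # the 'sandwich' ordering maximizes block value
```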

If not extracted or monetized, value is simply lost. Presently, the actualization of MEV on Ethereum reflects a complex supply chain (“PBS”) where several actors such as wallets, searchers, block builders and validators fill specialized roles. There are returns on sophistication for all participants in this value chain, most explicitly for builders which are tasked with creating optimal blocks. Validators can play sophisticated timing games which result in additional MEV capture; for example, Chorus One has run an advanced timing games setup since early 2023, and published extensively on it. In the PBS context, the best proxy for the total MEV extracted is the final bid a builder gets to submit during the block auction.

Such returns on sophistication extend to the concept of Oracle Extractable Value (OEV), which is a type of MEV that has historically gone uncaptured by protocols. This article will explain OEV, and how it can be best captured.

Oracles

Oracles are one of crypto's critical infrastructure components: they are the choreographers that orchestrate and synchronize the off-chain world with the blockchain’s immutable ledger. Their influence is immense: they inform all the prices you see and interact with on-chain. Markets are constantly changing, and protocols and applications rely on secure oracle feed updates to provide DeFi services to millions of crypto users worldwide.

The current status-quo is that third-party oracle networks serve as intermediaries that feed external data to smart contracts. They operate separately from the blockchains they serve, which maintains the core goal of chain consensus but introduces some limitations, including concepts such as fair sequencing, required payments from protocols and apps, and multiple sources of data in a decentralized world.

In practical terms, the data from oracles represents a great resource for value extraction. The market shift an oracle price update causes can be anticipated and traded profitably, by back-running any resulting arbitrage opportunities or (more prominently) by capturing resulting liquidations. This is Oracle Extractable Value. But how is it captured, and more importantly, who profits from it?

(Figure: a potential approach to understanding the value in OEV, using Aave data.)

Oracle Extractable Value (OEV)

In MEV, searchers (which are essentially trading bots that run on-chain) profit from oracle updates by backrunning them in a free-for-all priority gas auction. Value is distributed between the searchers, who find opportunities particularly in the lending markets for liquidations, and the block proposers that include their prices in the ledger. Oracles themselves have not historically been a part of this equation.

OEV changes this flow by atomically coupling the backrun trade with the oracle update. This allows the oracle to capture value, by either acting as the searcher itself or auctioning off the extraction rights.

(Figure: how OEV created in DeFi can be captured by MEV searchers before the dApp gets access to it.)

OEV primarily impacts lending markets, where liquidations directly result from oracle updates. By bundling an oracle update with a liquidation transaction, the value capture becomes exclusive, preventing front-running since both actions are combined into a single atomic event. However, arbitrage can still occur before the oracle update through statistical methods, as traders act on the true price seen in other markets.
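
A sketch of that atomic coupling, with invented field names and a toy loan-to-value check: the oracle update and the liquidation it unlocks are submitted as one bundle, so nothing can execute between them.

```python
# Toy sketch of OEV bundling: couple the oracle update with the liquidation
# it triggers so they execute atomically. Field names, the LTV check, and
# all numbers are illustrative assumptions.

def make_oev_bundle(new_price: float, position: dict) -> list:
    """Return [oracle_update, liquidation] if the update makes the position
    liquidatable; otherwise publish the price update alone."""
    ltv = position["debt"] / (position["collateral"] * new_price)
    update_tx = {"kind": "oracle_update", "price": new_price}
    if ltv <= position["max_ltv"]:
        return [update_tx]
    liq_tx = {"kind": "liquidate", "target": position["owner"]}
    return [update_tx, liq_tx]  # executed atomically, in this exact order

pos = {"owner": "0xdead", "collateral": 10.0, "debt": 18_000.0, "max_ltv": 0.8}
print(make_oev_bundle(new_price=2_000.0, position=pos))  # update + liquidation
```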

Current landscape

UMA and Oval:

  • UMA has developed a middleware product called Oval (in collaboration with Flashbots), which aims to redistribute value more fairly within the DeFi space.
  • Oval works by wrapping data and conducting an order flow auction where participants bid for the right to use the data, with proceeds shared among protocols like Aave, UMA, and Chainlink.
  • This means that Oval inserts an auction mechanism and lets the market decide what a particular price update is worth.
  • This system helps DeFi protocols like Aave capture value that would otherwise go to liquidators or validators, potentially increasing their revenue.
  • Recently, Oval announced they had successfully completed the “world’s first OEV capture”, through a series of liquidations on the platform Morpho Labs. They even claim a 20% APY boost on some pairs on Morpho.

API3 and OEV Network:

  • API3 launched the OEV Network as an L2 solution, which uses ZK-rollups to capture and redistribute OEV within the DeFi ecosystem.
  • The network functions as an order flow auction platform where the rights to execute specific data feed updates are sold to the highest bidder.
  • This is a different extraction mechanism, as it turns the fixed liquidation bonus into a dynamic market-driven variable through competition.
  • This approach aims to enhance the revenue streams of DeFi protocols and promote a more balanced ecosystem for data providers and users.
  • API3’s solution also incentivizes API providers by distributing a portion of the captured OEV, thus encouraging direct participation and somewhat disrupting the dominance of third-party oracles.

Warlock

  • Warlock is an upcoming OEV solution that will combine an oracle update sourced from multiple nodes with centralized backrun transactions.
  • The oracle update will feature increasing ZK trust guarantees over time, starting with computation consistency across oracle nodes.
  • Centralizing the backrun allows for lower latency updates, precludes searcher congestion, and protects against information leakage as the searcher entity retains exclusivity, i.e. does not need to obscure alpha. Warlock will service liquidations with internal inventory.
  • The upshot is that lending markets can offer more margin due to less volatility exposure via lower latency. The relative upside will scale with the sophistication of the searcher entity and the impact of congestion on auction-type OEV.
  • Overall, the Warlock team estimates that a 10-20% upside will accrue to lending markets initially, with further upside as value capture improves.

Where could this go?

The upshot of this MEV capture is that oracles have a new dimension to compete on. OEV revenue can be shared with dApps by providing oracle updates free of charge, or by outright subsidizing integrations. Ultimately, protocols with OEV integration will thus be able to bid more competitively for users.

OEV solutions share the same basic idea - shifting the value extraction from oracle updates to the oracle layer, by coupling the price feed update with backrun searcher transactions.

There are several ways of approaching this: an OEV solution may integrate with an existing oracle via an official integration, or through third-party infrastructure. These solutions may also be purpose-built and provide their own price update.

Heuristically, the key components of an OEV solution are the oracle update and the MEV transaction - these can be either centralized or decentralized.

We would expect purpose-built or “official” extensions to existing oracles to perform better, since they avoid the added latency of running third-party logic on top of the upstream oracle. They are also much more attractive from a risk perspective: with third-party infrastructure, upstream updates could spontaneously break integrations.

The practical case is that a centralized auction can make the most sense in latency-sensitive use cases. For example, it may allow a protocol to offer more leverage, as the risk of being stranded with bad debt due to stale price updates is minimized. By contrast, a decentralized auction likely yields the highest aggregate value in use cases where latency is less sensitive, i.e. where margin requirements are higher.

Mechanisms and Implications of OEV
  1. Atomic Liquidations
    • Several actors in the network supply chain can benefit from the information advantage they possess.
    • Entities with privileged access to oracle data can leverage this information for liquidation or arbitrage.
    • This can create unfair advantages and centralize power among those with early data access.
  2. A new dimension to compete on
    • OEV can lead to substantial profit opportunities, with estimated profits in the millions of dollars. This is especially true in highly volatile markets.
    • OEV enables oracles to distribute atomic backrun rights to searchers, capturing significant value.
    • Ecosystems that distribute value in proportion to the contributions (of users, developers, and validators) are likely to thrive.
  3. Potential Risks and Concerns
    • If not managed properly, OEV can undermine the fairness and integrity of decentralized systems. Although the core role of the oracle remains the same, OEV opens the door to competition on the value oracles can extract and pass on to dApps.
    • Some oracles like Chainlink have moved to reduce OEV and mitigate its impact, by refusing to endorse any third-party OEV solution. However, canonical OEV integrations are important as third party integrations bring idiosyncratic risk.
    • In traditional finance, market makers currently make all of the money from order flow. In crypto, there is a chance that value can be shared with users.
  4. Mitigation Strategies
    • Decentralization of Oracles: Using multiple independent oracles to aggregate data can reduce the risk of any single point of control.
    • Cryptographic Techniques: Techniques like zero-knowledge proofs can help ensure data integrity and fair dissemination without revealing the actual data prematurely.
    • Incentive Structures: Designing incentive structures that discourage exploitative behavior and promote fair access to data. Ultimately, the goal is a competitive market between oracles, where they compete with how much value can pass downstream.

Key Insights
  • Revenue Enhancement: By capturing OEV, projects can significantly enhance the revenue streams for DeFi protocols. For example, UMA’s Oval estimates that Aave missed out on about $62 million in revenue over three years due to not capturing OEV. By enabling these protocols to capture such value, they can reduce unnecessary payouts to liquidators and validators, redirecting this value to improve their own financial health.
  • Decentralization and Security: API3’s use of ZK-rollups and the integration with Polygon CDK provides a robust, secure, and scalable solution for capturing OEV. This approach not only ensures transparency and accountability but also aligns with the principles of decentralization by preventing a single point of failure and enabling more participants to benefit from the system. An aspect of this is also addressed by oracle-agnostic solutions and order flow auctions.
  • Incentives for API Providers: Both API3 and UMA’s solutions include mechanisms to incentivize API providers. API3, in particular, allows API providers to claim ownership of their data in Web3, providing a viable business model that promotes direct participation and reduces reliance on third-party oracles.
  • Impact on Users and Developers: For users and developers of DeFi applications, these innovations should be largely invisible yet beneficial. They help ensure that DeFi protocols operate more efficiently and profitably, potentially leading to lower costs and better services for end-users.
  • Adoption by Oracles and Protocols: Ultimately, oracles have a part to play in the expansion and acceleration of OEV extraction, either themselves or, more realistically, by partnering with third-party solutions. In recent weeks, UMA has launched OEV capture for RedStone oracle feeds, whilst Pyth Network announced a pilot for a new OEV capture solution. Protocols will also want to strike a balance between a new revenue stream (for the protocol, liquidity pools, liquidity providers…) and the negative externalities for their user base.

OEV is still in its early stages, with much development ahead. We're excited to see how this space evolves and will continue to monitor its progress closely as new opportunities and innovations emerge.

