Blog

Networks
Opinion
Core Research
A Deep-Dive into Saga
We take an in-depth look at Saga, a protocol that lets you create your own dedicated blockspace in minutes
September 5, 2023
5 min read
Introduction

Web3 founders face a crucial decision when launching their product. If they want to avoid the Layer 2 route due to concerns surrounding centralized sequencers and multisig bridges, they must choose between two main paths: developing their product as a smart contract and deploying it on an existing Layer 1 blockchain, or taking the ambitious route of creating their own blockchain from scratch. The former comes with several advantages, notably removing the complexities of infrastructure management, ensuring a decentralized foundation, and leveraging the network effect inherent in the underlying blockchain.

Yet, opting for a smart contract deployment is not without tradeoffs. It leads to competition for block space, resulting in a worse user experience characterized by inflated gas costs and transaction fees, along with slower or delayed transaction execution during periods of congestion. The immutability of smart contracts can also be restrictive, offering the protocol little flexibility in the case of critical bugs or hacks. The smart contract approach also lacks sovereignty, as the protocol remains subject to the rules of the hosting blockchain.

One solution that has gained popularity in the last two years to address the challenges of the smart contract approach is the appchain thesis, which was pioneered by Cosmos and followed by Polkadot. The idea behind this model is to build a dedicated blockchain for one application. Compared to the smart-contract solution, this model offers sovereignty and full customizability from the blockchain to the application. It also enhances performance and scalability since the application has its own blockspace. This leads to increased opportunities for the token to capture value, such as MEV, as Osmosis does, in addition to capturing other network fees.

Certainly, this solution involves several important factors to consider. It requires the management of the chain's infrastructure, ensuring its own security, attracting validators, and designing a tokenomics model that aligns the interests of validators, stakers, and app users.

What if we could easily launch an application, similar to deploying a smart contract, and gain the benefits of an appchain, all without any initial investment or extensive effort? This is exactly what Saga's value proposition is about.

Saga’s value proposition and architecture

The Saga protocol functions like application-specific blockchains as a service. In other words, Saga is a blockchain used to easily launch other blockchains, called “Chainlets” in the Saga ecosystem. Chainlets are secured by the Saga blockchain and its validators through a mechanism called Interchain Security, a well-known shared-security system in Cosmos.

Interchain security means that one blockchain, in this case Saga, acts as a provider of security for other blockchains, in this case the Chainlets. As a result, the Chainlets inherit the benefits of running a Cosmos SDK appchain but outsource their block validation and validator set to Saga.

Therefore, a Chainlet is a sovereign blockchain that has the same level of security and decentralization as Saga.

Saga introduces an easy, decentralized, and secure approach to deploying application-specific blockchains. This solution also grants developers the autonomy to choose their preferred Virtual Machine (VM), with initial support for the Ethereum Virtual Machine (EVM).

In the long run, Chainlets aim to be VM-agnostic, meaning developers would have the flexibility to choose from a variety of virtual machines, including the EVM, CosmWasm, or the JavaScript VM, for example.

Different examples of Chainlets

How to launch your own Chainlet

The way Chainlets are created differs slightly from what we can observe on the Cosmos Hub when launching consumer chains with Replicated Security. In contrast to the Cosmos Hub, the launch of a Chainlet with Saga is entirely permissionless.

Developers only need to have SAGA tokens to pay for setting up and maintaining their Chainlet. This is similar to services offered by Amazon Web Services and other SaaS platforms, except that here the subscription fee is paid in SAGA tokens to create and maintain a Chainlet.

This means that once the fee is paid, the role of Saga validators is to set up and run the infrastructure for a Chainlet, similar to how Cosmos Hub validators also operate the infrastructure of the consumer chains.

To launch a Chainlet, a developer is required to allocate funds to an escrow account using SAGA tokens. This escrow account can be pre-funded to any desired amount and works like a prepaid service to cover the costs associated with the Chainlet. If the deposited fee is depleted, the Chainlet goes offline until the developer deposits more SAGA in the account. The fee is determined per epoch, where one epoch lasts approximately one day.
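
To make the prepaid escrow model concrete, here is a minimal sketch of the lifecycle described above. The class name, deposit amount, and per-epoch fee are hypothetical illustrations, not Saga's implementation.

```
class ChainletEscrow:
    """Toy model of a Chainlet's prepaid escrow account (illustrative only)."""

    def __init__(self, deposit_saga: float):
        self.balance = deposit_saga   # prepaid SAGA covering future epochs
        self.online = True

    def deposit(self, amount_saga: float) -> None:
        self.balance += amount_saga
        self.online = True            # topping up brings the Chainlet back online

    def charge_epoch(self, fee_per_epoch_saga: float) -> None:
        """Charge the subscription fee once per epoch (one epoch is roughly one day)."""
        if self.balance >= fee_per_epoch_saga:
            self.balance -= fee_per_epoch_saga
        else:
            self.online = False       # escrow depleted: the Chainlet goes offline

escrow = ChainletEscrow(deposit_saga=300.0)
for _ in range(10):
    escrow.charge_epoch(fee_per_epoch_saga=40.0)
print(escrow.balance, escrow.online)  # 20.0 False -> offline from the 8th epoch onward
```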

Diverse methods could be used for funding the escrow account with SAGA tokens:

  1. Directly fund the account with SAGA tokens
  2. Stake SAGA with the escrow account to cover the fee through staking rewards
  3. Allow sponsors, communities and DAOs to pay the fee
  4. Implement an IBC mechanism to seamlessly convert any crypto into SAGA and pay for the fee

This subscription fee is determined by the Saga validator set. Before the start of a new epoch, each Saga validator submits the fee they would like to receive for running a Chainlet. These bids are then locked before the start of the next epoch, and a Musical Chair Auction begins.

The Musical Chair Auction is a process that aims to establish a universal price for running a Chainlet. In this context, each validator presents their bid, and only the w validators with the lowest prices are included in the 'Winning Set'. The remaining validators with higher bids constitute the 'Losing Set'.

The final cost of running a Chainlet is determined by the highest bid within the Winning Set. This implies that the validator with the highest bid in the Winning Set gets its desired price, while other validators within the Winning Set not only secure their desired price but also receive an additional margin on their bid.

The price that developers will have to pay for Saga validators to run a Chainlet is:

Price (run Chainlet) = max(Bid in Winning Set) × Number of Saga validators

To prevent collusion or Sybil attacks on the Winning and Losing Sets, the number of validators in the Winning Set must be large enough to make controlling it challenging. According to the Saga team, this number should range between 75% and 85% of the participants in the Musical Chair Auction.
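
As a rough illustration of the selection and pricing rule above, the sketch below sorts hypothetical bids, takes the lowest 80% as the Winning Set, and charges the developer the highest winning bid multiplied by the number of Saga validators. All names and numbers are made up for the example.

```
def musical_chair_auction(bids, winning_ratio=0.8):
    """Toy version of the Musical Chair Auction pricing rule (illustrative only)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1])   # lowest bids first
    w = max(1, int(len(ranked) * winning_ratio))          # size of the Winning Set
    winning_set, losing_set = ranked[:w], ranked[w:]
    price_per_validator = winning_set[-1][1]              # highest bid in the Winning Set
    developer_cost = price_per_validator * len(bids)      # per the formula above
    return winning_set, losing_set, price_per_validator, developer_cost

bids = {"val-a": 1.0, "val-b": 1.2, "val-c": 0.9, "val-d": 2.5, "val-e": 1.1}
winners, losers, price, cost = musical_chair_auction(bids)
print(price, cost)  # 1.2 6.0 -> every winner is paid 1.2 SAGA; val-d lands in the Losing Set
```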

However, the Musical Chair Auction is not riskless for a validator. In fact, the mechanism is designed to incentivize validators to submit bids as low as possible, rewarding validators within the Winning Set, while penalizing those in the Losing Set.

A possible way for the team to handle punishment is to treat it like validator downtime: validators who are down for a certain period get a minor slash and are jailed (removed from the active set). Validators who lose the auction too often in a given period could also be minorly slashed and jailed.

Hence, the SAGA token has multiple use cases: it is used as a subscription fee to keep the Chainlet alive and to reward the validators for running the infrastructure. In this case, there is a 1:1 relationship between costs and revenues with the auction system. We can also think about having pools of validators that share the cost, with validators only running some Chainlets and not others, to improve scalability.

Saga and its Chainlets introduce an interesting token structure, as gas fees are not explicitly collected from end users. Within a Chainlet, gas fees can be paid using Saga, the developer’s own Chainlet token, no tokens at all (gasless transactions), or even other tokens such as ETH or USDC.

It's worth noting that gas fees generated within a specific Chainlet are directed to a wallet managed by the developer. This confers a high degree of flexibility to the Chainlet and its team in determining their preferred monetization approach.

Consequently, with Chainlets, developers benefit from predictable and low costs, an easy process for deploying their blockchains, and the capacity to horizontally scale applications. While Chainlets inherit security from Saga, there exists a method for a Chainlet to also leverage and inherit Ethereum's security using the Saga stack. Let’s delve into this aspect in the following section.

A closer look at a specific type of Chainlet: Ethlets

Saga Ethlet is a new Ethereum scaling solution that combines the best attributes from appchains, rollups, and validiums into a single product. Launching an Ethlet will be as easy as launching a Chainlet: with one click, an Ethlet can be created and inherit Ethereum's security.

How does this mechanism work? Ethlets work with three essential components: Data Availability, State Hash Commitment, and Fraud Proof.

At the end of each epoch (~ 1 day), blocks produced during that time frame are batched, forming the 'batched epoch'. A new epoch, referred to as the 'challenge period', then begins. During this challenge period, Saga’s validators can use a fraud-proof mechanism (optimistic ZK or interactive) that enables the identification of any fraudulent transactions or state transitions within the blocks from the batched epoch. If, by the end of the challenge period, no fraud proof has been presented, the state hash of the previous batched epoch is committed to Ethereum, and that committed state therefore inherits the security of Ethereum.

This implies that there is a one-epoch delay for a state hash to be committed to Ethereum and inherit its security. However, it's important to note that blocks inherit Saga’s security even before being committed to Ethereum.
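
Conceptually, the commitment pipeline can be sketched as a small state machine: each batch sits in a one-epoch challenge period and is only committed to Ethereum if no fraud proof arrives. The sketch below is illustrative only, not Saga's actual implementation.

```
from dataclasses import dataclass

@dataclass
class BatchedEpoch:
    number: int
    state_hash: str
    fraud_proven: bool = False

class EthletCommitter:
    """Toy model of the batch -> challenge period -> Ethereum commitment flow."""

    def __init__(self):
        self.pending = None                 # batch currently in its challenge period
        self.committed_to_ethereum = []     # state hashes that inherited Ethereum security

    def end_of_epoch(self, new_batch: BatchedEpoch) -> None:
        """Called once per epoch (~1 day)."""
        if self.pending is not None and not self.pending.fraud_proven:
            # The challenge period elapsed with no fraud proof:
            # the previous batch's state hash is committed to Ethereum.
            self.committed_to_ethereum.append(self.pending.state_hash)
        # The new batch now enters its one-epoch challenge period.
        self.pending = new_batch
```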

Finally, Saga will be used as a Data Availability layer, similar to a validium, to avoid the high Data Availability costs of Ethereum. An Ethlet thus achieves fast finality through Tendermint, facilitates rapid bridging, and leverages the advantages of IBC. This approach ensures cost-effectiveness while also inheriting Ethereum's security.

Conclusion

Saga offers any developer the ability to easily launch their application as a Chainlet and inherit Saga’s mainnet level of security and decentralization from the start. By choosing this option, the application will benefit from its own dedicated blockspace, and the team will gain more control over the blockchain and application layers compared to launching as a smart contract. If the developer chooses, they can upgrade a Chainlet into an Ethlet and gain the benefits of Ethereum's security.

Saga is initially focused on gaming and entertainment chains, as we can notice from their partnerships. Gaming applications are one of the fastest-growing sectors in web3, and a gaming project, such as a video game, needs its own dedicated scalable blockchain capable of supporting high transaction volumes – exactly what Saga is offering and what Chainlets based on the Cosmos SDK can provide. As web3 gaming and entertainment continue to grow and the demand for scalable architecture for users increases, Saga presents itself as the solution to provide the necessary architecture and is confident in onboarding the next 1000 chains in the Multiverse.

About Chorus One

Chorus One is one of the biggest institutional staking providers globally operating infrastructure for 40+ Proof-of-Stake networks including Ethereum, Cosmos, Solana, Avalanche, and Near amongst others. Since 2018, we have been at the forefront of the PoS industry and now offer easy enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge protocols through Chorus Ventures.

Opinion
Core Research
Staking is the least-risky source of yield in crypto.
We comment on the reasons why we believe so and compare it to other forms of yield.
March 19, 2023
5 min read

This document is a summary of a longer article — “The financialized staking economy” — published in Chorus One’s ‘Annual Staking Review’ for 2022. Click here to read the entire report.

Cryptocurrencies can be used in three kinds of yield-bearing activity. These have cumulative trust assumptions:

  • Base layer: Staking income is generated by the chain itself to incentivize its liveness & security.
  • Smart contract layer: Protocols run on the chain and may pay incentives for capital. At a minimum, these carry risks associated with protocol security (e.g. hacks), and protocol design (e.g. collateral management).
  • Off-chain: Centralized parties may offer interest on cryptocurrency assets. Complex trust assumptions are involved here including counterparty prudence & sophistication, technical security, and legislative risk.

We believe staking yield is the most attractive risk-adjusted source of yield in crypto for two reasons:

  1. Firstly, yield enabled by the base layer, i.e. staking yield, carries by far the least risk. Specifically, it does not carry significant idiosyncratic risk beyond the priced-in chain risk, as a failure for staking yield to materialize, or a reduction of the notional for an appropriately operated node would be equivalent to chain failure. There is some tail risk associated with improper operation of validator nodes (e.g. double signing, downtime), but this can be minimized by choosing a professional validator like Chorus One.
  2. Secondly, it delivers competitive returns, even compared to riskier sources of yield. For example, using Uniswap (the largest DeFi app) as a proxy, liquidity provisioning on Uniswap is a losing proposition for as much as 50% of users due to impermanent loss. A second example is Binance Earn as a stand-in for off-chain yield generation — it currently pays 4.3% on Ethereum, vs. a 7.5% staking yield! Especially in an environment with limited organic on-chain activity, staking is a very competitive source of return. If on-chain activity increases, staking yield adjusts to this via increased transaction fees and MEV rewards. It’s a call option on on-chain activity.
Staking is the most attractive yield source in crypto

Why staking is an attractive source of yield beyond crypto

Proof-of-stake ecosystems do not have an anchor in the real world. This means that the staking yield rate denoted in native terms is completely decoupled from any kind of factor in the wider economy. For staking, endogenous capital (e.g. ETH) is the only factor of production.

This is a difference to proof-of-work (PoW) systems, where electricity and hardware costs serve as an unbridgeable anchor to the real economy, directly affecting a miner’s yield rate. It is also different from most CeFi and DeFi yield sources, which depend more heavily on user activity.

The above implies that staking can be an uncorrelated yield source for two kinds of investors — those that are bullish long-term and denominate their holdings in native units, and those that are hedged against the price risk of the staked asset.

Hedging the staking yield

The token price risk may be hedged out through on- or off-chain solutions. The former case has the advantage of transparency, reflected in an improved counterparty risk assessment and iron-clad terms. With some of the largest lending desks in the space embroiled in a liquidity crisis, this is a significant factor. Validators are ideally positioned to execute on-chain hedging, as they directly interface with the staking yield source and thus no custody transfer, i.e. additional risk, is required to interface with a hedging solution.

One increasingly popular on-chain hedging solution is a “staking yield interest rate swap”. This allows validators to swap token-denominated staking yield for a stablecoin, typically USDC, locking in a stable and predictable income for a staking client. The associated risk is very minor as neither the validator nor the swap counterparty takes custody of the principal — the worst case, a counterparty default, would reduce to the price risk on the yield earned on the staked notional. Chorus One can leverage Alkimiya, the leading protocol for on-chain capital markets, to execute this type of hedge.
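
To make the mechanics concrete, here is a back-of-the-envelope payoff sketch for such a swap; all figures (notional, fixed leg, realized yield, ETH price) are hypothetical and do not reflect actual Alkimiya terms.

```
notional_eth = 1_000              # staked principal; never leaves the staker's custody
fixed_usdc_per_period = 95_000    # fixed leg agreed upfront, paid to the staker in USDC

realized_yield_eth = 62.0         # variable staking rewards actually earned over the period
eth_price_at_settlement = 1_800   # USD per ETH when the period settles

# The staker pays away the token-denominated yield and receives the fixed USDC leg,
# so their income for the period is known in advance regardless of yield or price swings.
floating_leg_usdc = realized_yield_eth * eth_price_at_settlement   # 111,600 USDC
net_vs_unhedged = fixed_usdc_per_period - floating_leg_usdc        # -16,600 USDC in this scenario
print(floating_leg_usdc, net_vs_unhedged)
```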

A second way to hedge is by using the staking yield to finance classic options-based strategies. For example, a zero-cost collar options package may incorporate the staking yield in a way that enables an asymmetric pay-off.

Chorus One is invested in & advises a range of solutions optimizing staking yield for return (i.e. MEV) and risk (i.e. hedging). Reach out to us at [email protected] to learn more about how these can be tailored to fit your use case.

About Chorus One

Chorus One is one of the biggest institutional staking providers globally operating infrastructure for 35+ Proof-of-Stake networks including Ethereum, Cosmos, Solana, Avalanche, and Near amongst others. Since 2018, we have been at the forefront of the PoS industry and now offer easy enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge protocols through Chorus Ventures. We are a team of over 50 passionate individuals spread throughout the globe who believe in the transformative power of blockchain technology.

For more information, please visit chorus.one

Networks
Core Research
Cosmos ticks all the boxes in building the ultimate modular blockchain
We evaluate why Cosmos is the best solution for building a modular blockchain.
March 19, 2023
5 min read

Introduction

Cosmos is steadily becoming the place to create the ultimate modular blockchain. Cosmos SDK allows developers to effortlessly roll out tailored blockchains, resulting in a flood of new projects that provide specialized settings for novel products. The goal of modular blockchains is to divide Execution, Settlement, Consensus, and Data Availability. Refer to page 19 of this report to learn more about modular vs. monolithic blockchain designs (Ethereum). As a result, we see various teams tackling the issues of each layer and creating optimal solutions and developer environments. Ultimately, developers could use these optimizations to create an application that is highly performant using such an ultimate modular blockchain. Not to mention the greater decentralization that comes with spreading your product across numerous ecosystems.

Let’s go over the problems that current ecosystems face in each layer of the modular stack, and how various quality teams are solving these issues. Please bear in mind that there are other teams solving these issues too; we are just exploring some.

Issues with Data Availability

It is important to explain that when a block is appended to the blockchain, each block contains a header and all the transaction data. Full nodes download and verify both, whilst light clients only download the header to optimize for speed and scalability.

Full nodes (validators) cannot be deceived because they download and validate the header as well as all transaction data, whereas light clients only download the block header and assume the transactions are valid (optimistic). If a block includes malicious transactions, light clients depend on full nodes to give them a fraud proof. This is because light nodes verify blocks against consensus rules, but not against transaction validity proofs, which means a 51% attack that alters consensus can easily trick light nodes. As networks scale, secure ways to operate light clients become preferable because of their lower operational costs. If nodes are cheaper to run, decentralization also becomes easier to achieve.

The DA problem refers to how nodes can be certain that when a new block is generated, all of the data in that block has truly been published to the network. If a block producer does not disclose all of the data in a block, no one can determine whether a malicious transaction is concealed within it. What is needed is a reliable source of truth as a data layer, one that orders transactions as they come and checks their history. This is what Celestia does, solely optimizing the Consensus and DA layers: Celestia is only responsible for ordering transactions and guaranteeing their data availability, which is similar to reducing consensus to atomic broadcast. This is why Celestia was originally called ‘Lazy Ledger’; however, efficiently performing this role for a future with thousands of applications is no easy task. See the different types of nodes in Celestia here.

Two key features of Celestia’s DA layer are data availability sampling (DAS) and Namespaced Merkle trees (NMTs). Both are innovative blockchain scalability solutions: DAS allows light nodes to verify data availability without downloading a complete block, while NMTs allow Celestia’s execution and settlement layers to download only the transactions that are relevant to them. In a nutshell, Celestia allows each light node to verify just a small sample of data which, combined with the work of other light nodes, provides a high-security guarantee that the block’s data is available. Hence, Celestia assumes that there is a minimum number of light nodes sampling the data availability layer.

“This assumption is necessary so that a full node can reconstruct an entire block from the portions of data light nodes sampled and stored.”
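
As a rough, back-of-the-envelope illustration of why sampling works: with the 2D Reed-Solomon extension Celestia uses, a block is generally unrecoverable only if on the order of a quarter of the extended shares are withheld, so each uniformly random sample hits a missing share with probability of roughly 0.25 or more. The figures below are approximate and only meant to show the exponential detection guarantee.

```
def prob_withholding_undetected(samples: int, withheld_fraction: float = 0.25) -> float:
    """Chance that one light node misses data withholding after `samples` random samples."""
    return (1.0 - withheld_fraction) ** samples

for s in (10, 20, 30):
    print(s, round(prob_withholding_undetected(s), 5))
# 10 -> 0.05631, 20 -> 0.00317, 30 -> 0.00018
```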

It is worth noting for later that these layers (DA & Consensus) are naturally decentralized and easier to have fully on-chain, as most of the work is taken on by the validators. Scaling here will ultimately depend on the consensus algorithm. ‘Rollapp’ developers will not need to assemble a validator set for their applications either.

Issues with Execution & Settlement layers

  • Execution refers to the computation needed for executing transactions that change the state machine accurately.
  • Settlement involves creating an environment in which execution levels can check evidence, settle fraud claims, and communicate with other execution layers.

The present web3 environment suffers from centralization in the execution and settlement layers. This is due to the fact that the on-chain tech stack severely limits an application’s functional capability. As a result, developers are forced to perform heavy computation off-chain, in a centralized manner. On-chain apps are not inherently interoperable with external systems, and they are also constrained by a particular blockchain’s storage and processing capability.

More than just a distributed blockchain database is required to create the ultimate decentralized apps. High-performance processing, data IO from/to IPFS, links to various blockchains, managed databases, and interaction with various Web2 and Web3 services are all common requirements for your application. Additionally, different types of applications require different types of execution environments that can optimize for their needs.

Blockless — Facilitating custom execution

Blockless can take advantage of Celestia’s data availability layer and focus on improving application development around the execution layer. Blockless provides a p2p execution framework for creating decentralized serverless apps. By offloading operations from the L1 to the performant, configurable execution layer offered by Blockless, dApps are no longer limited by on-chain capacity and throughput. With Blockless you can transfer intensive processing from a centralized cloud service platform or a blockchain to the Blockless decentralized node network using built-in functions. With the Blockless SDK, you can access Web2 and Web3 applications; it currently supports IPFS, AWS S3, Ethereum, BNB Chain, and Cosmos.

Developers using Blockless will only need to provide the serverless functions they want to implement (in any language!), as well as a manifest file that specifies the minimal number of nodes required, hardware specifications, geolocation, and node layout. In no time, their services will be operating with ultra-high uptime and hands-free horizontal scaling. To learn more about the architecture of the Blockless network go here, but yet again, its orchestration chain is a Cosmos-based blockchain responsible for function/app registration. The cherry on the cake is that you can use and incorporate or sell community functions and extensions into your own application design in a plug-and-play manner using Blockless Marketplace. In Cosmos, you can already do this through projects like Archway or Abstract.

SAGA — Rollups as a service and Settlement optimization

Popular L2s and Rollups today like Arbitrum, Optimism, and StarkNet use Ethereum for data availability and rely on single sequencers to execute their transactions. Such single sequencers are able to perform fast when submitting to Ethereum but evidently stand as a centralized point of failure. Saga has partnered with Celestia to provide roll-ups as a service to enable a decentralized sequencer set.

“Saga’s original design is meant to provide critical infrastructure to the appchain vision, where the Saga protocol abstracts away the creation of a blockchain by leveraging IBC.”

Saga provides easy-to-deploy “chainlets” for any developer to roll out an application without having to care about L1 developments. Although their main focus is to support full appchain formation on top of the Saga Mainnet, the technology can also support the modular thesis. This means that rollup developers can use Saga’s validators to act as sequencers and decentralize their set. In other words, Saga validators can also work in shifts submitting new blocks for Celestia rollups.

https://sagaxyz.cdn.prismic.io/sagaxyz/08e727f2-88a2-4c95-ad17-b0b9579d2b69_saga-litepaper-march-2022.pdf

Saga offers a service that organizes validators into sequencers and punishes misconduct through shared security. Saga’s technology provides functionality to detect invalid block production with fraud proofs, and censorship or inactivity can be handled through challenges that force a given set of transactions to be processed. This means that Saga can enhance the settlement layer whilst using Celestia for data to generate fraud proofs and censorship or liveness challenges. This could even be done for Ethereum, with the additional benefit of having shared security between chainlets and IBC out of the box. To further understand the difference between running a rollup or a chainlet, please refer to this fantastic article.

Conclusion

In such a modular world, developers finally have full customization power. One could choose to build sovereign rollups or settlement rollups, or even a hybrid. In our example, it would even be possible to use Saga’s consensus instead of Celestia’s. Referring to our example, we could have an application that decentralizes its execution computing through Blockless whilst programming in any language, decentralizes its sequencer set and can deploy unlimited Chainlets if more block space is required with Saga, and has a reliable and decentralized data availability layer with Celestia. Best of all, all these layers are built and optimized with Cosmos SDK chains, meaning they have out-of-the-box compatibility with IBC and the shared security of Chainlets.

MEV
News
Core Research
Solana-MEV Client: an alternative way to capture MEV on Solana
We believe this approach to capture MEV prevents centralization and spam attacks.
February 7, 2023
5 min read

The MEV supply chain is critical to the future performance and business models of the Solana network. Solana is in a phase of actively searching for, and ultimately choosing its MEV supply chain. One approach is to replicate the model established on Ethereum, building a searching and block-building marketplace. This path has multiple downsides, such as artificially introducing a global mempool that would increase Solana’s latency, and may also increase the risk of centralization and censorship.

We’re happy to announce that Chorus One has released a whitepaper today in which we contrast the most relevant characteristics of Ethereum and Solana, review some of the features of the block-building marketplace model (i.e., the “Flashbots-like” model), and discuss what retrofitting it onto Solana would entail.

Given the particularities of Solana, we also propose an alternative to the block-building marketplace: the solana-mev client. This model allows for decentralized extraction by validators, through a modified Solana validator client, capable of handling MEV opportunities directly in the banking stage of the validator. Along with the whitepaper, Chorus One is also releasing an open-source prototype implementation of the approach detailed inside the whitepaper itself.

Fig 1: How the solana-mev client works. Green blocks represent the modification in the client.

The client can be run by any validator. Even small validators or those with no specific expertise can benefit from MEV rewards by choosing to run the solana-mev client. This means validators will be able to execute MEV strategies as they appear in their slots, in contrast with the current competitive nature of searching, which results in a few winning bots extracting the value.

The model shrinks the incentive for independent bots to spam the network which ultimately contributes to episodes of intense traffic, as most of them send transactions targeting the same MEV opportunities.

Given that not all MEV strategies can be implemented inside the validator, independent searchers will continue to play a relevant role in the MEV space on Solana. That is guaranteed by their advantage of quickly building and updating sophisticated strategies, as well as expanding their focus to newly deployed programs and pools. This includes long-tail MEV.

In summary, the MEV client enables permissionless and decentralized extraction that benefits the ecosystem through transparent and ethical strategies, as well as increased financial returns for network participants.

For a comprehensive overview of the motivations and the model, please refer to the whitepaper here.

Core Research
A deep-dive into Eth-staking-smith
Performant Ethereum validator key management.
January 13, 2023
5 min read

Authors: Jennifer Parak, Maksym Kulish

One of the most important events of 2022 in the crypto community was The Merge upgrade of the Ethereum protocol, which switched Ethereum from a Proof-of-Work legacy chain implementation to a Proof-of-Stake Beacon chain. It proved that fundamental innovation is possible even for the oldest and largest decentralized systems, without any disruption to protocol users.

At Chorus One, we have worked on securing next-generation Ethereum since the Beacon Chain took off in 2020, and we operate multiple thousands of validators on the mainnet today. Our new product OPUS — an Ethereum Validation-as-a-Service API — is designed to enable any organization or individual to run staking validator clients on the Ethereum Beacon Chain, with a non-custodial, permissionless approach where we require customers to specify their own withdrawal and fee recipient addresses, so they remain in possession of both their staked funds and rewards. This post focuses on the technical implementation of the validator key provision and storage approach within our Validation-as-a-Service API product and shows off some challenges we faced and solutions we created in the process.

Background

Ethereum Staking Keys

The Merge has introduced two new types of keys involved in securing the Ethereum chain, in addition to the legacy chain wallet keys, which remain unchanged within the Beacon Chain [1]. These are the Signing (Validator) key pair and the Withdrawal key pair. In addition to new key functionality, the Signing key also uses a new cryptographic signature scheme called BLS, which stands for Boneh–Lynn–Shacham. This means older key generation tools will not work for creating Signing keys. BLS signatures, specifically those over the BLS12-381 curve, are used in Beacon chain block signatures and attestations. This makes it possible to aggregate multiple signatures and verify them in a single operation, which is an outstanding improvement in scalability [2].
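
As a minimal sketch of that aggregation property, the snippet below uses the py_ecc library (the BLS12-381 reference implementation referenced by the Ethereum consensus specs); the keys and message are throwaway examples, not real validator material.

```
from py_ecc.bls import G2ProofOfPossession as bls

# Three throwaway signing keys (the input keying material must be at least 32 bytes).
secret_keys = [bls.KeyGen(bytes([i]) * 32) for i in (1, 2, 3)]
public_keys = [bls.SkToPk(sk) for sk in secret_keys]

message = b"example beacon block root"
signatures = [bls.Sign(sk, message) for sk in secret_keys]

# Many signatures over the same message collapse into a single aggregate signature...
aggregate = bls.Aggregate(signatures)

# ...which is then verified against all public keys in one operation.
assert bls.FastAggregateVerify(public_keys, message, aggregate)
```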

Like most other Proof-Of-Stake blockchains, next-generation Ethereum depends on the functioning of validators for securing the transaction flow. Validators are members of the network who lock a portion of their Ethereum coins (with a minimum amount of 32 ETH) to become responsible for proposing new signed blocks of transactions, and verifying such signatures of other validators, which is called attesting. Normally, every Ethereum validator should attest signatures for a slot once per Ethereum epoch (around 6.4 minutes); and for every slot, in every epoch, one validator is pseudo-randomly chosen to produce a block of transactions to be attested by others. Validators are being rewarded for both block proposals and block attestations. The mechanism of signing the blocks and verifying the signatures of others relies on the Signing key pair. The verification mechanism works because every public part of a Signing key (Public Signing Key) is published on-chain, so every signature done with the private part of the Signing key (Private Signing Key) can be verified by every other validator. Despite having the power for creating blockchain content, the Signing keys can not be used to move any funds including staking funds, and they only listen for and sign the transaction content provided by the peering network of Ethereum nodes.

The Withdrawal key pair is neither used for blocks nor for attestations, but it has control over staked funds. After the Shanghai fork, withdrawals will be activated, which will enable the funds to be moved to an owner-controlled withdrawal address specified in the deposit data. With EIP-4895, withdrawals will be enabled in a push-based fashion [3], such that funds that were previously locked on the consensus layer are automatically pushed to the execution layer as a system-level operation. This means users won’t have to pay any gas for a withdrawal transaction. Users who have specified a BLS withdrawal address in their deposit data need to broadcast a BLS_TO_EXECUTION_CHANGE message to the beacon chain to update their withdrawal address to an execution address.

Finally, when the validator successfully proposes a block, a special Fee Recipient address receives the accumulated gas fees from the block. Since the Fee Recipient is not directly involved in staking, we will largely omit it in this post.

More information about different types of keys involved in Ethereum staking can be found in the following resources: [4], [5]

Managing Validator keys for OPUS Validation-as-a-Service API

As part of the OPUS Validation-as-a-Service API, we require customers to retain ownership of Withdrawal keys, so that staked funds can never be controlled or accessed by Chorus One. A Signing key, however, is different: since Chorus One is a responsible party for hosting and maintaining the Ethereum validator, the inner workings of the Validation-as-a-Service API require us to generate, load, and store Signing keys. Thus, a robust solution for key management is an essential part of our Validation-as-a-Service API.

Early into the project lifecycle, we used the staking deposit command line interface (CLI) provided by the Ethereum Foundation (https://github.com/ethereum/staking-deposit-cli). While the staking CLI is a great tool for solo/home stakers, we realized that it was not designed for our use case. First of all, staking-deposit-cli by default stores the newly generated keystore into a filesystem, posing a potential security threat from leaking key material. While it is possible to use infrastructure-specific workarounds like ramdisks to mitigate the threat, such workarounds would add complexity and failure points to the platform. The open-source nature of staking-deposit-cli allowed us to fork the source code and modify it to cater to our needs, but the lack of thoroughly automated test suites meant we had a hard time syncing our changes with upstream updates. Finally, all of our codebase is Rust, and having to support Python CLI within the infrastructure, including keeping a good security track record by timely patching all the Python dependencies, puts an additional burden on the development team. In the end, we decided to pursue an alternative approach to generating keys, which we describe in the next paragraph.

Eth-staking-smith

Having endured even more difficulties with staking-cli when generating Ethereum keys on a large scale, our Ethereum Team decided to tackle the problem during our company-wide engineering hackathon where we built an MVP for an Ethereum key generation tool written in Rust. This was the birth of the Eth-staking-smith project.

Component diagram of Eth-staking-smith

Eth-staking-smith can be used as a CLI tool or as a Rust library to generate Signing keys and deposit data derived from a new mnemonic, or to regenerate deposit data from an existing mnemonic. These use cases were implemented in order to provide the same functionality as the staking-deposit-cli whilst avoiding all the problems mentioned above.

Example command to generate keys from a newly generated mnemonic:

eth-staking-smith new-mnemonic --chain mainnet --keystore_password testtest --num_validators 1

Example command to generate keys from an existing mnemonic:

eth-staking-smith existing-mnemonic --chain mainnet --keystore_password testtest --mnemonic "entire habit bottom mention spoil clown finger wheat motion fox axis mechanic country make garment bar blind stadium sugar water scissors canyon often ketchup" --num_validators 1 --withdrawal_credentials "0x0100000000000000000000000000000000000000000000000000000000000001"

For both use cases, Eth-staking-smith will generate the following key material:
  • Private Signing keys
  • Keystores
  • Mnemonic (existing or newly generated)
  • Deposit data smart contract properties

Let’s zoom into the generated key material

Private Signing keys

As mentioned above, the Private Signing key is the key used to sign any action taken by the validator. Eth-staking-smith outputs the Signing key without encryption because, in our use case, we use a remote API to store the key material immediately upon generation. The remote API implements encryption for both data in transit and data at rest. We decided to make keystore use optional, but Eth-staking-smith can still generate encrypted keystores for users who need them.

Example:

{
  "private_keys": [
    "6d446ca271eb229044b9039354ecdfa6244d1a11615ec1a46fc82a800367de5d"
  ]
}

Keystores

The keystore is an encrypted version of the private Signing key in the specified format [6]. When generating keys with eth-staking-smith, a keystore password can be specified, in which case keystore data will be output. Using a key derivation function (e.g. `scrypt` or `pbkdf2`), a decryption key is derived from the given passphrase and a set of strong built-in derivation arguments. The example below highlights the `function` field, which shows the key derivation function used. The keystore is a useful alternative that is less vulnerable to an attacker than storing the private Signing keys in a plaintext file, since an attacker would need both the keystore file and the passphrase to decrypt it.

Example:

{
  "keystores": [
    {
      "crypto": {
        "checksum": {
          "function": "sha256",
          "message": "af14321c3083de535a0dd895b4e2fb156e6b0eda346120c8d7afb5277d3a489f",
          "params": {}
        },
        "cipher": {
          "function": "aes-128-ctr",
          "message": "8032685ad92a579e66328bbd6c747e41497dc6897c17cebbd83958394943924b",
          "params": {
            "iv": "da5699bb18ee7fea6095634a2fa05d18"
          }
        },
        "kdf": {
          "function": "pbkdf2",
          "message": "",
          "params": {
            "c": 262144,
            "dklen": 32,
            "prf": "hmac-sha256",
            "salt": "afb431f05b7fe02f253d9bc446ac686776541d38956fa6d39e14894f44e414d8"
          }
        }
      },
      "description": "",
      "name": null,
      "path": "m/12381/3600/0/0/0",
      "pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
      "uuid": "6dbae828-d0f0-42ed-9c06-d9079642ea08",
      "version": 4
    }
  ]
}
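
To illustrate how such a keystore relates back to the raw Signing key, here is a minimal decryption sketch following EIP-2335, assuming `pbkdf2` as the KDF (as in the example above) and the Python `cryptography` package for AES-128-CTR; it omits the password normalization step the spec also requires.

```
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def decrypt_keystore(password: str, salt_hex: str, c: int, dklen: int,
                     iv_hex: str, cipher_message_hex: str, checksum_hex: str) -> bytes:
    # 1. Derive the decryption key from the password using the stored KDF parameters.
    decryption_key = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), c, dklen
    )
    # 2. Verify the checksum: SHA256(decryption_key[16:32] || ciphertext) must match.
    ciphertext = bytes.fromhex(cipher_message_hex)
    check = hashlib.sha256(decryption_key[16:32] + ciphertext).hexdigest()
    assert check == checksum_hex, "wrong password or corrupted keystore"
    # 3. Decrypt with AES-128-CTR using the first 16 bytes of the derived key and the stored IV.
    cipher = Cipher(algorithms.AES(decryption_key[:16]), modes.CTR(bytes.fromhex(iv_hex)))
    return cipher.decryptor().update(ciphertext)  # the raw private Signing key
```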

Mnemonic

The mnemonic, whether passed in by the user or newly generated, is returned as part of the output so that the user can store it safely. Further information on mnemonics can be found under reference [7].

Example:

{
  "mnemonic": {
    "seed": "ski interest capable knee usual ugly duty exercise tattoo subway delay upper bid forget say"
  }
}

Deposit data

Finally, the deposit data is returned, which is used to make the deposit of 32 ETH via the Ethereum deposit contract to activate the validator. One of the most important fields in the deposit data is the withdrawal credentials.

By default, withdrawal credentials are BLS addresses derived from the mnemonic; however, there are use cases where a user might want to overwrite the derived withdrawal credentials with already existing ones.

The BLS address format is called `0x00` credentials and is set to be deprecated sometime after withdrawals are enabled. An alternative way to provide withdrawal credentials is to use a legacy Ethereum wallet address, prefixed by `0x01`. Ethereum will be pivoting from `0x00` (formerly eth2) credentials to `0x01` execution (formerly eth1) addresses. To learn more about this, we recommend watching the panel from Devcon 2022 [8] and looking into the Ethereum specification [4]. Eth-staking-smith therefore allows the user to pass in `0x00` or `0x01` withdrawal credentials, as well as an execution address, to overwrite the derived withdrawal credentials.
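
For reference, both credential formats follow a simple layout defined in the consensus specs; the sketch below reconstructs them, using the execution address from the examples in this post purely as an illustration.

```
import hashlib

def bls_withdrawal_credentials(withdrawal_pubkey: bytes) -> bytes:
    """0x00 prefix followed by the last 31 bytes of SHA256(BLS withdrawal pubkey)."""
    return b"\x00" + hashlib.sha256(withdrawal_pubkey).digest()[1:]

def eth1_withdrawal_credentials(execution_address: bytes) -> bytes:
    """0x01 prefix, 11 zero bytes of padding, then the 20-byte execution address."""
    assert len(execution_address) == 20
    return b"\x01" + b"\x00" * 11 + execution_address

address = bytes.fromhex("71C7656EC7ab88b098defB751B7401B5f6d8976F")
print(eth1_withdrawal_credentials(address).hex())
# 01000000000000000000000071c7656ec7ab88b098defb751b7401b5f6d8976f
```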

Example deposit data with BLS credentials (`--withdrawal_credentials 0x0045b91b2f60b88e7392d49ae1364b55e713d06f30e563f9f99e10994b26221d`)

{
  "deposit_data": [
    {
      "amount": 32000000000,
      "deposit_cli_version": "2.3.0",
      "deposit_data_root": "95ac4064aabfdece592ddeaba83dc77cf095f2644c09e3453f83253a8b7e0ae1",
      "deposit_message_root": "6a0c14a9acd99ab4b9757f2ff2f41e04b44c0c53448fdf978c118841cd337582",
      "fork_version": "00001020",
      "network_name": "goerli",
      "pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
      "signature": "82effe6d57877b7d642775ae3d56f9411d41a85218b552c6318925c7ba23f7470ebe3a35045e2fc36b0e848e6f4ec1d503f2014dc5a7ad94a267f5b237f2475b5da9ff358fbd5a8e9f497f1db0cfb15624e686991d002077a6cd4efda8bdc67e",
      "withdrawal_credentials": "01000000000000000000000071c7656ec7ab88b098defb751b7401b5f6d8976f"
    }
  ]
}

Below we present a full-fledged example output of the key material generated by Eth-staking-smith:

Command:

eth-staking-smith existing-mnemonic --chain goerli --keystore_password testtest --mnemonic "ski interest capable knee usual ugly duty exercise tattoo subway delay upper bid forget say" --num_validators 1

Output:

{
"deposit_data": [
{
"amount": 32000000000,
"deposit_cli_version": "2.3.0",
"deposit_data_root": "2abc7681f73a01acbc1974ab47119766bf57d94f86a72828f8875295f5bd92de",
"deposit_message_root": "bfd9d2c616eb570ad3fd4d4caf169b88f80490d8923537474bf1f6c5cec5e56d",
"fork_version": "00001020",
"network_name": "goerli",
"pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
"signature": "97c0ad0d4f721dc53f33a399dbf0ff2cab6f679f4efdcdaa9f8bdd22cd11b5e37c12fdd2cd29369b1b907a51573a9ef60f93d768fd2d47a99b5d55fe6516a87b9090e16c42f5a8fcbf91d24883359bffb074a02d6d4d7f6c3cd04c8e09f8dc02",
"withdrawal_credentials": "0045b91b2f60b88e7392d49ae1364b55e713d06f30e563f9f99e10994b26221d"
}
],
"keystores": [
{
"crypto": {
"checksum": {
"function": "sha256",
"message": "af14321c3083de535a0dd895b4e2fb156e6b0eda346120c8d7afb5277d3a489f",
"params": {}
},
"cipher": {
"function": "aes-128-ctr",
"message": "8032685ad92a579e66328bbd6c747e41497dc6897c17cebbd83958394943924b",
"params": {
"iv": "da5699bb18ee7fea6095634a2fa05d18"
}
},
"kdf": {
"function": "pbkdf2",
"message": "",
"params": {
"c": 262144,
"dklen": 32,
"prf": "hmac-sha256",
"salt": "afb431f05b7fe02f253d9bc446ac686776541d38956fa6d39e14894f44e414d8"
}
}
},
"description": "",
"name": null,
"path": "m/12381/3600/0/0/0",
"pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
"uuid": "6dbae828-d0f0–42ed-9c06-d9079642ea08",
"version": 4
}
],
"mnemonic": {
"seed": "ski interest capable knee usual ugly duty exercise tattoo subway delay upper bid forget say"
},
"private_keys": [
"6d446ca271eb229044b9039354ecdfa6244d1a11615ec1a46fc82a800367de5d"
]
}

Security Improvements

Since writing key material on disk was a major security vulnerability for us, Eth-staking-smith removes this issue entirely by not writing any files on disk.

To avoid heavy-lifting and re-creating crypto primitives from scratch, we’re re-using functionalities from the lighthouse client implementation [9] for key generation, which builds on top of blst — a BLS12-381 signature library [10], which is currently undergoing formal verification.

For entropy collection, one customization was made: Eth-staking-smith defers entropy collection to the operating system by using `getrandom()` on Linux and thereby making use of Linux’s state-of-the-art randomness approach.

Tweaking Security <> Performance parameters

Finally, since key generation at scale was notoriously slow for us with staking-deposit-cli, we took initiative to add additional arguments for our users to tweak performance <> security parameters depending on their specific use case.

As per our use case, our API does not require the keystore file, but only the private key in raw format. We, therefore, enable the user to opt out of keystore generation in order to improve performance. This can be done by omitting the `--keystore_password` argument as follows:

eth-staking-smith new-mnemonic --chain goerli --num_validators 1

We measured that omitting the keystore speeds up the key generation process by 99%. The key generation performance we experience with Eth-staking-smith is consistently sub-second, with slight variability depending on hardware and platform.

In case the user requires the keystore file for redundancy, there’s another option to speed up the keystore generation process: choosing a different key derivation function. By default, Eth-staking-smith uses `pbkdf2` to derive the decryption key, to achieve better performance. There’s also the option to use `scrypt`, which offers better security but, consequently, worse performance. This can be done by selecting the key derivation function with the `--kdf` argument as follows:

eth-staking-smith new-mnemonic --chain goerli --keystore_password testtest --num_validators 1 --kdf scrypt

Converting BLS withdrawal to execution address

As mentioned above, users who had previously specified a BLS (0x00) withdrawal address will need to make a request to the beacon chain to update their validators’ withdrawal address to point to an execution address. To perform this operation, the user will need the BLS withdrawal key mnemonic phrase. Once done, withdrawals will automatically be credited to the execution address.

Eth-staking-smith enables the user to generate a signed `BLS_TO_EXECUTION_CHANGE` message which they can send to the beacon chain to update their withdrawal address.

eth-staking-smith bls-to-execution-change --chain mainnet --mnemonic "entire habit bottom mention spoil clown finger wheat motion fox axis mechanic country make garment bar blind stadium sugar water scissors canyon often ketchup" --validator_index 0 --withdrawal_credentials "0x0045b91b2f60b88e7392d49ae1364b55e713d06f30e563f9f99e10994b26221d" --execution_address "0x71C7656EC7ab88b098defB751B7401B5f6d8976F"

Users can use the response to make the request to the beacon node as follows:

```
curl -H "Content-Type: application/json" -d '{
"message": {
"validator_index": 0,
"from_bls_pubkey": "0x0045b91b2f60b88e7392d49ae1364b55e713d06f30e563f9f99e10994b26221d",
"to_execution_address": "0x71C7656EC7ab88b098defB751B7401B5f6d8976F"
},
"signature": "0x9220e5badefdfe8abc36cae01af29b981edeb940ff88c438f72c8af876fbd6416138c85f5348c5ace92a081fa15291aa0ffb856141b871dc807f3ec2fe9c8415cac3d76579c61455ab3938bc162e139d060c8aa13fcd670febe46bf0bb579c5a"
}'
http://localhost:3500/eth/v1/beacon/pool/bls_to_execution_change
```

Conclusion

Throughout this post, we explained the basics of Ethereum Beacon Chain block validation, the key material involved in the process, and walked through an automation tool we created at Chorus One for Proof-of-Stake key management.

We hope the tool can be useful to some of our readers, especially those who use Rust for their blockchain automation work. It is also open-source, and we will welcome bug reports and pull requests on Github.

If the reader is interested in using OPUS Validation-as-a-Service API which builds upon that automation, you are welcome to join the wait-list for private beta, contact via [email protected].

References:

[1] https://kb.beaconcha.in/ethereum-2-keys

[2] https://eth2book.info/altair/part2/building_blocks/signatures#aggregation

[3] https://eips.ethereum.org/EIPS/eip-4895

[4] https://notes.ethereum.org/@GW1ZUbNKR5iRjjKYx6_dJQ/Skxf3tNcg_

[5] https://github.com/ethereum/consensus-specs/blob/dev/specs/capella/beacon-chain.md

[6] https://github.com/ethereum/EIPs/blob/master/EIPS/eip-2335.md

[7] https://github.com/bitcoin/bips/blob/master/bip-0039.mediawiki

[8] https://www.youtube.com/watch?v=zf7HJT_DMFw&feature=youtu.be

[9] https://github.com/sigp/lighthouse

[10] https://github.com/supranational/blst

Core Research
Networks
Revisiting the “Deflationary Cryptocurrency” definition as ETH post-Merge may turn into one
Will Ethereum become a deflationary cryptocurrency, now that The Merge has happened? The answer, in short, can be given in two words.
October 18, 2022
5 min read

Will Ethereum become a deflationary cryptocurrency, now that The Merge has happened? The answer to this question, in short, can be given in two words: “it depends!” Its long form, however, would offer you a better understanding of whether Ethereum will indeed remain inflationary (albeit only slightly as miners have packed their bags) or become a deflationary asset as time goes on.

If you’re confused by all the information floating around and can’t quite grasp all the terms related to it, read on as this article simplifies the topic at hand. In this piece, we’ll take you by the hand and walk you through the following:
  • The Merge — what is it?
  • Back to the drawing board: What is inflation? What is deflation?
  • Inflationary vs deflationary vs disinflationary crypto assets
  • Explaining Ethereum’s issuance mechanism and inflationary state before The Merge
  • How Ethereum could go from a nearly Net Zero inflation rate to becoming deflationary after The Merge

The Merge — what is it?

At 6:42 AM UTC (2:42 AM EDT / 8:42 AM CEST) on Thursday, September 15, 2022, Ethereum’s long-awaited transition from Proof-of-Work to Proof-of-Stake, dubbed “The Merge”, was finally completed. As Chorus One and the rest of the ecosystem could confirm, the operation — after years of blood, sweat, and delays — was successful.

Having started out as a network relying on Proof-of-Work, quickly shaping into the “hub” of miners as Bitcoin’s biggest competitor, Ethereum soon encountered scalability issues with its Execution Layer: too much energy consumed by competing miners to process transactions, and not enough security for the network, after all.

The Beacon Chain was, therefore, introduced in December 2020 as the network’s Consensus Layer. This innovation could be seen as Ethereum’s spine, master coordinator, or watchful lighthouse tower, with its key functions being to store data and manage the network’s validators. Its functionalities also include scanning the network, validating transactions, collecting votes, distributing rewards to performing validators, deducting rewards from offline validators, and slashing the ETH of malicious actors.

This Proof-of-Stake blockchain ran alongside the PoW network with the objective to — one day — merge and transform Ethereum into a Proof-of-Stake only network. A win for decentralization and the environment!

That day happened on September 15th, 2022. But before that, Ethereum was inflating at roughly 3.67% — with a ~ 4.62% issuance inflation rate. We will break down the calculations behind this inflation rate, shortly. But first, let’s go back to the drawing board to remind ourselves about the definition of inflation and deflation, in the first place.

What is inflation? What is deflation?

Inflation happens when more bills are printed (FIAT money) or more tokens are minted (cryptocurrency) for circulation in the system. The value of the currency then decreases. In FIAT, this means that more bills would be needed to afford things. In certain cryptocurrencies, this means that the price of the currency goes down.

Deflation, on the other hand, happens when tokens are removed or destroyed from the system through “burning”. By logic, the value of the currency is supposed to increase. There are, however, much more complicated dynamics to this. Those won’t be our point of focus, today.

Explaining Ethereum’s issuance mechanism and inflationary state before The Merge

Before The Merge, Ethereum rewarded the capital-intensive mining activity with up to 2.08 ETH approximately every ~ 13.3 seconds. This rounded up to roughly ~ 4,930,000 ETH/year in miners rewards. The network also had around ~ 119.3M ETH in total supply. (Source: Ethereum.org)

We can find the inflation rate by summing up the Execution Layer and Consensus Layer inflation rates.

Let’s calculate the figure for the Execution Layer by dividing the PoW-issued rewards by the total amount of ETH in circulation:
  • ~ 4.93M ETH/ ~ 119.3M ETH = ~ 0.0413 = ~ 4.13%

Then, we move to the Consensus Layer issuance, based on the amount of ETH staked. We’ll round that number to roughly 13,000,000 ETH staked at present.

If 1,600 ETH/day is issued, that’s 584K ETH/year in Consensus Layer issuance, amounting to an inflation rate of:
  • ~ 584K ETH/ ~ 119.3M ETH = ~ 0.00489 = ~ 0.49%

That’s almost Net Zero!

Summing both figures, we had an issuance inflation rate of ~ 4.62%, pre-Merge. In other words, miners made approximately ~ 89.4% of issued ETH whilst stakers got ~ 10.6% of the pie as ETH’s issuance inflated at ~ 4.62%.
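
For readers who want to re-trace the arithmetic, the short sketch below reproduces these approximate figures.

```
total_supply_eth = 119_300_000

pow_issuance_per_year = 2.08 * (365 * 24 * 3600 / 13.3)   # ~4.93M ETH/year to miners
pos_issuance_per_year = 1_600 * 365                        # ~584K ETH/year to stakers

execution_layer_inflation = pow_issuance_per_year / total_supply_eth   # ~4.13%
consensus_layer_inflation = pos_issuance_per_year / total_supply_eth   # ~0.49%

pre_merge_inflation = execution_layer_inflation + consensus_layer_inflation
miners_share = pow_issuance_per_year / (pow_issuance_per_year + pos_issuance_per_year)

print(f"{pre_merge_inflation:.2%} issuance inflation, {miners_share:.1%} of it to miners")
# -> 4.62% issuance inflation, 89.4% of it to miners
```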

Goodbye miners. Stay on, stakers!

Through The Merge, Ethereum has therefore addressed:
  1. Energy efficiency
  2. Issuance reduction (up to 88%!)
  3. PoS security, among other things
Among what it hasn’t addressed, however, are:
  1. High gas fees
  2. Slow transaction speed

We’ll get to understand how not addressing high gas fees could actually be a plus for “Deflationary Assets” or “Ultra Sound Money” advocates.

Inflationary vs disinflationary vs deflationary crypto assets

As we go back to the drawing board for the second time in our walk-through, let’s revisit the difference between inflationary, deflationary, and disinflationary crypto assets.

Inflationary

Some cryptocurrencies’ tokenomics are set up to increase token supply over time. From the start, they are “programmed” to be inflationary. Other cryptocurrency projects, which propose unlimited coin supply, are inflationary as well — as unlimited supply is bound to outweigh demand, decreasing the currency’s value over time. An example of a coin with unlimited supply is Dogecoin.

Disinflationary

With its halving mechanism running until the 21 millionth and last Bitcoin is minted, Bitcoin is a disinflationary cryptocurrency: it is set up for a scheduled, stepwise decrease in its issuance. A disinflationary cryptocurrency can, in other words, be described as “an inflationary cryptocurrency with disinflationary measures”, in the sense that demand may, over time, become greater than the diminishing issuance of new tokens.

Deflationary

A good example of a deflationary cryptocurrency is Binance Coin. BNB’s initial supply was 200,000,000 tokens in circulation. At the end of Q3, nearly 40 million BNB had been burned as part of the plan to halve the initial supply from 200 million to 100 million.

Look at tokens in circulation as a balloon and issuance as air: BNB’s mechanism is to deflate the balloon until 50% of the air remains, whilst Bitcoin’s mechanism is to keep inflating its balloon up to a set maximum, but with a little less air at every pump.
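To make the balloon analogy a little more concrete, here is a toy sketch contrasting the two supply paths. The per-quarter burn amount, block-reward figure, and era counts are simplified placeholders rather than the exact BNB or Bitcoin protocol parameters.

```python
# Toy comparison of a deflationary supply (periodic burns toward a cap)
# and a disinflationary supply (issuance that halves on a schedule).
# All parameters are simplified placeholders for illustration only.

def bnb_like_supply(initial=200_000_000, floor=100_000_000,
                    burn_per_quarter=2_000_000, quarters=50):
    """Supply shrinks every quarter until the 100M floor is reached."""
    supply = initial
    path = []
    for _ in range(quarters):
        supply = max(floor, supply - burn_per_quarter)
        path.append(supply)
    return path

def btc_like_supply(block_reward=50.0, blocks_per_era=210_000, eras=10):
    """Supply keeps growing, but each era mints half as much as the last."""
    supply = 0.0
    path = []
    for _ in range(eras):
        supply += block_reward * blocks_per_era
        path.append(supply)
        block_reward /= 2
    return path

print(f"Deflationary path ends at {bnb_like_supply()[-1]:,} tokens")       # 100,000,000
print(f"Disinflationary path ends at {btc_like_supply()[-1]:,.0f} coins")  # ~20,979,492
```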

How Ethereum could go from a nearly Net Zero inflation rate to becoming deflationary after The Merge

So what about Ethereum, now that The Merge has brought the inflation rate close to net zero? Why is it touted as a potentially deflationary coin?

Enter EIP-1559, the mechanism that burns a portion of ETH gas fees on every transaction on the network. With the issuance inflation rate already down to ~0.49% as explained above, EIP-1559 has the potential to decrease ETH supply outright, but only when gas prices are above roughly 15 Gwei.

Consequently, it is no surprise that ultra sound money advocates urge users to set their ETH transaction fees to a minimum of 15.1 Gwei.
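A back-of-the-envelope model shows why roughly 15 Gwei is the tipping point: post-Merge issuance is about 1,600 ETH per day, while the burn is roughly the base fee multiplied by the gas used. The sketch below assumes around 107 billion gas used per day (roughly 15M gas per block across ~7,100 blocks per day); that usage figure is our own assumption, so treat the output as an illustration rather than a measurement.

```python
# Back-of-the-envelope EIP-1559 burn model.
# Assumptions (illustrative, not live data):
#   - post-Merge issuance of ~1,600 ETH per day
#   - ~107 billion gas used per day (~15M gas/block, ~7,100 blocks/day)

ISSUANCE_ETH_PER_DAY = 1_600
GAS_USED_PER_DAY = 107e9   # total gas consumed network-wide per day
GWEI = 1e-9                # 1 gwei = 1e-9 ETH

def net_issuance(base_fee_gwei: float) -> float:
    """Daily ETH supply change: issuance minus the EIP-1559 base-fee burn."""
    burned = GAS_USED_PER_DAY * base_fee_gwei * GWEI
    return ISSUANCE_ETH_PER_DAY - burned

for fee in (5, 10, 15, 20, 70):
    sign = "deflationary" if net_issuance(fee) < 0 else "inflationary"
    print(f"{fee:>3} gwei -> {net_issuance(fee):+8.0f} ETH/day ({sign})")
```

Under these assumptions, the supply change flips from positive to negative right around the 15 Gwei mark.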

Ultrasound.money tracks Ethereum’s supply change in real time. A negative figure means more ETH has been burned than issued since The Merge, i.e. deflation, whilst a positive figure indicates inflation.

32 hours into The Merge era, Ethereum’s supply had grown by over 376 ETH. Inflationary.

Source: Ultrasound.Money

A month on and the figures keep rising…

Source: Ultrasound.Money

Or maybe not… a wider perspective shows us that there has actually been a decrease since October 8th, when the issuance peaked at over 13,000 ETH.

Source: Ultrasound.Money

Ethereum — Deflationary or not?

Ultrasound.Money projects that, with gas fees above 70 Gwei, the supply would decrease by around 3.40% over the next two years.

Source: Ultrasound.Money

As we’ve seen in the four weeks since The Merge, we’re bound to see periods of a deflationary ETH and periods of low but healthy inflation, both of which are vital for economic equilibrium.

News
Networks
Core Research
Axelar — Your Plug Into Any Blockchain
Axelar is the most secure, programmable, flexible and composable interoperability network in blockchain.
September 19, 2022
5 min read

Axelar is a universal interoperability network, secured by delegated Proof-of-Stake using AXL, the native token of Axelar: in short, Axelar is a blockchain that connects blockchains. With Axelar, users will be able to use any network with just one wallet (e.g., use MetaMask to make trades on Osmosis). Axelar facilitates many-to-many connectivity and programmability at the network layer for interoperability by connecting to any blockchain via a ‘Gateway’ installed on the connected chain. Users send messages to a Gateway on a source chain, and validators in Axelar’s network sign those messages on a destination chain. Axelar leverages threshold encryption in tandem with its Proof-of-Stake consensus to deliver secure cross-chain communication.

Axelar solves the single point-of-failure risks and user-experience issues that are apparent in pairwise bridges and in other interoperability networks, alike. Axelar’s interoperability network unlocks more than just cross-chain transfers; General Message Passing allows developers to perform cross-chain calls of any kind that sync state securely between dApps on various ecosystems. Essentially, the enhanced functionality of cross-chain dApps enabled by Axelar’s network results in a better user experience for all users on all chains.

Axelar is valuable for developers because of how inherently programmable, composable, and flexible the network is and for users given the new use-cases it will unlock across chains. Ultimately, Axelar provides permissionless transactions and validation, decentralised security, many-to-many connectivity, and programmability that other interoperability networks cannot duplicate.

What is Axelar?

Axelar is the first fully permissionless and decentralised interoperability network. Axelar is an interoperability Hub that facilitates many-to-many connectivity and acts as an adaptor for any dApp to leverage in order to communicate securely with any dApp on any other blockchain that has a ‘Gateway’ available for Axelar to plug into. The permissionless aspect of Axelar enables any validator to join the decentralised network; unlike other interoperability networks, it is not gated. Axelar reduces the number of connections found in existing interoperability solutions by acting as a ‘Hub’, whereby each blockchain only needs to connect to Axelar in order to communicate with any other blockchain connected to it, as opposed to opening many connections to many blockchains. The fact that Axelar is itself a blockchain enhances interoperability capabilities because programmability is possible at the network layer. To expand, actions such as address routing become much more efficient: new chains are immediately accessible to all connected chains, creating compounding network effects. User experience is also improved: Axelar is able to create one-time deposit addresses on connected blockchains, duplicating the user onramps used by centralised exchanges.

How Axelar Works

A user sends a payload to an Axelar Gateway, which is deployed by Axelar in the native language of the source blockchain (e.g. Solidity on Ethereum). The payload is recognised by a relayer in Axelar’s network, which notifies Axelar validators that a payload is ready to be collectively signed (e.g. a cross-chain transfer from a user). At this point, validators come to consensus on what should be done with the payload that has reached the Gateway on the source chain (e.g. Ethereum). Validators unwrap the payload and collectively sign off on what should be done with it and where to route it (e.g. which network to send the payload to). The Axelar network uses a weighted threshold signature scheme, whereby each validator holds a share of the overall signing power that corresponds to the amount of AXL (the native token of the Axelar network) staked with it. For example, a gateway might require a threshold percentage of signatures in order to approve a payload. If validators constituting that threshold percentage of the overall stake in Axelar’s network sign a payload, then consensus is reached and the payload is approved for execution on a destination chain. In the case of a cross-chain transfer, the payload executed on the destination chain mints tokens representing the tokens locked up on the source chain. However, Axelar’s network can facilitate interoperability interactions that are far more intricate than this. More on this later.
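To make the stake-weighted threshold concrete, here is a minimal sketch of the approval check described above. The validator names, stake amounts, and the 60% threshold are hypothetical; this is not Axelar’s actual implementation.

```python
# Minimal sketch of a stake-weighted signing threshold.
# Validator names, stakes, and the 60% threshold are illustrative only.

from typing import Dict, Iterable

def payload_approved(stakes: Dict[str, float],
                     signers: Iterable[str],
                     threshold: float = 0.60) -> bool:
    """Approve a payload only if signers hold at least `threshold` of total stake."""
    total_stake = sum(stakes.values())
    signed_stake = sum(stakes[v] for v in signers if v in stakes)
    return signed_stake / total_stake >= threshold

stakes = {"val-a": 4_000_000, "val-b": 3_000_000, "val-c": 2_000_000, "val-d": 1_000_000}

print(payload_approved(stakes, ["val-a", "val-b"]))   # 70% of stake signed -> True
print(payload_approved(stakes, ["val-c", "val-d"]))   # 30% of stake signed -> False
```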

What problem does Axelar solve?

Axelar has a simple but elegant design. The most important element in a bridge comes down to who the owners are of smart contracts that receive cross-chain intent payloads. These owners are given custodial or execution responsibility. If a bridge is centralised, a user would send a payload to a designated signer or group of signers, which would custody and approve the message on the user’s behalf. This approach is known as “proof of authority,” in contradistinction to “proof of stake.” The problem with Proof-of-Authority systems is that a user has to trust these designated signers to behave appropriately and not maliciously. If a centralised group of signers steals or cheats the user — or mismanages their private keys and is hacked — a user can do nothing about it. Therefore, Axelar has created a decentralised and dynamic set of validators to custody or sign payloads from users in a way that is trust-minimised (i.e. a permissionless protocol and incentives provided by the AXL token enforce that parties are responsible for signing or custodying payloads via mechanisms such as cryptography, consensus and economics). Axelar uses threshold encryption, a decentralised network and slashing economics to ensure that all validators behave honestly and user intent is executed across chains securely, safely and correctly.

In general, Proof-of-Authority setups have resulted in hundreds of millions in funds lost to security breaches. The Axie Infinity (Ronin Bridge) hack is a recent, costly example. More decentralised approaches can solve the problem of risks encountered by entrusting a designated group with our intent to move across chains. However, thoughtful approaches are still needed. Wormhole was hacked due to an operational error: a code vulnerability was exposed on their GitHub before it was patched. LayerZero, a well-known decentralised bridge network, leaves critical security decisions up to the application developer and user. Nomad, another well-known project, puts safety behind liveness (if the network halts, transactions are not safe). Nomad recently suffered a multimillion-dollar hack due to a vulnerability left unaddressed in its codebase. Axelar code is rigorously and regularly reviewed by auditors; audits are published here. Axelar code is open-source; a multi-million-dollar bug-bounty program encourages white-hat developers to search for vulnerabilities. Loss-prevention measures are also enabled, including mandatory key rotations, and the ability to disconnect compromised chains quickly, set rate limits and cap transaction amounts.

Axelar solves the security problems that are apparent in other interoperability networks by leveraging threshold encryption and a Proof-of-Stake network for security and consensus whilst simultaneously solving the usability problems presented by pairwise bridges. The user barely has to lift a finger when an application they are interacting with leverages Axelar.

There are other high-quality solutions that match Axelar in terms of security, safety and usability, such as Inter-Blockchain Communication (IBC). However, IBC is restricted in that it requires extensive integration work to connect to blockchains outside of the ecosystem it was built for (Cosmos). Ultimately, Axelar is the premier solution that solves all interoperability problems faced by other solutions and is unmatched when it comes to security, usability and interconnectivity, as Axelar can seamlessly connect to any type of blockchain, regardless of the underlying technology.

Unlocking new use-cases for the cryptocurrency ecosystem with Axelar

As mentioned earlier, Axelar can facilitate interoperability interactions that are far more intricate than cross-chain transfers alone. Axelar opens up a multitude of possibilities for users to engage with different chains without having to leave their source chain. This is powerful: users can take actions cross-chain using tools familiar to them, such as native wallets and currencies. Let’s dig in.

One example of what is made possible with Axelar’s network is a Cosmos user instantly receiving USDC to use on Osmosis from a centralised exchange such as Coinbase, without needing to use Ethereum. As it stands right now, if a user has USDC on a centralised exchange and wants to withdraw it to a decentralised exchange, it is highly likely they will only be able to withdraw USDC to a network such as Ethereum. This is a terrible user experience for Cosmos users, who need to receive USDC on Ethereum first, before bridging it to Osmosis. Not only is this an unnecessary number of steps, but the user also needs to purchase ETH in order to pay gas costs to move across chains. With the advent of Axelar (as well as Interchain Accounts), if a user provides a centralised exchange with an Ethereum address that is being observed by Axelar validators, the USDC will arrive on Osmosis without the user needing to take any extra actions or pay any extra fees. This is possible because validators in Axelar’s network observe payloads incoming to a Gateway (in this case on Ethereum), and the Axelar network understands how to translate and route them cross-chain. Once a payload arrives on Ethereum, Axelar can create an address for the user on Osmosis to receive the USDC. As a blockchain connecting blockchains, Axelar can execute logic that assembles multiple steps into one for users taking actions cross-chain. In this example, Osmosis users will be able to withdraw from centralised exchanges in one step, even if the centralised exchange does not provide that option natively. This will unleash a new wave of liquidity into DeFi apps and other decentralised applications, like Osmosis.

The power of Axelar’s network can also be leveraged by users outside the Cosmos ecosystem. For example, an Ethereum user who does not want to leave the comfort of the network can use Axelar to take actions on applications that exist outside of Ethereum. To elaborate, let’s say a user wants to swap ETH for AVAX and then borrow USDC on Avalanche with AVAX as collateral, in a decentralised manner. Right now, a user would probably have to:
  1. Send ETH to a centralised exchange using MetaMask and pay fees in ETH;
  2. Sell ETH for USDT/USDC on the exchange;
  3. Buy AVAX with USDT/USDC in another transaction on the exchange;
  4. Send the AVAX to an Avalanche wallet and pay fees in AVAX;
  5. Navigate to a lending protocol front-end, deposit AVAX, and pay AVAX fees with an Avalanche wallet;
  6. Borrow USDC on the lending protocol with an Avalanche wallet, paying another AVAX fee.

Axelar completely abstracts away these extra steps and payments by creating a sequence of instructions for the network to execute cross-chain on behalf of a user.

In this scenario, if a user was on Ethereum as a source chain, the user would use MetaMask to send intent to a Gateway connected to Axelar, alongside a payment of ETH that is requested by network services in order to execute the intent cross-chain. Axelar network then abstracts the payment flow: ETH is converted into AXL to pay validators and then into AVAX to pay fees on Avalanche. A user does not have to leave MetaMask, or Ethereum, or purchase any other currencies in order to transact on other chains. (Notably, this process may create deflationary effects in Axelar, as “change” from these conversions is either refunded to the user, or applied toward potential buyback-and-burn programs. More on this from Axelar Foundation, here). At this point, Axelar has done all of the work on behalf of the user and a user has successfully borrowed USDC on a lending protocol in Avalanche. Axelar opens up new possibilities for users to take cross-chain actions without needing to learn new tools or purchase new currencies to pay fees.
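The payment abstraction can be pictured as a simple chain of conversions: the ETH attached by the user is exchanged into AXL to pay Axelar validators and into AVAX to cover execution on the destination chain, with any remainder treated as change. The sketch below uses invented exchange rates and fee amounts purely to illustrate the flow; it is not Axelar’s actual fee logic.

```python
# Illustrative sketch of cross-chain fee abstraction.
# Exchange rates and fee amounts are invented for the example.

def settle_cross_chain_fee(eth_paid: float,
                           eth_per_axl: float,
                           eth_per_avax: float,
                           axl_validator_fee: float,
                           avax_gas_fee: float) -> dict:
    """Convert the user's single ETH payment into the AXL and AVAX fees
    owed along the route, returning any leftover ETH as 'change'."""
    eth_for_axl = axl_validator_fee * eth_per_axl
    eth_for_avax = avax_gas_fee * eth_per_avax
    change = eth_paid - eth_for_axl - eth_for_avax
    if change < 0:
        raise ValueError("fee payment does not cover the cross-chain route")
    return {"axl_paid": axl_validator_fee, "avax_paid": avax_gas_fee, "eth_change": change}

print(settle_cross_chain_fee(eth_paid=0.01,
                             eth_per_axl=0.0004, eth_per_avax=0.012,
                             axl_validator_fee=5.0, avax_gas_fee=0.3))
```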

AXL — The Token of Axelar Network

Axelar is a Proof-of-Stake network built with Cosmos SDK and Tendermint consensus. The AXL token is used to secure the decentralised network. For a refresher, stake is the value of a token that has been delegated to validators to secure a Byzantine system. The more stake (value) that has been delegated, and the more diverse the pool of token-holders and validators, the harder it is to attack the system. At this point, it is extremely unlikely for a validator to be malicious in any case given it would be explicitly risking a large sum of its own stake and implicitly risking its reputation in the cryptocurrency ecosystem. Even in a scenario where the value at stake in AXL is less than the amount being transferred, validator collusion toward a malicious outcome is unlikely, given the explicit reward for doing so would likely be very low and reputation risk extremely high.

Holders of AXL have a strong incentive to delegate their AXL to a validator (or several) to secure the network. Validators earn block rewards for successfully proposing new blocks that are verified by other validators in the network. A validator has more opportunity to propose blocks (and hence earn more rewards) if it has more stake delegated to it. Delegators are the ones that stand to benefit the most from block rewards because they earn the majority of them (often >90%), whilst validators take a commission for securing the network on their behalf (i.e. for running the node that participates in the Axelar network’s consensus). If an AXL holder does not delegate, they risk being diluted as they will miss out on the block rewards being received by other AXL stakers and validators.

Token-holders also have an incentive, in the form of their long exposure to AXL, to delegate AXL to validators that they believe will secure the network in the best possible fashion. Delegators can review data on the full list of validators via the Axelar block explorer, Axelarscan, at axelarscan.io/validators. The more AXL that is staked with a validator, the more voting power that validator has (i.e. the greater its chance of being chosen to produce the next block), but this does not lead to a concentration of voting power, because Axelar has implemented quadratic voting. In short, quadratic voting means a validator’s voting power is equivalent to the square root of its delegated stake: to get one vote, a validator needs 1 token; to get 2 votes it needs 4; to get 3 votes, 9 tokens are needed, and so on. The validator set of Axelar is limited, so AXL token-holders can play a direct role in ensuring the active validator set is performant and available by delegating to high-quality validators. Ideally, Axelar’s network is decentralised enough that breaking its liveness guarantees would take not just a lot of stake but also a lot of validators (i.e. validator diversification).
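The square-root relationship described above can be expressed in a couple of lines; the stake amounts below are arbitrary examples.

```python
# Quadratic voting: voting power is the square root of delegated stake.
import math

def voting_power(delegated_stake: float) -> float:
    return math.sqrt(delegated_stake)

for stake in (1, 4, 9, 1_000_000, 4_000_000):
    print(f"{stake:>9} tokens staked -> {voting_power(stake):>7.1f} votes")
# Quadrupling a validator's stake only doubles its voting power.
```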

Aside from securing Axelar’s network, AXL is also used by token-holders to participate in governance. Because Axelar is built with the Cosmos SDK, all governance proposals are created and voted upon on-chain. The more AXL a token-holder holds in the network, the more votes they have on governance proposals. For example, governance proposals might cover connecting new chains or an upgrade that improves the features of Axelar’s network. However, AXL token-holders are not required to participate in governance. In networks built using the Cosmos SDK, token-holders inherit the vote of the validators they delegate to if they do not vote themselves. If a user does not agree with the vote of a validator, the user always has the option to change the vote inherited from their validator. All in all, on-chain governance in Cosmos SDK chains runs smoother than most and is a great way for token-holders to actively participate and contribute to decentralised networks.

Finally, AXL is used to pay transaction fees to validators in Axelar’s network. For example, a user active on source-chain Ethereum who signals intent to take actions on destination-chain Avalanche would pay fees in ETH to Axelar’s Gateway on Ethereum. Axelar’s SDK provides services that observe the Gateways and then convert the ETH fee into AXL to pay Axelar validators and AVAX to pay Avalanche validators (all the while taking a cut for doing so). In essence, AXL is the fuel for validators to come to consensus on cross-chain intent. Demand for AXL comes from services such as the Axelar SDK, which convert other currencies into AXL in order to pay validators for their work. Anyone can provide these services; they can even be handled manually by the user, if desired. The more usage Axelar’s network gets, the more currencies are converted into AXL to pay validators, and the more demand there is for AXL.

What makes Axelar valuable?

There are many reasons why Axelar is a valuable network. The network is valuable for developers, users and token-holders.

For developers, Axelar is useful due to the Turing-complete programmability the network facilitates, as well as the ability to compose functions cross-chain. Starting with composability, developers that build on top of Axelar can build one-click user experiences consisting of multiple components that interact with each other cross-chain. (Read more for an introduction to architecture approaches, when composing cross-chain.) For example, a developer might choose to build a yield optimiser, whereby a financial strategy reads yield of a certain asset across multiple chains and deploys more or less capital (rebalancing) on a connected chain in order to optimise yield for the next block. Axelar is also entirely programmable, which means that validators in Axelar’s network can take any action on behalf of a user cross-chain, no matter what it is. For example, a developer could choose to build a governance aggregator application whereby a validator set can vote on behalf of a user in a DAO, cross-chain, in the same direction as the majority vote (e.g. vote YES if majority vote is already YES). Related to programmability, Axelar network is Turing-complete, meaning any program that is created by developers can be run by the network, given enough memory and time. These features are possible because Axelar is a blockchain that connects blockchains, and cannot be duplicated by other interoperability networks. All in all, Axelar is the most customisable, flexible, programmable and composable interoperability network.

Users of Axelar can look forward to greater liquidity in their respective ecosystems, a better user experience, lower transaction costs and new use-cases. Liquidity will be able to flow freely across blockchains connected to Axelar, and as a result, users will have access to assets that were not previously available on their chains. The experience of moving cross-chain improves because users will not need to hold multiple tokens across chains to take actions, nor make a separate transaction on each chain. Any cross-chain transaction can be paid for with one token, and instructions can be bundled by validators to execute atomically. Users will also be able to access new types of applications that exist on chains other than the one they currently interact with. For example, a user on Ethereum might be able to use a cross-chain AMM built on Axelar to swap Ethereum assets for assets on Avalanche. Axelar and its partners are already working with the largest DEXes on multiple chains (Osmosis, a Cosmos project, is a notable example), who are building these cross-chain liquidity networks. Moreover, many of these projects are using Axelar’s unique functionality to build user onramps (such as one-time deposit addresses) that can rival centralised exchanges for ease-of-use and welcome users seamlessly, regardless of what tokens they hold.

AXL is the fuel to the Axelar economy. The value of AXL comes from how it is used to secure the network, govern the network and pay node operators in the network to execute cross-chain intent. Holding AXL gives users a way to directly contribute to the sustainability and security of the network.

Axelar Overview

To conclude, Axelar is a decentralised and permissionless interoperability network built with the Cosmos SDK, combining properties such as many-to-many connectivity, programmability, composability and Proof-of-Stake security into the most robust interoperability network available to users. Axelar will be secured by AXL, which is used to secure the Proof-of-Stake network, as well as for governance and payments to validators for executing cross-chain intent. Axelar will unlock a variety of use-cases that have not yet been seen, such as interacting cross-chain with blockchains that might not speak the same language as the user’s source blockchain. For the first time, cross-chain user experience will be seamless, as a wave of applications is currently being built on top of Axelar to leverage the profound properties of the interoperability network. Users who enter Web3 via one blockchain will easily access applications and assets on other blockchains, perhaps without even knowing they are doing so. Axelar solves the problems of centralised bridges and interoperability networks to produce what can be argued is the safest, most secure and best cross-chain user experience available to users.

Acknowledgements: Thanks to Galen Moore from Axelar for his review of this article.

About the Author

Xavier Meegan is Research and Ventures Lead at Chorus One.

Medium: https://medium.com/@xave.meegan
Twitter: https://twitter.com/0xave

About Chorus One

Chorus One is one of the largest staking providers globally. We provide node infrastructure and closely work with over 30 Proof-of-Stake networks.

Website: https://chorus.one
Twitter: https://twitter.com/chorusone
Telegram: https://t.me/chorusone
Newsletter: https://substack.chorusone.com
YouTube: https://www.youtube.com/c/ChorusOne

About Axelar

Axelar delivers secure cross-chain communication for Web3, enabling dApp users to interact with any asset or application, on any chain, with 1 click.

Website: https://axelar.network/
Twitter: https://twitter.com/axelarcore
Discord: https://discord.com/invite/aRZ3Ra6f7D
Blog: https://axelar.network/blog
YouTube: https://www.youtube.com/c/Axelarcore

Core Research
Networks
Solana Validator Economics
Blockchains not only need to be technically good. Besides the protocol and implementation levels, a key element in the success of a decentralized system is having different and independent groups using it, operating it, and governing it.
August 23, 2022
5 min read

Blockchains not only need to be technically good. Besides the protocol and implementation levels, a key element in the success of a decentralized system is having different and independent groups using it, operating it, and governing it. The economic incentives in decentralized systems to achieve such participation by all these different groups have gained attention from researchers, who are now interested in “tokenomics” as a new field of study.

In this article, we are going to explore Solana’s economics, focusing on the incentives for network node operators, or validators. We conducted an analysis of the inflation model, the costs and rewards to validators and stakers, as well as the current network activity levels. We also estimate the minimum stake required for a validator to break even, and the impact of different market scenarios, considering the most important variables and how they affect validator profitability.

For this purpose, we built the Solana Validator Dashboard and the Solana Validation Cost Estimator.

Inflation Design

Solana validators currently earn from two sources:

  1. protocol-based rewards: generated from inflationary issuances from a protocol-defined inflation schedule.
  2. transaction fees: currently, 50% of the transaction fee is burned and the remaining 50% goes to the validator leader of the respective slot.

The Solana inflation design defines SOL emissions as starting at an 8% annual rate and decreasing by 15% every year. The model was activated on February 10th, 2021 with the payment of 213,841 SOL.

Validators started to receive rewards from inflation in February 2021.
Source: Solana Validator Dashboard

As of July 2022, Solana’s inflation rate is around 6.8%. The staking yield is equivalent to 9.1%, as 75% of the total supply is currently staked (i.e. total inflation rewards are distributed to staked tokens only, resulting in a dilution of non-staked tokens). This nominal rate does not reflect the actual yearly emission rate; it can be considered a target instead, and the mechanism behind it is broken down below.

Solana’s inflation model assumes 400 ms block times, even though the documentation mentions that the current implementation targets 800 ms block times. The recent average is around 650 ms, but with high variance.

Solana block times over the 35-day period ending August 9, 2022

Although Solana remains extremely performant to the everyday user, the difference in slot times directly impacts the economics and business viability of running a validator on Solana. Longer block times will result in smaller rewards, given a smaller number of epochs in a calendar year, decreasing the amount of SOL distributed to network participants.

Inflation Rewards Pool

In every epoch, Solana calculates the number of tokens instantiated for the inflation pool. The result is the amount of SOL to be distributed to validators and stakers as inflation rewards, according to the voting and staking status of the previous epoch. Approximately 0.45 SOL is currently allocated and distributed among eligible validators in each slot, or roughly 195 thousand SOL per epoch.

Block times impact inflation rewards because the schedule tapers the initial 8% rate according to how many slots have passed since inflation activation on Mainnet, expressed as a proportion of how many slots are assumed to fit in one year.

Comparing effective inflation rate, given average block times

Considering an average block time of 650 ms, the inflation being distributed in every epoch is equivalent to a 4.1% yearly rate and the stake yield falls to 5.5%, instead of the 6.8% and 9.1% previously assumed.
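A minimal sketch of that adjustment is shown below, using the figures above (a 6.8% nominal rate, the 400 ms block-time assumption, and 75% of supply staked). Scaling the nominal rate by 400 ms over the observed block time is our simplification of the slot-based taper; it lands close to the roughly 4.1% and 5.5% figures quoted.

```python
# Effective Solana inflation and staking yield under slower block times.
# Simplification: the protocol's slot-based schedule is approximated by
# scaling the nominal rate by (assumed block time / observed block time).

NOMINAL_RATE = 0.068         # ~6.8% nominal inflation (July 2022)
ASSUMED_BLOCK_TIME_MS = 400  # block time assumed by the inflation schedule
STAKED_FRACTION = 0.75       # ~75% of total supply is staked

def effective_rates(observed_block_time_ms: float):
    effective_inflation = NOMINAL_RATE * ASSUMED_BLOCK_TIME_MS / observed_block_time_ms
    staking_yield = effective_inflation / STAKED_FRACTION
    return effective_inflation, staking_yield

for bt in (400, 650, 800):
    infl, yld = effective_rates(bt)
    print(f"{bt} ms blocks -> inflation {infl:.1%}, staking yield {yld:.1%}")
```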

Commission Fee

Also relevant to validator economics is the commission. Strictly speaking, stake owners, a.k.a. delegators, earn the inflation rewards; validators earn a portion of them, represented by the commission. In the plot below, we can see that a common fee for public nodes is around 10%. Only 81 nodes charge a fee of 5% or lower. A 100% commission is assumed to indicate a private node (100 validators).

Amount of SOL staked versus commission rate for each validator.
Stake status as of August 2, 2022.

Transaction Fees

Block rewards from transaction fees vary according to network activity. The recent average is around 0.01 SOL per slot. The total per epoch increases with voting power, as the number of leader slots attributed to a validator is proportional to its stake.

Rewards from transaction fees per epoch as a proportion of inflation rewards.

Theoretically, as inflation decreases over time, validators’ rewards would be supplemented by an increase in transaction fees. This assumption may eventually hold as the network matures, but the plots below show that it does not hold today:

1. The market’s cyclic nature: the number of non-vote transactions will not necessarily grow over time. Total transactions (vote + non-vote) peaked in October 2021 at around 180 thousand in one day and fell to fewer than 100 thousand transactions in April 2022.

The number of transactions (orange) and Rewards from transaction fees (gray). Inflation activation is shown in blue
Source: Solana Validator Dashboard

2. The Solana network has invested in growing its validator set. The plot below shows the number of unique reward recipients (addresses).

Number of validators (orange) and Rewards from transaction fees (gray) since inflation activation (blue)
Source: Solana Validator Dashboard

3. As a consequence of voting-power dilution and lower network activity, rewards obtained from transaction fees have decreased for individual validators.

Number of slots under C1 leadership (green) and Rewards from transaction fees (orange)
Source: Solana Validator Dashboard

Validator Costs

Hardware and Personnel

We split the cost into i) hardware, colocation, and bandwidth, to host the validator and ii) personnel, which can vary significantly. The official recommendations can be found on the Solana Documentation.

Small Validator

  • Hardware: a single node on the most budget hardware that can still run Solana.
  • Personnel: hobbyist who spends a few hours/week.

Medium Validator

  • Hardware: a pair of nodes with an average provider and 1 Gbps traffic.
  • Personnel: shared site reliability engineering team, equivalent to 0.25 full-time employees focused on Solana.

Professional Validator

  • Hardware: a pair of nodes with a specialized provider and >10 Gbps traffic.
  • Personnel: dedicated site reliability engineering team, equivalent of 1.5 full-time employees focused on Solana.

Voting Costs

The vote is an affirmation that a block it has received has been verified, as well as a promise not to vote for a conflicting block. — Solana Docs

Validators are expected to vote on the validity of the state proposed by the slot leader. A validator node, at startup, creates a new vote account and registers it in the network. On every new block, the validator submits a new vote transaction and pays the transaction fee (0.000005 SOL).
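Since a vote transaction is submitted roughly once per slot, the yearly voting cost can be approximated directly from the block time. The sketch below is a rough estimate that lands in the ballpark of the ~200 SOL per year figure used in the break-even analysis further down.

```python
# Approximate yearly voting cost for a Solana validator.
# Assumes one vote transaction per slot at the fixed 0.000005 SOL fee.

VOTE_TX_FEE_SOL = 0.000005
SECONDS_PER_YEAR = 365 * 24 * 3600

def yearly_voting_cost(block_time_ms: float) -> float:
    slots_per_year = SECONDS_PER_YEAR / (block_time_ms / 1000)
    return slots_per_year * VOTE_TX_FEE_SOL

for bt in (400, 650, 800):
    print(f"{bt} ms blocks -> ~{yearly_voting_cost(bt):.0f} SOL/year in vote fees")
```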

SOL Token Acquisition

Validators usually own (a portion or the total of) the staked tokens, a.k.a. self-staking. In this case, the cost of the tokens depends on the average acquisition price. For the purpose of the current analysis, we assume the validator self-stakes only 100 SOL, acquired at a price of US$ 50.

The Solana Foundation promotes the growth of the validator set through the Solana Delegation Program. To receive a 25,000 SOL delegation, small validators applying must meet the “baseline” criteria, which include also running a node on Testnet. Those who meet the baseline criteria as well as the “bonus” criteria can receive an extra (dynamic) amount in delegation. A recent post on stake delegation strategies (why delegation programs are needed, their goals, and criteria) can be found in How can networks nurture decentralization?

Solana Delegation Program, baseline criteria example
Solana Delegation Program, bonus criteria example

Break-even

In summary, a Solana validator’s profitability depends on the current inflation rate, block times (reflected in the number of epochs in one year), voting power, the total supply, the number of transactions, the cost structure, and the SOL market price.

For the three operational levels stated above, we will look at three different economic scenarios: optimistic, average, and pessimistic, with the average scenario being the closest to the current values.

The average market price over one year is fixed at $50 for the purpose of the break-even analysis. Different price scenarios are evaluated in a later section.

Break-Even Third Party Stake (thousands of SOL)

We found that roughly 40,000 SOL in third-party stake would be required for a small validator to break even in the average scenario, which is close to current levels. The figure grows to 253,000 SOL for the medium setup, and a professional validator would need more than 1.3 million SOL staked.

Minimum thousands of SOL in third party stake to break-even

For a validator with a 0.01% stake share, we estimate a 25 SOL reward from transaction fees in one year. The voting process costs around 200 SOL per year for each node operator. Therefore, small validators depend on inflation rewards to break even and, ideally, become profitable. Around 350 thousand SOL staked would be needed to fully cover the voting cost when considering rewards from transaction fees only.
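As a rough cross-check of those figures, the sketch below estimates transaction-fee rewards for a given stake share and the stake at which fees alone would cover voting costs. The 0.01 SOL per slot and the 50% burn come from earlier sections; the 650 ms block time and the ~365M SOL total stake are our own rounding, so the outputs land close to, but not exactly at, the numbers quoted above.

```python
# Rough estimate of transaction-fee rewards versus voting costs.
# Assumptions: ~0.01 SOL of fees collected per slot, half of which is
# burned; one 0.000005 SOL vote transaction per slot; ~365M SOL staked
# in total; 650 ms block times. Figures are rounded, not live data.

FEES_PER_SLOT_SOL = 0.01
LEADER_SHARE = 0.5          # 50% of fees burned, 50% to the slot leader
VOTE_TX_FEE_SOL = 0.000005
TOTAL_STAKE_SOL = 365_000_000
SLOTS_PER_YEAR = 365 * 24 * 3600 / 0.65

def yearly_fee_rewards(stake_sol: float) -> float:
    """Leader slots (and hence fee rewards) are proportional to stake share."""
    stake_share = stake_sol / TOTAL_STAKE_SOL
    return stake_share * SLOTS_PER_YEAR * FEES_PER_SLOT_SOL * LEADER_SHARE

voting_cost = SLOTS_PER_YEAR * VOTE_TX_FEE_SOL
print(f"Fee rewards at 0.01% stake: ~{yearly_fee_rewards(0.0001 * TOTAL_STAKE_SOL):.0f} SOL/year")
print(f"Voting cost:                ~{voting_cost:.0f} SOL/year")

# Stake at which fee rewards alone cover the voting cost:
break_even_stake = voting_cost / (SLOTS_PER_YEAR * FEES_PER_SLOT_SOL * LEADER_SHARE) * TOTAL_STAKE_SOL
print(f"Stake for fees to cover voting: ~{break_even_stake:,.0f} SOL")
```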

Considering active stake on August 2nd:

  • 89.5% of validators control less than 115 thousand SOL each; and
  • 3.6% of validators control more than 1 million SOL each.
The number of validators by the amount of stake. Stake status as of August 2022.

Although the number of validators may be considered high compared to other Proof of Stake networks, 71 accounts are responsible for 57% of the total 365 million SOL staked.

The number of validators by the amount of stake. Stake status as of August 2022.

The majority of validators currently stake between 80 and 90 thousand SOL, as seen in the plot below. There are at least 138 validators (7%) with stakes smaller than 40 thousand SOL, the estimated break-even level for a small validator.

The number of validators by the amount of stake. Stake status as of August 2, 2022.

Market Price Exposure

Simulation shows that medium and professional validators are more sensitive to fluctuations in the SOL market price than small validators. If the average SOL price over a year were $75, the break-even level would decrease by more than 30% for the medium and professional tiers, but only 7% for small validators. A similar effect, in the opposite direction, is found if the average price drops to $25.

Adjusting Inflation Model

In PoS networks, adopting an accurate inflation model in conjunction with direct incentives in the form of delegation is important to:

  • attract new, independent validators, promoting decentralization and censorship resistance;
  • increase staking levels and interest for SOL by long-term holders;
  • guarantee the incentive to existing validators, given the current market price and network activity level.

Solana validators and stakers have seen rewards decrease with higher block times, compared to the rewards projected by the initial inflation model. As additional factors, the network has experienced a contraction in non-vote transactions in recent months as well as an expansion of the validator set.

According to the break-even levels discussed above, an 8.85% inflation target would be needed to produce an effective 5.5% emission in one year at 650 ms block times (6.3% at 550 ms block times). Assuming 75% of the total supply is delegated to validators, the staking yield would become 7.1% in one year and the minimum stake required to break even would drop by 24%, to 32 thousand SOL.

The inflation rate is even more relevant for small validators’ profitability than transaction-fee rewards. Adjusting the inflation model to the actual network configuration would reinforce the interest of those validators staking less than 40 thousand SOL. Under the 8.85% rate simulated above, approximately 21 more validators (1.12%) would reach the break-even level; that is the number of validators currently holding between 30 and 40 thousand SOL in stake.

Conclusion

In this study, we explored the variables behind the Solana validator economics, estimating profitability levels for different market scenarios.

We found that:

  • The actual inflation emission rate is around 4.1% per year, instead of the 6.8% theoretical target, because of increased block times;
  • 40 thousand SOL would be a realistic amount for a small validator to break even;
  • 7% of validators control stake amounts smaller than the break-even level;
  • Voting costs average 200 SOL per year. Validators controlling less than 350 thousand SOL depend on inflation rewards to fully cover the voting cost;
  • 71 validators control more than 1 million SOL each, accounting for 57% of the total stake;
  • Medium and professional validators are more sensitive to fluctuations in the SOL market price, while small validators are more sensitive to the inflation parameters;
  • A 30% adjustment in the inflation target would bring the effective rate to 5.5% per year, better reflecting the initial model, and reduce the minimum amount to break even by 24% for all validator sizes.

Fee markets are now live on Solana, but adoption of the priority fee by dApps and general users is currently low, with only around 4% of transactions paying a fee above the fixed rate. Adoption has been trending up since the feature launched in late July.

Go to the Solana Validation Cost Estimator on Getguesstimate to explore the relevant variables, their interactions, and correlations. Thanks to Ruud, a Chorus One engineer, for building it.

“Look below the surface and you will find that all seemingly solo acts are really team efforts.” —John C. Maxwell

This article was brought to you by Chorus One, with meticulous review by Felix Lutsch and Ruud.

About Chorus One

Chorus One is one of the largest staking providers globally. We provide node infrastructure and closely work with over 30 Proof-of-Stake networks.

Website: https://chorus.one
Twitter: https://twitter.com/chorusone
Telegram: https://t.me/chorusone
Newsletter: https://substack.chorusone.com
YouTube: https://www.youtube.com/c/ChorusOne
