Core Research
A deep-dive into Eth-staking-smith
Performant Ethereum validator key management.
January 13, 2023
5 min read

Authors: Jennifer Parak, Maksym Kulish

One of the most important events of 2022 in the crypto community was The Merge upgrade of the Ethereum protocol, which switched Ethereum from its legacy Proof-of-Work chain to the Proof-of-Stake Beacon Chain. It proved that fundamental innovation is possible even for the oldest and largest decentralized systems, without any disruption to protocol users.

At Chorus One, we have been working on securing next-generation Ethereum since the Beacon Chain launched in 2020, and we operate several thousand validators on mainnet today. Our new product OPUS — an Ethereum Validation-as-a-Service API — is designed to enable any organization or individual to run staking validators on the Ethereum Beacon Chain with a non-custodial, permissionless approach: we require customers to specify their own withdrawal and fee recipient addresses, so they remain in possession of both their staked funds and their rewards. This post focuses on how validator keys are provisioned and stored within our Validation-as-a-Service API product, and describes some of the challenges we faced and the solutions we created along the way.

Background

Ethereum Staking Keys

The Merge introduced two new types of keys involved in securing the Ethereum chain, in addition to the legacy chain wallet keys, which remain unchanged under the Beacon Chain [1]. These are the Signing (Validator) key pair and the Withdrawal key pair. Beyond their new functionality, Signing keys also use a new cryptographic signature scheme called BLS, which stands for Boneh-Lynn-Shacham; this means older key generation tools will not work for creating Signing keys. BLS signatures, specifically those over the BLS12-381 curve, are used in Beacon chain block signatures and attestations. They make it possible to aggregate multiple signatures and verify them in a single operation, which is an outstanding improvement in scalability [2].
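To make the aggregation property concrete, here is a minimal sketch in Rust, assuming the blst crate's min_pk API (the same BLS12-381 library referenced later in this post); treat the exact function signatures as approximate rather than authoritative.

```
// Sketch: several validators sign the same message; the signatures are
// aggregated into one and verified in a single operation.
use blst::min_pk::{AggregateSignature, PublicKey, SecretKey, Signature};
use blst::BLST_ERROR;

// Domain separation tag used for beacon-chain BLS signatures.
const DST: &[u8] = b"BLS_SIG_BLS12381G2_XMD:SHA-256_SSWU_RO_POP_";

fn main() {
    let msg = b"attested block root";

    // Three validators sign the same message with their private Signing keys.
    // (Fixed IKM here only to keep the example short; real keys come from
    // properly collected entropy.)
    let keys: Vec<SecretKey> = (0u8..3)
        .map(|i| SecretKey::key_gen(&[i; 32], &[]).unwrap())
        .collect();
    let pubkeys: Vec<PublicKey> = keys.iter().map(|sk| sk.sk_to_pk()).collect();
    let sigs: Vec<Signature> = keys.iter().map(|sk| sk.sign(msg, DST, &[])).collect();

    // Aggregate the three signatures and verify them in one step.
    let aggregate = AggregateSignature::aggregate(&sigs.iter().collect::<Vec<_>>(), true)
        .unwrap()
        .to_signature();
    let pubkey_refs: Vec<&PublicKey> = pubkeys.iter().collect();
    let result = aggregate.fast_aggregate_verify(true, msg, DST, &pubkey_refs);
    assert_eq!(result, BLST_ERROR::BLST_SUCCESS);
}
```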

Like most other Proof-of-Stake blockchains, next-generation Ethereum depends on the functioning of validators to secure the transaction flow. Validators are members of the network who lock a portion of their Ether (a minimum of 32 ETH) to become responsible for proposing new signed blocks of transactions and for verifying the signatures of other validators, which is called attesting. Normally, every Ethereum validator is expected to attest to one slot per Ethereum epoch (around 6.4 minutes); and for every slot in every epoch, one validator is pseudo-randomly chosen to produce a block of transactions to be attested by the others. Validators are rewarded for both block proposals and attestations. The mechanism for signing blocks and verifying the signatures of others relies on the Signing key pair. Verification works because the public part of every Signing key (the Public Signing Key) is published on-chain, so every signature made with the private part (the Private Signing Key) can be verified by every other validator. Despite having the power to create blockchain content, Signing keys cannot be used to move any funds, including staked funds; they only sign the block and attestation content provided by the peer-to-peer network of Ethereum nodes.

The Withdrawal key pair is used neither for blocks nor for attestations, but it has control over staked funds. After the Shanghai fork, withdrawals will be activated, enabling funds to be moved to an owner-controlled withdrawal address specified at deposit time. With EIP-4895, withdrawals are enabled in a push-based fashion [3]: funds that were previously locked on the consensus layer upon depositing are automatically pushed to the execution layer as a system-level operation, which means users won't have to pay any gas for a withdrawal transaction. Users who specified a BLS withdrawal address in their deposit data need to broadcast a BLS_TO_EXECUTION_CHANGE message to the beacon chain to update their withdrawal credentials to an execution address.

Finally, when the validator successfully proposes a block, a special Fee Recipient address receives the accumulated gas fees from the block. Since the Fee Recipient is not directly involved in staking, we will largely omit it in this post.

More information about different types of keys involved in Ethereum staking can be found in the following resources: [4], [5]

Managing Validator keys for OPUS Validation-as-a-Service API

As part of the OPUS Validation-as-a-Service API, we require customers to retain ownership of Withdrawal keys, so that staked funds can never be controlled or accessed by Chorus One. A Signing key, however, is different: since Chorus One is the party responsible for hosting and maintaining the Ethereum validator, the inner workings of the Validation-as-a-Service API require us to generate, load, and store Signing keys. A robust solution for key management is therefore an essential part of our Validation-as-a-Service API.

Early in the project lifecycle, we used the staking deposit command line interface (CLI) provided by the Ethereum Foundation (https://github.com/ethereum/staking-deposit-cli). While the staking CLI is a great tool for solo/home stakers, we realized that it was not designed for our use case. First of all, staking-deposit-cli by default stores newly generated keystores on the filesystem, posing a potential risk of leaking key material. While it is possible to use infrastructure-specific workarounds like ramdisks to mitigate the threat, such workarounds would add complexity and failure points to the platform. The open-source nature of staking-deposit-cli allowed us to fork the source code and modify it to our needs, but the lack of a thorough automated test suite meant we had a hard time keeping our changes in sync with upstream updates. Finally, our codebase is entirely Rust, and having to support a Python CLI within the infrastructure, including promptly patching all Python dependencies to maintain a good security track record, put an additional burden on the development team. In the end, we decided to pursue an alternative approach to generating keys, which we describe in the next section.

Eth-staking-smith

Having endured even more difficulties with staking-deposit-cli when generating Ethereum keys at scale, our Ethereum team decided to tackle the problem during our company-wide engineering hackathon, where we built an MVP of an Ethereum key generation tool written in Rust. This was the birth of the Eth-staking-smith project.

Component diagram of Eth-staking-smith

Eth-staking-smith can be used as a CLI tool or as a Rust library to generate Signing keys and deposit data derived from a new mnemonic, or to regenerate deposit data from an existing mnemonic. These use cases were implemented to provide the same functionality as staking-deposit-cli while avoiding the problems mentioned above.

Example command to generate keys from a newly generated mnemonic:

eth-staking-smith new-mnemonic --chain mainnet --keystore_password testtest --num_validators 1

Example command to generate keys from an existing mnemonic:

eth-staking-smith existing-mnemonic --chain mainnet --keystore_password testtest --mnemonic "entire habit bottom mention spoil clown finger wheat motion fox axis mechanic country make garment bar blind stadium sugar water scissors canyon often ketchup" --num_validators 1 --withdrawal_credentials "0x0100000000000000000000000000000000000000000000000000000000000001"

For both use cases, Eth-staking-smith will generate the following key material:
  • Private Signing keys
  • Keystores
  • Mnemonic (existing or newly generated)
  • Deposit data smart contract properties

Let’s zoom into the generated key material

Private Signing keys

As mentioned above, the Private Signing key is used to sign every action taken by the validator. Eth-staking-smith outputs Signing keys without encryption because, in our use case, a remote API stores the key material immediately upon generation; that API implements encryption both for data in transit and for data at rest. We decided to make keystore use optional, but Eth-staking-smith can still generate encrypted keystores for users who need them.

Example:

{
  "private_keys": [
    "6d446ca271eb229044b9039354ecdfa6244d1a11615ec1a46fc82a800367de5d"
  ]
}

Keystores

The keystore is an encrypted version of the private Signing key in the specified format [6]. When generating keys with Eth-staking-smith, a keystore password can be specified, in which case keystore data will be output. Using a key derivation function (e.g. scrypt or pbkdf2), a decryption key is derived from the given passphrase and a set of strong built-in derivation parameters. The example below highlights the function field, which shows the key derivation function used. The keystore is a useful alternative that is less vulnerable than storing the private Signing keys in a plaintext file, since an attacker would need both the keystore file and the passphrase to decrypt it.

Example:

{
  "keystores": [
    {
      "crypto": {
        "checksum": {
          "function": "sha256",
          "message": "af14321c3083de535a0dd895b4e2fb156e6b0eda346120c8d7afb5277d3a489f",
          "params": {}
        },
        "cipher": {
          "function": "aes-128-ctr",
          "message": "8032685ad92a579e66328bbd6c747e41497dc6897c17cebbd83958394943924b",
          "params": {
            "iv": "da5699bb18ee7fea6095634a2fa05d18"
          }
        },
        "kdf": {
          "function": "pbkdf2",
          "message": "",
          "params": {
            "c": 262144,
            "dklen": 32,
            "prf": "hmac-sha256",
            "salt": "afb431f05b7fe02f253d9bc446ac686776541d38956fa6d39e14894f44e414d8"
          }
        }
      },
      "description": "",
      "name": null,
      "path": "m/12381/3600/0/0/0",
      "pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
      "uuid": "6dbae828-d0f0-42ed-9c06-d9079642ea08",
      "version": 4
    }
  ]
}
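As a rough illustration of what the kdf parameters in the example above mean in practice, the sketch below derives the decryption key from them following EIP-2335 [6]. It assumes the pbkdf2, sha2 and hex crates and is not Eth-staking-smith's actual code path.

```
// Sketch: deriving the EIP-2335 decryption key from the keystore parameters
// above (c = 262144, dklen = 32, prf = hmac-sha256).
use pbkdf2::pbkdf2_hmac;
use sha2::Sha256;

fn main() {
    let passphrase = b"testtest"; // the --keystore_password value
    let salt = hex::decode(
        "afb431f05b7fe02f253d9bc446ac686776541d38956fa6d39e14894f44e414d8",
    )
    .unwrap();

    let mut decryption_key = [0u8; 32]; // dklen = 32
    pbkdf2_hmac::<Sha256>(passphrase, &salt, 262_144, &mut decryption_key);

    // Per EIP-2335, bytes 0..16 become the AES-128-CTR key, and the checksum
    // is sha256(decryption_key[16..32] || ciphertext).
    println!("{}", hex::encode(&decryption_key[..16]));
}
```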

Mnemonic

The mnemonic, whether passed in by the user or newly generated, is returned as part of the output so that the user can store it safely. Further information on mnemonics can be found in reference [7].

Example:

{
  "mnemonic": {
    "seed": "ski interest capable knee usual ugly duty exercise tattoo subway delay upper bid forget say"
  }
}

Deposit data

Finally, the deposit data is returned, which is used to make the 32 ETH deposit through the Ethereum deposit contract and activate the validator. One of the most important fields in the deposit data is the withdrawal credentials.

By default, withdrawal credentials are BLS credentials derived from the mnemonic; however, a user might want to overwrite the derived withdrawal credentials with already existing ones.

The BLS credential format is called 0x00 credentials and is set to be deprecated some time after withdrawals are enabled. An alternative way to provide withdrawal credentials is to use a legacy Ethereum wallet address, prefixed by 0x01. Ethereum is pivoting from 0x00 (formerly eth2) credentials to 0x01 execution (formerly eth1) addresses. To learn more about this, we recommend watching the panel from Devcon 2022 [8] and looking into the Ethereum specification [4]. Eth-staking-smith therefore allows the user to pass in 0x00 or 0x01 execution withdrawal credentials, as well as a plain execution address, to overwrite the derived withdrawal credentials.
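For reference, the two credential formats are simple to compute. The sketch below follows the consensus-spec definitions ([4], [5]); it assumes the sha2 and hex crates and is purely illustrative, not Eth-staking-smith's internal code.

```
// 0x00 (BLS) credentials:       0x00 || sha256(withdrawal_pubkey)[1..32]
// 0x01 (execution) credentials: 0x01 || 11 zero bytes || 20-byte execution address
use sha2::{Digest, Sha256};

fn bls_withdrawal_credentials(withdrawal_pubkey: &[u8; 48]) -> [u8; 32] {
    let mut creds: [u8; 32] = Sha256::digest(withdrawal_pubkey).into();
    creds[0] = 0x00;
    creds
}

fn execution_withdrawal_credentials(execution_address: &[u8; 20]) -> [u8; 32] {
    let mut creds = [0u8; 32];
    creds[0] = 0x01;
    creds[12..].copy_from_slice(execution_address);
    creds
}

fn main() {
    let address: [u8; 20] = hex::decode("71C7656EC7ab88b098defB751B7401B5f6d8976F")
        .unwrap()
        .try_into()
        .unwrap();
    println!("{}", hex::encode(execution_withdrawal_credentials(&address)));
    // -> 01000000000000000000000071c7656ec7ab88b098defb751b7401b5f6d8976f
}
```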

Example deposit data with overwritten withdrawal credentials; in this example the output carries 0x01 execution credentials pointing to the address 0x71C7656EC7ab88b098defB751B7401B5f6d8976F:

{
  "deposit_data": [
    {
      "amount": 32000000000,
      "deposit_cli_version": "2.3.0",
      "deposit_data_root": "95ac4064aabfdece592ddeaba83dc77cf095f2644c09e3453f83253a8b7e0ae1",
      "deposit_message_root": "6a0c14a9acd99ab4b9757f2ff2f41e04b44c0c53448fdf978c118841cd337582",
      "fork_version": "00001020",
      "network_name": "goerli",
      "pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
      "signature": "82effe6d57877b7d642775ae3d56f9411d41a85218b552c6318925c7ba23f7470ebe3a35045e2fc36b0e848e6f4ec1d503f2014dc5a7ad94a267f5b237f2475b5da9ff358fbd5a8e9f497f1db0cfb15624e686991d002077a6cd4efda8bdc67e",
      "withdrawal_credentials": "01000000000000000000000071c7656ec7ab88b098defb751b7401b5f6d8976f"
    }
  ]
}

Below we present a full-fledged example output of the key material generated by Eth-staking-smith:

Command:

eth-staking-smith existing-mnemonic --chain goerli --keystore_password testtest --mnemonic "ski interest capable knee usual ugly duty exercise tattoo subway delay upper bid forget say" --num_validators 1

Output:

{
  "deposit_data": [
    {
      "amount": 32000000000,
      "deposit_cli_version": "2.3.0",
      "deposit_data_root": "2abc7681f73a01acbc1974ab47119766bf57d94f86a72828f8875295f5bd92de",
      "deposit_message_root": "bfd9d2c616eb570ad3fd4d4caf169b88f80490d8923537474bf1f6c5cec5e56d",
      "fork_version": "00001020",
      "network_name": "goerli",
      "pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
      "signature": "97c0ad0d4f721dc53f33a399dbf0ff2cab6f679f4efdcdaa9f8bdd22cd11b5e37c12fdd2cd29369b1b907a51573a9ef60f93d768fd2d47a99b5d55fe6516a87b9090e16c42f5a8fcbf91d24883359bffb074a02d6d4d7f6c3cd04c8e09f8dc02",
      "withdrawal_credentials": "0045b91b2f60b88e7392d49ae1364b55e713d06f30e563f9f99e10994b26221d"
    }
  ],
  "keystores": [
    {
      "crypto": {
        "checksum": {
          "function": "sha256",
          "message": "af14321c3083de535a0dd895b4e2fb156e6b0eda346120c8d7afb5277d3a489f",
          "params": {}
        },
        "cipher": {
          "function": "aes-128-ctr",
          "message": "8032685ad92a579e66328bbd6c747e41497dc6897c17cebbd83958394943924b",
          "params": {
            "iv": "da5699bb18ee7fea6095634a2fa05d18"
          }
        },
        "kdf": {
          "function": "pbkdf2",
          "message": "",
          "params": {
            "c": 262144,
            "dklen": 32,
            "prf": "hmac-sha256",
            "salt": "afb431f05b7fe02f253d9bc446ac686776541d38956fa6d39e14894f44e414d8"
          }
        }
      },
      "description": "",
      "name": null,
      "path": "m/12381/3600/0/0/0",
      "pubkey": "8844cebb34d10e0e57f3c29ada375dafe14762ab85b2e408c3d6d55ce6d03317660bca9f2c2d17d8fbe14a2529ada1ea",
      "uuid": "6dbae828-d0f0-42ed-9c06-d9079642ea08",
      "version": 4
    }
  ],
  "mnemonic": {
    "seed": "ski interest capable knee usual ugly duty exercise tattoo subway delay upper bid forget say"
  },
  "private_keys": [
    "6d446ca271eb229044b9039354ecdfa6244d1a11615ec1a46fc82a800367de5d"
  ]
}

Security Improvements

Since writing key material to disk was a major security concern for us, Eth-staking-smith removes this issue entirely by never writing any files to disk.

To avoid heavy lifting and re-creating crypto primitives from scratch, we re-use key generation functionality from the Lighthouse client implementation [9], which builds on top of blst, a BLS12-381 signature library [10] that is currently undergoing formal verification.

For entropy collection, one customization was made: Eth-staking-smith defers entropy collection to the operating system by using getrandom() on Linux, thereby relying on the kernel's state-of-the-art randomness source.
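As a minimal sketch of what deferring to the OS looks like, assuming the getrandom crate's v0.2 API (the actual entropy plumbing inside Eth-staking-smith may differ):

```
// Ask the operating system for 32 bytes of entropy; on Linux this goes
// through the getrandom(2) syscall.
fn collect_entropy() -> Result<[u8; 32], getrandom::Error> {
    let mut ikm = [0u8; 32]; // input keying material for BLS key generation
    getrandom::getrandom(&mut ikm)?;
    Ok(ikm)
}

fn main() {
    let ikm = collect_entropy().expect("OS RNG unavailable");
    assert_eq!(ikm.len(), 32);
}
```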

Tweaking Security <> Performance parameters

Finally, since key generation at scale was notoriously slow for us with staking-deposit-cli, we took the initiative to add arguments that let users tweak the performance <> security trade-off depending on their specific use case.

In our use case, the API does not require the keystore file, only the private key in raw format. We therefore enable the user to opt out of keystore generation in order to improve performance. This can be done by omitting the --keystore_password argument as follows:

eth-staking-smith new-mnemonic --chain goerli --num_validators 1

We measured that omitting the keystore speeds up the key generation process by 99%. The key generation performance we experience with Eth-staking-smith is consistently sub-second, with slight variability depending on hardware and platform.

If the user requires the keystore file for redundancy, there is another option to speed up keystore generation: choosing a different key derivation function. By default, Eth-staking-smith uses pbkdf2 to derive the decryption key, which achieves better performance. There is also the option to use scrypt, which offers better security but, consequently, worse performance. The key derivation function is selected with the --kdf argument as follows:

eth-staking-smith new-mnemonic --chain goerli --keystore_password testtest --num_validators 1 --kdf scrypt

Converting BLS withdrawal to execution address

As mentioned above, users who had previously specified a BLS (0x00) withdrawal address will need to make a request to the beacon chain to update their validators' withdrawal credentials to point to an execution address. To perform this operation, the user needs the mnemonic phrase behind the BLS withdrawal key. Once done, withdrawals will automatically be credited to the execution address.

Eth-staking-smith enables the user to generate a signed BLS_TO_EXECUTION_CHANGE message which they can send to the beacon chain to update their withdrawal address.

eth-staking-smith bls-to-execution-change --chain mainnet --mnemonic "entire habit bottom mention spoil clown finger wheat motion fox axis mechanic country make garment bar blind stadium sugar water scissors canyon often ketchup" --validator_index 0 --withdrawal_credentials "0x0045b91b2f60b88e7392d49ae1364b55e713d06f30e563f9f99e10994b26221d" --execution_address "0x71C7656EC7ab88b098defB751B7401B5f6d8976F"

Users can use the response to make the request to the beacon node as follows:

```
curl -H "Content-Type: application/json" -d '{
"message": {
"validator_index": 0,
"from_bls_pubkey": "0x0045b91b2f60b88e7392d49ae1364b55e713d06f30e563f9f99e10994b26221d",
"to_execution_address": "0x71C7656EC7ab88b098defB751B7401B5f6d8976F"
},
"signature": "0x9220e5badefdfe8abc36cae01af29b981edeb940ff88c438f72c8af876fbd6416138c85f5348c5ace92a081fa15291aa0ffb856141b871dc807f3ec2fe9c8415cac3d76579c61455ab3938bc162e139d060c8aa13fcd670febe46bf0bb579c5a"
}'
http://localhost:3500/eth/v1/beacon/pool/bls_to_execution_change
```

Conclusion

Throughout this post, we explained the basics of Ethereum Beacon Chain block validation, the key material involved in the process, and walked through an automation tool we created at Chorus One for Proof-of-Stake key management.

We hope the tool can be useful to some of our readers, especially those who use Rust for their blockchain automation work. It is also open-source, and we welcome bug reports and pull requests on GitHub.

If you are interested in using the OPUS Validation-as-a-Service API, which builds upon this automation, you are welcome to join the wait-list for the private beta by contacting sales@chorus.one.

References:

[1] https://kb.beaconcha.in/ethereum-2-keys

[2] https://eth2book.info/altair/part2/building_blocks/signatures#aggregation

[3] https://eips.ethereum.org/EIPS/eip-4895

[4] https://notes.ethereum.org/@GW1ZUbNKR5iRjjKYx6_dJQ/Skxf3tNcg_

[5] https://github.com/ethereum/consensus-specs/blob/dev/specs/capella/beacon-chain.md

[6] https://github.com/ethereum/EIPs/blob/master/EIPS/eip-2335.md

[7] https://github.com/bitcoin/bips/blob/master/bip-0039.mediawiki

[8] https://www.youtube.com/watch?v=zf7HJT_DMFw&feature=youtu.be

[9] https://github.com/sigp/lighthouse

[10] https://github.com/supranational/blst

Core Research
Networks
Revisiting the “Deflationary Cryptocurrency” definition as ETH post-Merge may turn into one
Will Ethereum become a deflationary cryptocurrency, now that The Merge has happened? The answer, in short, can be given in two words.
October 18, 2022
5 min read

Will Ethereum become a deflationary cryptocurrency, now that The Merge has happened? The answer to this question, in short, can be given in two words: “it depends!” Its long form, however, would offer you a better understanding of whether Ethereum will indeed remain inflationary (albeit only slightly as miners have packed their bags) or become a deflationary asset as time goes on.

If you’re confused by all the information floating around and can’t quite grasp all the terms related to it, read on as this article simplifies the topic at hand. In this piece, we’ll take you by the hand and walk you through the following:
  • The Merge — what is it?
  • Back to the drawing board: What is inflation? What is deflation?
  • Inflationary vs deflationary vs disinflationary crypto assets
  • Explaining Ethereum’s issuance mechanism and inflationary state before The Merge
  • How Ethereum could go from a nearly Net Zero inflation rate to becoming deflationary after The Merge

The Merge — what is it?

At 6:42 AM UTC (2:42 AM EDT / 8:42 AM CEST) on Thursday, September 15, 2022, Ethereum’s long-awaited transition from Proof-of-Work to Proof-of-Stake, dubbed “The Merge”, was finally completed. As Chorus One and the rest of the ecosystem could confirm, the operation — after years of blood, sweat, and delays — was successful.

Having started out as a Proof-of-Work network, quickly shaping into a hub for miners as Bitcoin's biggest competitor, Ethereum soon encountered scalability issues with its Execution Layer: too much energy consumed by competing miners to process transactions, and not enough security for the network.

The Beacon Chain was, therefore, introduced in December 2020 as the network's Consensus Layer. This innovation could be seen as Ethereum's spine, master coordinator, or watchful lighthouse tower, with its key functions being to store data and manage the network's validators. Its functionality also includes scanning the network, validating transactions, collecting votes, distributing rewards to performing validators, deducting rewards from offline validators, and slashing the ETH of malicious actors.

This Proof-of-Stake blockchain ran alongside the PoW network with the objective to — one day — merge and transform Ethereum into a Proof-of-Stake only network. A win for decentralization and the environment!

That day was September 15th, 2022. Before that, Ethereum's supply was inflating at roughly 3.67% net (issuance minus burned fees), against a ~4.62% issuance inflation rate. We will break down the calculations behind this inflation rate shortly. But first, let's go back to the drawing board to remind ourselves of the definitions of inflation and deflation in the first place.

What is inflation? What is deflation?

Inflation happens when more bills are printed (fiat money) or more tokens are minted (cryptocurrency) for circulation in the system. The value of the currency then decreases. In fiat terms, this means that more bills are needed to afford things. In certain cryptocurrencies, this means that the price of the currency goes down.

Deflation, on the other hand, happens when tokens are removed from circulation or destroyed through "burning". By this logic, the value of the currency is supposed to increase. There are, however, much more complicated dynamics at play; those won't be our focus today.

Explaining Ethereum’s issuance mechanism and inflationary state before The Merge

Before The Merge, Ethereum rewarded the capital-intensive mining activity with up to 2.08 ETH approximately every ~13.3 seconds. This added up to roughly ~4,930,000 ETH/year in miner rewards. The network also had around ~119.3M ETH in total supply. (Source: Ethereum.org)

We can find the inflation rate by summing up the Execution Layer and Consensus Layer inflation rates.

Let's calculate the figure for the Execution Layer by dividing the amount of PoW-issued rewards by the total amount of ETH in circulation:
  • ~ 4.93M ETH/ ~ 119.3M ETH = ~ 0.0413 = ~ 4.13%

Then, we move to the Consensus Layer issuance, which is based on the amount of ETH staked. We'll round that number to 13,000,000 ETH staked at present.

If 1,600 ETH/day is issued, that’s 584K ETH/year in Consensus Layer issuance, amounting to an inflation rate of:
  • ~ 584K ETH/ ~ 119.3M ETH = ~ 0.00489 = ~ 0.49%

That’s almost Net Zero!

Summing both figures, we get a pre-Merge issuance inflation rate of ~4.62%. In other words, miners received approximately ~89.4% of newly issued ETH whilst stakers got the remaining ~10.6% of the pie.
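For readers who want to retrace the arithmetic, here is a tiny sketch reproducing the figures above, using the approximations quoted in this article:

```
// Back-of-the-envelope reproduction of the pre-Merge issuance figures.
fn main() {
    let total_supply = 119.3e6_f64;     // ~119.3M ETH
    let pow_issuance = 4.93e6_f64;      // ~4.93M ETH/year to miners
    let pos_issuance = 1_600.0 * 365.0; // ~584K ETH/year to stakers

    let el_rate = pow_issuance / total_supply;                      // ~4.13%
    let cl_rate = pos_issuance / total_supply;                      // ~0.49%
    let miner_share = pow_issuance / (pow_issuance + pos_issuance); // ~89.4%

    println!(
        "EL {:.2}% + CL {:.2}% = {:.2}%, miner share {:.1}%",
        el_rate * 100.0,
        cl_rate * 100.0,
        (el_rate + cl_rate) * 100.0,
        miner_share * 100.0
    );
}
```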

Goodbye miners. Stay on, stakers!

Through The Merge, Ethereum has therefore addressed:
  1. Energy efficiency
  2. Issuance reduction (up to 88%!)
  3. PoS security, among other things
Among what it hasn’t addressed, however, are:
  1. High gas fees
  2. Slow transaction speed

We’ll get to understand how not addressing high gas fees could actually be a plus for “Deflationary Assets” or “Ultra Sound Money” advocates.

Inflationary vs disinflationary vs deflationary crypto assets

As we go back to the drawing board for the second time in our walk-through, let’s revisit the difference between inflationary, deflationary, and disinflationary crypto assets.

Inflationary

Some cryptocurrencies' tokenomics are set up to increase token supply over time; from the start, they are "programmed" to be inflationary. Other cryptocurrency projects, which propose an unlimited coin supply, are inflationary as well, since unlimited supply is bound to outweigh demand and decrease the currency's value over time. An example of a coin with unlimited supply is Dogecoin.

Disinflationary

With its halving mechanism in place until the 21-millionth and final bitcoin is minted, Bitcoin is a disinflationary cryptocurrency: its issuance decreases over time. A disinflationary cryptocurrency can, in other words, be described as "an inflationary cryptocurrency with disinflationary measures", in the sense that demand may, over time, become greater than the diminishing issuance of new tokens.

Deflationary

A good example of a deflationary cryptocurrency is Binance Coin. BNB's initial supply put 200,000,000 tokens in circulation. By the end of Q3 2022, nearly 40 million BNB had been burned as part of the plan to halve the initial supply from 200 million to 100 million.

Look at tokens in circulation as a balloon and issuance as air: BNB’s mechanism is to deflate the balloon till 50% of air in it remains whilst Bitcoin’s mechanism is to keep inflating its balloon with a set maximum air supply, but doing so with a little less air at every pump.

How Ethereum could go from a nearly Net Zero inflation rate to becoming deflationary after The Merge

So what about Ethereum, now that The Merge has basically rendered a close to Net Zero inflation rate? Why is it touted as a potential deflationary coin?

Enter EIP-1559, the mechanism that burns a portion of ETH gas fees during transactions on the network. With the inflation rate already dropping to 0.49% as explained above, EIP-1559 has the potential to decrease ETH supply — but only on the condition that the gas prices are above 15 Gwei.

Consequently, it is no surprise that ultra sound money advocates would plead with users to set their ETH transaction fees to a minimum of 15.1 Gwei.

Ultrasound.money tracks Ethereum's supply in real time. A negative supply change means more ETH has been burned than issued since The Merge. In other words, a negative figure showcases deflation, whilst a positive figure showcases inflation.

32 hours into The Merge era, Ethereum had issued over 376 more ETH. Inflationary.

Source: Ultrasound.Money

A month on and the figures keep rising…

Source: Ultrasound.Money

Or maybe not… a wider perspective shows us that there has actually been a decrease since October 8th, when the issuance peaked at over 13,000 ETH.

Source: Ultrasound.Money
Source: Ultrasound.Money

Ethereum — Deflationary or not?

Ultrasound.Money projects that, with gas fees above 70 Gwei, the supply would register a -3.40% decrease across the next two years.

Source: Ultrasound.Money

As we've witnessed in the four weeks since The Merge, we're bound to see periods of a deflationary ETH and periods of low but healthy inflation, both of which are vital for an economic equilibrium.

News
Networks
Core Research
Axelar — Your Plug Into Any Blockchain
Axelar is the most secure, programmable, flexible and composable interoperability network in blockchain.
September 19, 2022
5 min read

Axelar is a universal interoperability network, secured by delegated Proof-of-Stake using AXL, the native token of Axelar: in short, Axelar is a blockchain that connects blockchains. With Axelar, users will be able to use any network with just one wallet (e.g., use MetaMask to make trades on Osmosis). Axelar facilitates many-to-many connectivity and programmability at the network layer for interoperability by connecting to any blockchain via a ‘Gateway’ installed on the connected chain. Users send messages to a Gateway on a source chain, and validators in Axelar’s network sign those messages on a destination chain. Axelar leverages threshold encryption in tandem with its Proof-of-Stake consensus to deliver secure cross-chain communication. Axelar solves the single point-of-failure risks and user-experience issues that are apparent in pairwise bridges and in other interoperability networks, alike. Axelar’s interoperability network unlocks more than just cross-chain transfers; General Message Passing allows developers to perform cross-chain calls of any kind that sync state securely between dApps on various ecosystems. Essentially, the enhanced functionality of cross-chain dApps enabled by Axelar’s network results in a better user experience for all users on all chains. Axelar is valuable for developers because of how inherently programmable, composable, and flexible the network is and for users given the new use-cases it will unlock across chains. Ultimately, Axelar provides permissionless transactions and validation, decentralised security, many-to-many connectivity, and programmability that other interoperability networks cannot duplicate.

What is Axelar?

Axelar is the first fully permissionless and decentralised interoperability network. Axelar is an interoperability Hub that facilitates many-to-many connectivity and acts as an adaptor for any dApp to leverage in order to communicate securely with any dApp on any other blockchain that has a ‘Gateway’ available for Axelar to plug into. The permissionless aspect of Axelar enables any validator to join the decentralised network; unlike other interoperability networks, it is not gated. Axelar reduces the amount of connections found in existing interoperability solutions by acting as a ‘Hub’, whereby each blockchain only needs to connect to Axelar in order to communicate with any other blockchain connected to it as opposed to opening many connections to many blockchains. The fact that Axelar is a blockchain, itself, enhances interoperability capabilities because programmability is possible at the network layer. To expand, actions such as address routing become much more efficient: new chains are immediately accessible to all connected chains, creating compounding network effects. User experience is also improved: Axelar is able to create one-time deposit addresses on connected blockchains, duplicating the user onramps used by centralized exchanges.

How Axelar Works

A user sends a payload to an Axelar Gateway, which is deployed by Axelar in the native language of the source blockchain (e.g. Solidity in Ethereum). The payload is recognised by a relayer in Axelar’s network, which notifies Axelar validators that there is a payload that is ready to be collectively signed (e.g. a cross-chain transfer from a user). At this point, validators come to consensus on what should be done with the payload sent by the user that has reached the Gateway on the source chain (e.g. Ethereum). Validators unwrap the payload and collectively sign on what should be done with it and where to route it (e.g. what network to send the payload to). Axelar network uses a weighted threshold signature scheme that validators abide by, whereby each validator has a % of the overall shares needed to produce a signature that correlates to the amount of AXL (token of Axelar network) staked with them. For example, a gateway might require a threshold percentage of signatures in order to sign a payload. If validators constituting that threshold percentage of the overall stake in Axelar’s network execute signing on a payload, then consensus is reached that approves the payload to be executed on a destination chain. In this case, if it is a cross-chain transfer, then a payload can be executed on a destination chain that mints tokens representing the tokens locked-up on the source chain. However, Axelar’s network can facilitate interoperability interactions that are far more intricate than this. More on this later.
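To make the weighted-threshold idea concrete, here is a toy sketch of the rule described above; the names, numbers and threshold are purely illustrative, and Axelar's real signing protocol is of course far more involved.

```
// A payload is approved once the validators who signed it hold at least a
// threshold share of the total stake.
fn approved(signer_stakes: &[u64], total_stake: u64, threshold_pct: u64) -> bool {
    let signed: u64 = signer_stakes.iter().sum();
    signed * 100 >= total_stake * threshold_pct
}

fn main() {
    let total_stake = 1_000;
    let signer_stakes = [400, 250, 100]; // stake held by the validators who signed
    // 750 of 1000 staked tokens signed, which clears a 66% threshold.
    println!("{}", approved(&signer_stakes, total_stake, 66)); // true
}
```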

What problem does Axelar solve?

Axelar has a simple but elegant design. The most important element in a bridge comes down to who the owners are of smart contracts that receive cross-chain intent payloads. These owners are given custodial or execution responsibility. If a bridge is centralised, a user would send a payload to a designated signer or group of signers, which would custody and approve the message on the user’s behalf. This approach is known as “proof of authority,” in contradistinction to “proof of stake.” The problem with Proof-of-Authority systems is that a user has to trust these designated signers to behave appropriately and not maliciously. If a centralised group of signers steals or cheats the user — or mismanages their private keys and is hacked — a user can do nothing about it. Therefore, Axelar has created a decentralised and dynamic set of validators to custody or sign payloads from users in a way that is trust-minimised (i.e. a permissionless protocol and incentives provided by the AXL token enforce that parties are responsible for signing or custodying payloads via mechanisms such as cryptography, consensus and economics). Axelar uses threshold encryption, a decentralised network and slashing economics to ensure that all validators behave honestly and user intent is executed across chains securely, safely and correctly.

In general, Proof-of-Authority setups have resulted in hundreds of millions in funds lost to security breaches. The Axie Infinity (Ronin Bridge) hack is a recent, costly example. More decentralised approaches can solve the problem of risks encountered by entrusting a designated group with our intent to move across chains. However, thoughtful approaches are still needed. Wormhole was hacked due to an operational error: a code vulnerability was exposed on their GitHub before it was patched. LayerZero, a well-known decentralised bridge network, leaves critical security decisions up to the application developer and user. Nomad, another well-known project, puts safety behind liveness (if the network halts, transactions are not safe). Nomad recently suffered a multimillion-dollar hack due to a vulnerability left unaddressed in its codebase. Axelar code is rigorously and regularly reviewed by auditors; audits are published here. Axelar code is open-source; a multi-million-dollar bug-bounty program encourages white-hat developers to search for vulnerabilities. Loss-prevention measures are also enabled, including mandatory key rotations, and the ability to disconnect compromised chains quickly, set rate limits and cap transaction amounts.

Axelar solves the security problems that are apparent in other interoperability networks by leveraging threshold encryption and a Proof-of-Stake network for security and consensus whilst simultaneously solving the usability problems presented by pairwise bridges. The user barely has to lift a finger when an application they are interacting with leverages Axelar.

There are other high-quality solutions that match Axelar in terms of security, safety and usability such as Inter Blockchain Communication (IBC). However, IBC is restricted in that it requires extensive integration work to connect to blockchains outside of the ecosystem it was built for (Cosmos). Ultimately, Axelar is the premier solution that solves all interoperability problems faced by other solutions and is unmatched when it comes to security, usability and interconnectivity as Axelar can seamlessly connect to any type of blockchain, regardless of the underlying technology.

Unlocking new use-cases for the cryptocurrency ecosystem with Axelar

As mentioned earlier, Axelar can facilitate interoperability interactions that are far more intricate than just cross-chain transfers. Axelar opens up a multitude of possibilities for users to engage with different chains without having to leave their source chain. This is powerful to comprehend, given users can take actions cross-chain using tools familiar to them such as native wallets and currencies. Let’s dig in.

One example of what is made possible with Axelar’s network is a Cosmos user instantly being able to receive USDC to use on Osmosis from a centralised exchange such as Coinbase without needing to use Ethereum. As it stands right now, if a user has USDC on a centralised exchange and wants to withdraw it to a decentralised exchange, it is highly likely that a user will only be able to withdraw USDC to a network such as Ethereum. This is a terrible user experience for Cosmos users, who will need to receive USDC on Ethereum first, before bridging it to Osmosis. Not only is this an unnecessary amount of steps but a user will also need to purchase ETH in order to pay gas costs to move across chains. With the advent of Axelar (as well as Interchain Accounts), if a user provides a centralised exchange with an address on Ethereum that is being observed by Axelar validators on Ethereum, it will arrive on Osmosis without a user needing to take any extra actions or pay any extra fees. This is possible because validators in Axelar’s network observe payloads incoming into a Gateway (in this case on Ethereum) and the Axelar network understands how to translate it and route it cross-chain. Once a payload arrives on Ethereum, Axelar can create an address for a user on Osmosis to receive the USDC. As a blockchain connecting blockchains, Axelar can execute logic that enables multiple steps to be assembled into 1 for users to take actions cross-chain. In this example, Osmosis users will be able to withdraw from centralised exchanges in 1-step, even if a centralised exchange does not provide the optionality. This will unleash a new wave of liquidity into deFi apps and other decentralized applications, like Osmosis.

The power of Axelar’s network can also be leveraged by users outside the Cosmos ecosystem. For example, an Ethereum user that does not want to leave the comfort of the network can utilise Axelar to take actions on applications that exist outside of Ethereum. To elaborate, let’s say that a user wants to swap ETH for AVAX and then borrow USDC on Avalanche with AVAX as collateral, in a decentralised manner. Right now, a user would probably send ETH to a centralised exchange using MetaMask and pay fees in ETH, sell ETH for USDT/USDC on an exchange, buy AVAX with USDT/USDC in another transaction on an exchange, send the AVAX to an Avalanche wallet and pay fees in AVAX, navigate to a lending protocol front-end, deposit AVAX and pay AVAX fees with an Avalanche wallet and then borrow USDT on a lending protocol with an Avalanche wallet (paying another AVAX fee).

Axelar completely abstracts away these extra steps and payments by creating a sequence of instructions for the network to execute cross-chain on behalf of a user.

In this scenario, if a user was on Ethereum as a source chain, the user would use MetaMask to send intent to a Gateway connected to Axelar, alongside a payment of ETH that is requested by network services in order to execute the intent cross-chain. Axelar network then abstracts the payment flow: ETH is converted into AXL to pay validators and then into AVAX to pay fees on Avalanche. A user does not have to leave MetaMask, or Ethereum, or purchase any other currencies in order to transact on other chains. (Notably, this process may create deflationary effects in Axelar, as “change” from these conversions is either refunded to the user, or applied toward potential buyback-and-burn programs. More on this from Axelar Foundation, here). At this point, Axelar has done all of the work on behalf of the user and a user has successfully borrowed USDC on a lending protocol in Avalanche. Axelar opens up new possibilities for users to take cross-chain actions without needing to learn new tools or purchase new currencies to pay fees.

AXL — The Token of Axelar Network

Axelar is a Proof-of-Stake network built with Cosmos SDK and Tendermint consensus. The AXL token is used to secure the decentralised network. For a refresher, stake is the value of a token that has been delegated to validators to secure a Byzantine system. The more stake (value) that has been delegated, and the more diverse the pool of token-holders and validators, the harder it is to attack the system. At this point, it is extremely unlikely for a validator to be malicious in any case given it would be explicitly risking a large sum of its own stake and implicitly risking its reputation in the cryptocurrency ecosystem. Even in a scenario where the value at stake in AXL is less than the amount being transferred, validator collusion toward a malicious outcome is unlikely, given the explicit reward for doing so would likely be very low and reputation risk extremely high.

Holders of AXL have a strong incentive to delegate their AXL to one or more validators to secure the network. Validators earn block rewards for successfully proposing new blocks that are verified by other validators in the network. A validator has more opportunities to propose blocks (and hence earn more rewards) if it has more stake delegated to it. Delegators are the ones that stand to benefit the most from block rewards, because delegators earn the majority of them (often >90%), whilst validators take a commission for securing the network on their behalf (i.e. for running the node that participates in Axelar's network consensus). If an AXL holder does not delegate, they risk being diluted, as they will miss out on the block rewards being received by other AXL stakers and validators.

Token-holders also have an incentive in the form of their long exposure to AXL, to delegate AXL to validators that they believe will secure the network in the best possible fashion. Delegators can review data on the full list of validators via the Axelar block explorer, Axelarscan, at axelarscan.io/validators. The more AXL that is staked with a validator, the more voting power a validator has (i.e. more chance of a validator being chosen to produce the next block) — but this does not lead to concentration of voting power, because Axelar has implemented quadratic voting. In short, quadratic voting means validators’ voting power is equivalent to the square root of their delegated stake. E.g., to get one vote, a validator would need 1 token; but to get 2 votes they would need 4; to get 3 votes, 9 tokens would be needed, and so on. The validator set of Axelar is limited, so AXL token-holders can play a direct role in ensuring the active validator set is performant and available by delegating to high-quality validators. Ideally, Axelar’s network is very decentralised whereby it would take not just a lot of stake to break liveness guarantees of the network but also a lot of validators (e.g. validator diversification).
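A one-line sketch of that quadratic-voting rule (illustrative only):

```
// Voting power grows with the square root of delegated stake.
fn voting_power(delegated_tokens: f64) -> f64 {
    delegated_tokens.sqrt()
}

fn main() {
    for tokens in [1.0, 4.0, 9.0, 10_000.0] {
        println!("{tokens} tokens -> {} votes", voting_power(tokens));
    }
    // 1 -> 1, 4 -> 2, 9 -> 3, 10000 -> 100
}
```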

Aside from securing Axelar’s network, AXL is also used for token-holders to participate in governance. Due to the fact that Axelar is built with Cosmos SDK, this means that all governance proposals are created and voted upon on-chain. The more AXL that a token-holder holds in the network, the more votes a token-holder has on governance proposals. For example, governance proposals might cover connecting new chains or a proposed upgrade that improves the features of Axelar’s network. However, it is not a requirement for AXL token-holders to participate in governance in Axelar. In networks built using Cosmos SDK, token-holders inherit the vote of the validators they delegate to if they do not vote themselves. If a user does not agree with the vote of a validator, the user always has the optionality to change the vote that was inherited from their validator. All in all, on-chain governance in Cosmos SDK chains runs smoother than most and is a great way for token-holders to actively participate and contribute to decentralised networks.

Finally, AXL is used to pay transaction fees to validators in Axelar’s network. For example, a user active on source-chain Ethereum that signals intent to take actions on destination-chain Avalanche would pay fees in ETH to Axelar’s Gateway on Ethereum. Axelar’s SDK provides services that observe the Gateways and then convert the ETH fee into AXL to pay Axelar validators and AVAX to pay Avalanche validators (all-the-while taking a cut for doing so). In essence, AXL is the fuel for validators to come to consensus on cross-chain intent. Demand for AXL comes from services such as Axelar SDK, which convert other currencies into AXL in order to pay validators for their work. Anyone can provide these services; they can even be handled manually by the user, if desired. The more usage Axelar’s network gets, the more currencies that are converted into AXL to pay validators, the more demand for AXL.

What makes Axelar valuable?

There are many reasons why Axelar is a valuable network. The network is valuable for developers, users and token-holders.

For developers, Axelar is useful due to the Turing-complete programmability the network facilitates, as well as the ability to compose functions cross-chain. Starting with composability, developers that build on top of Axelar can build one-click user experiences consisting of multiple components that interact with each other cross-chain. (Read more for an introduction to architecture approaches, when composing cross-chain.) For example, a developer might choose to build a yield optimiser, whereby a financial strategy reads yield of a certain asset across multiple chains and deploys more or less capital (rebalancing) on a connected chain in order to optimise yield for the next block. Axelar is also entirely programmable, which means that validators in Axelar’s network can take any action on behalf of a user cross-chain, no matter what it is. For example, a developer could choose to build a governance aggregator application whereby a validator set can vote on behalf of a user in a DAO, cross-chain, in the same direction as the majority vote (e.g. vote YES if majority vote is already YES). Related to programmability, Axelar network is Turing-complete, meaning any program that is created by developers can be run by the network, given enough memory and time. These features are possible because Axelar is a blockchain that connects blockchains, and cannot be duplicated by other interoperability networks. All in all, Axelar is the most customisable, flexible, programmable and composable interoperability network.

Users of Axelar can look forward to greater liquidity in their respective ecosystems, a better user experience, less transaction costs and new use-cases. Greater liquidity will be able to freely flow across blockchains that are connected to Axelar and as a result, users will have new assets to trade that were not available previously on their blockchains. There will be a better experience for users moving cross-chain as users will not need to hold multiple tokens across chains to take actions and not need to make separate transactions for each transaction. Any cross-chain transaction can be paid for with one token and instructions can be bundled by validators to execute atomically. Users will also be able to access new types of applications that exist on chains that are not native to the chain they currently interact with. For example, a user on Ethereum might be able to utilise a cross-chain AMM built on Axelar to swap Ethereum assets with assets on Avalanche. Axelar and its partners are already working with the largest dexes on multiple chains (Osmosis, a Cosmos project, is a notable example), who are building these cross-chain liquidity networks. Moreover, many of these projects are using Axelar’s unique functionality to build user onramps (such as one-time deposit addresses) that can rival centralised exchanges for ease-of-use, and welcome users seamlessly, regardless of what tokens they hold.

AXL is the fuel to the Axelar economy. The value of AXL comes from how it is used to secure the network, govern the network and pay node operators in the network to execute cross-chain intent. Holding AXL gives users a way to directly contribute to the sustainability and security of the network.

Axelar Overview

To conclude, Axelar is a decentralised and permissionless interoperability network built with Cosmos SDK that has a mixture of properties such as many-to-many connectivity, programmability, composability and Proof-of-Stake security that constitutes the most robust interoperability network available for users. Axelar will be secured by AXL, which is used to secure the Proof-of-Stake network, as well as for governance and payment for validators to execute cross-chain intent. Axelar will unlock a variety of use-cases that have not yet been seen, such as interacting cross-chain with other blockchains that might not speak the same language as the user’s source blockchain. For the first time, cross-chain user experience will be seamless as a flux of applications are being built on top of Axelar currently to leverage the profound properties of the interoperability network. Users who enter Web3 via one blockchain will easily access applications and assets on other blockchains, perhaps without even knowing they are doing so. Axelar solves problems of centralised bridges and interoperability networks to produce what can ultimately be argued as the safest, most secure and best cross-chain user experience that is available for users.

Acknowledgements: Thanks to Galen Moore from Axelar for his review of this article.

About the Author

Xavier Meegan is Research and Ventures Lead at Chorus One.

Medium: https://medium.com/@xave.meegan
Twitter: https://twitter.com/0xave

About Chorus One

Chorus One is one of the largest staking providers globally. We provide node infrastructure and closely work with over 30 Proof-of-Stake networks.

Website: https://chorus.one
Twitter: https://twitter.com/chorusone
Telegram: https://t.me/chorusone
Newsletter: https://substack.chorusone.com
YouTube: https://www.youtube.com/c/ChorusOne

About Axelar

Axelar delivers secure cross-chain communication for Web3, enabling dApp users to interact with any asset or application, on any chain, with 1 click.

Website: https://axelar.network/
Twitter: https://twitter.com/axelarcore
Discord: https://discord.com/invite/aRZ3Ra6f7D
Blog: https://axelar.network/blog
YouTube: https://www.youtube.com/c/Axelarcore

Core Research
Networks
Solana Validator Economics
Blockchains not only need to be technically good. Besides the protocol and implementation levels, a key element in the success of a decentralized system is having different and independent groups using it, operating it, and governing it.
August 23, 2022
5 min read

Blockchains not only need to be technically good. Besides the protocol and implementation levels, a key element in the success of a decentralized system is having different and independent groups using it, operating it, and governing it. The economic incentives that decentralized systems use to achieve such participation by all these different groups have gained attention from researchers, who now study "tokenomics" as a new field.

In this article, we are going to explore Solana economics, focusing on the incentives for network node operators, or validators. We conducted an analysis of the inflation model, the costs and rewards to validators and stakers, and the current network activity levels. We also estimate the minimum stake a validator requires in order to break even, and the impact of different market scenarios, considering the most important variables and how they affect validator profitability.

For this purpose, we built the Solana Validator Dashboard and the Solana Validation Cost Estimator.

Inflation Design

Solana validators currently earn from two sources:

  1. protocol-based rewards: generated from inflationary issuances from a protocol-defined inflation schedule.
  2. transaction fees: currently, 50% of the transaction fee is burned and the remaining 50% goes to the validator leader of the respective slot.

The Solana inflation design defines SOL emissions as starting at 8% annually and decreasing by 15% every year. The model was activated on February 10th, 2021 with a payment of 213,841 SOL.

Validators started to receive rewards from inflation in February 2021.
Source: Solana Validator Dashboard

As of July 2022, Solana's inflation rate is around 6.8%. The staking yield is equivalent to 9.1%, as 75% of the total supply is currently staked (i.e. total inflation rewards are distributed to staked tokens only, resulting in a dilution of non-staked tokens). This rate does not reflect the actual yearly emission rate; it can be considered a target instead, and the mechanism behind it is broken down below.

Solana's inflation model assumes 400 ms block times, even though the documentation mentions that the current implementation targets 800 ms block times. The recent average is around 650 ms, with high variance.

Solana block times over the 35-day period ending August 9, 2022

Although Solana remains extremely performant to the everyday user, the difference in slot times directly impacts the economics and business viability of running a validator on Solana. Longer block times will result in smaller rewards, given a smaller number of epochs in a calendar year, decreasing the amount of SOL distributed to network participants.

Inflation Rewards Pool

In every epoch, Solana calculates the number of tokens minted for the inflation pool. The result is the amount of SOL to be distributed to validators and stakers as inflation rewards, according to the voting and staking status from the previous epoch. Approximately 0.45 SOL is currently allocated and distributed among eligible validators in each slot — about 195 thousand SOL per epoch of 432,000 slots.

Block times impact inflation rewards because the schedule tapers the initial rate (8%) based on how many slots have passed since inflation activation on Mainnet, as a proportion of how many slots are assumed to fit in one year.

Comparing effective inflation rate, given average block times

Considering an average block time of 650 ms, the inflation distributed every epoch is equivalent to a 4.1% yearly rate, and the staking yield falls to 5.5%, instead of the 6.8% and 9.1% assumed previously.
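A rough sketch of that scaling, using the approximate figures above (the real schedule also tapers year over year, so this is only an approximation):

```
// Effective annual inflation, scaled by actual vs. assumed block time.
// The schedule assumes 400 ms slots; slower slots mean fewer epochs per year.
fn main() {
    let nominal_inflation = 0.068; // ~6.8% as of July 2022
    let staked_fraction = 0.75;    // ~75% of supply staked
    let target_slot_ms = 400.0;
    let observed_slot_ms = 650.0;  // recent average

    let effective_inflation = nominal_inflation * target_slot_ms / observed_slot_ms;
    let effective_yield = effective_inflation / staked_fraction;

    println!(
        "inflation ~{:.1}%, staking yield ~{:.1}%",
        effective_inflation * 100.0,
        effective_yield * 100.0
    );
    // roughly matches the ~4.1% inflation and ~5.5% yield quoted above
}
```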

Commission Fee

Also relevant to validator economics is the commission. Stake owners, a.k.a. delegators, earn the inflation rewards; validators earn a portion of them, represented by the commission. In the plot below, we can see that a common fee for public nodes is around 10%. Only 81 nodes charge a fee of 5% or less. A 100% commission is assumed to indicate a private node (100 validators).

Amount of SOL staked versus commission rate for each validator.
Stake status as of August 2nd, 2022.

Transaction Fees

Block rewards from transaction fees vary according to network activity; the recent average is around 0.01 SOL per slot. The total per epoch increases with voting power, as the number of slots assigned to a validator is proportional to its stake.

Rewards from transaction fees per epoch as a proportion of inflation rewards.

Theoretically, as inflation decreases over time, validators' rewards would be supplemented by an increase in transaction fees. That assumption may eventually hold as the network matures, but the plots below show that it does not hold today:

1- The market's cyclical nature: the number of non-vote transactions will not necessarily grow over time. Total transactions (vote + non-vote) peaked in October 2021, at around 180 thousand in one day, and fell to less than 100 thousand transactions in April 2022.

The number of transactions (orange) and Rewards from transaction fees (gray). Inflation activation is shown in blue
Source: Solana Validator Dashboard

2- Solana network has invested in growing the network of validators. The plot below shows the number of unique rewards recipients (addresses).

Number of validators (orange) and Rewards from transaction fees (gray) since inflation activation (blue)
Source: Solana Validator Dashboard

3- As a consequence of voting power dilution and lower network activity, rewards obtained from transaction fees decreased for validators in an individual perspective.

Number of slots under C1 leadership (green) and Rewards from transaction fees (orange)
Source: Solana Validator Dashboard

Validator Costs

Hardware and Personnel

We split the cost into i) hardware, colocation, and bandwidth, to host the validator and ii) personnel, which can vary significantly. The official recommendations can be found on the Solana Documentation.

Small Validator

  • Hardware: a single node on the most budget hardware that can still run Solana.
  • Personnel: hobbyist who spends a few hours/week.

Medium Validator

  • Hardware: a pair of nodes with an average provider and 1 Gbps traffic.
  • Personnel: shared site reliability engineering team, equivalent to 0.25 full-time employees focused on Solana.

Professional Validator

  • Hardware: a pair of nodes with a specialized provider and >10 Gbps traffic.
  • Personnel: dedicated site reliability engineering team, equivalent of 1.5 full-time employees focused on Solana.

Voting Costs

The vote is an affirmation that a block it has received has been verified, as well as a promise not to vote for a conflicting block. — Solana Docs

Validators are expected to vote on the validity of the state proposed by the slot leader. A validator node, at startup, creates a new vote account and registers it in the network. On every new block, the validator submits a new vote transaction and pays the transaction fee (0.000005 SOL).

SOL Token Acquisition

Validators usually own (a portion or the total of) the staked tokens, a.k.a. self-staking. In this case, the cost of tokens depends on the average price of acquisition. For the purpose of the current analyses, we will consider the validators only to own 100 SOL at a US$ 50 price.

The Solana Foundation promotes the growth of the validator set through the Solana Delegation Program. Applications require small validators to achieve the “baseline” criteria, which includes running a node also on the Testnet, in order to receive 25,000 SOL. Those who meet the baseline criteria and also the “bonus” criteria can receive an extra (dynamic) amount in the delegation. A recent post on stake delegation strategies and why delegation programs are needed, goals, and criteria can be found in How can networks nurture decentralization?

Solana Delegation Program, baseline criteria example
Solana Delegation Program, bonus criteria example

Break-even

In summary, Solana validator’s profitability depends on the current inflation rate, block times — reflected on the number of epochs in one year, the voting power, the total supply, the number of transactions, the cost structure, and the SOL market price.

For the three operational levels stated above, we will look at three different economic scenarios: optimistic, average, and pessimistic, with the average scenario being the closest to the current values.

The average market price in one year is fixed at $50 for the purpose of break-even analysis. Different price scenarios can be evaluated in a further session.

Break-Even Third Party Stake (thousands of SOL)

We found that the 40,000 SOL to break even would be a realistic amount for a small validator, on an average scenario, close to current levels. The number grows to 253,000 SOL for the medium setup. A professional validator would need more than 1.3 million SOL staked.

Minimum thousands of SOL in third party stake to break-even

For a validator with a 0.01% stake, we estimate a 25 SOL reward from transaction fees in one year. The voting process costs around 200 SOLs per year for each node operator. Therefore, small validators are dependent on inflation rewards to achieve break-even, and ideally, become profitable. Around 350 thousand SOL staked would be needed to fully cover the voting cost, when considering rewards from transaction fees only.

Considering active stake on August 2nd:

  • 89.5% of validators control less than 115 thousand SOL, and;
  • 3.6% of validators control more than 1 million SOL each.
The number of validators by the amount of stake. Stake status as of August 2022.

Although the number of validators may be considered high compared to other Proof of Stake networks, 71 accounts are responsible for 57% of the total 365 million SOL staked.

The number of validators by the amount of stake. Stake status as of August 2022.

The majority of validators currently stake between 80 and 90 thousand SOL, as seen in the plot below. There are at least 138 (7%) instances of the validator client with stake amounts smaller than 40 thousand SOL, the estimated break-even level for a small validator.

The number of validators by the amount of stake. Stake status as of August, 2nd 2022.

Market Price Exposure

Simulation shows that medium and professional validators are more sensible to fluctuations in the SOL market price than small-size validators. Considering SOL average price in a year to be $75, the break-even level decreased by more than 30% for medium and professional levels and only 7% for small validators. A similar effect is found if the average price drops to $25.

Adjusting Inflation Model

In PoS networks, adopting an accurate inflation model in conjunction with direct incentives in form of delegation is important to:

  • attract new, independent validators, promoting decentralization and censorship resistance;
  • increase staking levels and interest for SOL by long-term holders;
  • guarantee the incentive to existing validators, given the current market price and network activity level.

Solana validators and stakers have seen rewards decreasing with higher block times compared to the projected rewards from the initial inflation model. As additional factors, the network experienced a contraction in non-vote transactions during the latest months and the expansion of the validator set.

According to the break-even levels discussed above, an 8.85% inflation target would be the rate level to reflect an effective 5.5% emission in one year, considering 650 ms block times (6.3% if 550 ms block times). Assuming 75% of total supply is delegated to validators, staking yield would become 7.1% in one year and the minimum amount in stake to break even drops by 24%, to 32 thousand SOL.

The inflation rate is even more relevant for small validators’ profitability, compared to transaction fee rewards. Adjusting the inflation model according to the actual network configuration would reinforce the interest of those validators staking less than 40 thousand SOL. Supposing the 8.85% rate simulation above, approximately 21 more validators (1.12%) would reach the break-even level — that is the number of validators currently in range 30-40 thousand SOL in stake.

Conclusion

In this study, we explored the variables behind the Solana validator economics, estimating profitability levels for different market scenarios.

We found that:

  • The actual inflation emission rate is around 4.1% per year, instead of the 6.8% theoretical target because of increased block times;
  • 40 thousand SOL would be a realistic amount for a small validator to break even;
  • 7% of validators control stake amounts smaller than break-even level;
  • Voting cost per year averages 200 SOL. Validators controlling less than 350 thousand SOL depends on inflation rewards to fully cover the voting cost;
  • 71 validators control more than 1 million SOL each, yielding 57% of the total supply;
  • Medium and professional validators are more sensible to fluctuations in the SOL market price. Small size validators are more sensible to inflation parameters;
  • A 30% adjustment in the inflation target would bring the effective rate to 5.5% per year — better reflecting the initial model, and reduce the minimum amount to break even in 24% for all validator sizes.

Fee markets are now live on Solana but the adoption of the priority fee by dApps and general users at the moment is low, with the proportion of around 4% of transactions paying a higher fee than the fixed rate. It has been in an uptrend since launched, in late July.

Go to the Solana Validator Cost estimator in getguesstimate to explore the relevant variables, their interactions, and correlations. Thanks,

Ruud

, Chorus One engineer, for building it.

“Look below the surface and you will find that all seemingly solo acts are really team efforts.” —John C. Maxwell

This article was brought to you by

Chorus One

. With meticulous review by

Felix Lutsch

and

Ruud

.

About Chorus One

Chorus One is one of the largest staking providers globally. We provide node infrastructure and closely work with over 30 Proof-of-Stake networks.

Website: https://chorus.one
Twitter: https://twitter.com/chorusone
Telegram: https://t.me/chorusone
Newsletter: https://substack.chorusone.com
YouTube: https://www.youtube.com/c/ChorusOne

Core Research
Exploring Validator Economics on Solana
We present our findings on the profitability and economic viability of running validators on Solana.
August 23, 2022
5 min read

Blockchains not only need to be technically good. Besides the protocol and implementation levels, a key element in the success of a decentralized system is having different and independent groups using it, operating it, and governing it. The economic incentives in decentralized systems to achieve such participation by all these different groups have gained attention from researchers who are now interested in “tokenomics”, as a new field of study.

In this article, we explore Solana economics, focusing on the incentives for network node operators, or validators. We analyze the inflation model, the costs and rewards to validators and stakers, and the current network activity levels. We also estimate the minimum stake a validator needs in order to break even, and the impact of different market scenarios, considering the most important variables and how they affect validator profitability.

For this purpose, we built the Solana Validator Dashboard and the Solana Validation Cost Estimator.

Inflation Design

Solana validators currently earn from two sources:
  1. protocol-based rewards: generated from inflationary issuances from a protocol-defined inflation schedule.
  2. transaction fees: currently, 50% of the transaction fee is burned and the remaining 50% goes to the validator leader of the respective slot.

The Solana inflation design defines SOL emissions as starting at 8% and decreasing by 15% of the prevailing rate every year. The model was activated on February 10th, 2021 with the payment of 213,841 SOL.

Validators started to receive rewards from inflation in February 2021.
Source: Solana Validator Dashboard

As of July 2022, Solana's inflation rate is around 6.8%. The staking yield is equivalent to 9.1%, as 75% of the total supply is currently staked (i.e. total inflation rewards are distributed to staked tokens only, resulting in a dilution of non-staked tokens). However, this 6.8% does not reflect the actual yearly emission rate. It is better understood as a target, and the mechanism behind it is broken down below.

Solana's inflation model assumes 400ms block times, even though the documentation notes that the current implementation targets 800ms block times. The recent average is around 650ms, with high variance.

Solana block times over the 35-day period ending August 9, 2022

Although Solana remains extremely performant for the everyday user, the difference in slot times directly impacts the economics and business viability of running a validator on Solana. Longer block times result in smaller rewards, since fewer epochs fit in a calendar year, decreasing the amount of SOL distributed to network participants.

Inflation Rewards Pool

In every epoch, Solana calculates the number of tokens to be minted for the inflation pool. The result is the amount of SOL to be distributed to validators and stakers as inflation rewards, according to the voting and staking status from the previous epoch. Approximately 0.45 SOL is currently allocated and distributed among eligible validators in each slot, or about 195 thousand SOL per epoch.

Block times impact inflation rewards because the function tapers the initial rate (8%) according to how many slots have passed since inflation activation on Mainnet, as a proportion of how many slots fit in one year.

Comparing effective inflation rate, given average block times

Considering an average block time of 650 ms, the inflation distributed in every epoch is equivalent to a 4.1% yearly rate, and the staking yield falls to 5.5%, instead of the 6.8% and 9.1% previously assumed.
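As a rough sketch of this mechanism (using the article's round numbers rather than exact on-chain parameters), the dilution can be reproduced in a few lines of Python:

```python
# Sketch of how slower blocks dilute the nominal inflation schedule.
# Round numbers from this article, not exact on-chain parameters.

SLOTS_PER_EPOCH = 432_000          # a Solana epoch is 432,000 slots
MODEL_BLOCK_TIME = 0.400           # seconds; block time assumed by the inflation model

def nominal_rate(schedule_years: float) -> float:
    """8% initial rate, reduced by 15% per schedule 'year'. The schedule year
    advances with slots produced, i.e. slower than wall-clock time when blocks
    take longer than 400 ms."""
    return 0.08 * 0.85 ** schedule_years

def effective_annual_rate(nominal: float, actual_block_time: float) -> float:
    """Per-slot emission is sized for 400 ms blocks, so slower blocks mint
    fewer slots' worth of SOL per wall-clock year."""
    return nominal * MODEL_BLOCK_TIME / actual_block_time

nominal = nominal_rate(1.0)        # 6.8%: roughly one schedule year after activation
print(f"effective rate at 650 ms: {effective_annual_rate(nominal, 0.650):.1%}")           # ~4.2% (article: 4.1%)
print(f"staking yield (75% staked): {effective_annual_rate(nominal, 0.650) / 0.75:.1%}")  # ~5.6% (article: 5.5%)

# The per-epoch pool quoted above, from ~0.45 SOL allocated per slot:
print(f"pool per epoch: {0.45 * SLOTS_PER_EPOCH:,.0f} SOL")                               # 194,400 SOL, i.e. ~195k
```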

Commission Fee

Commission is also relevant to validator economics. Inflation rewards accrue to stake owners, a.k.a. delegators; validators earn a portion of those rewards through the commission they charge. The plot below shows that a common fee for public nodes is around 10%. Only 81 nodes charge a 5% fee or smaller. Validators charging 100% commission are assumed to be private nodes (100 validators).
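For illustration, the split can be sketched as follows, using hypothetical figures: 100,000 SOL of delegated stake, the ~5.5% effective yield discussed above, and the common 10% commission:

```python
# Illustrative split of inflation rewards between delegators and a validator.
def split_rewards(inflation_rewards_sol: float, commission: float) -> tuple[float, float]:
    """Returns (validator_commission, delegator_share) in SOL."""
    validator_cut = inflation_rewards_sol * commission
    return validator_cut, inflation_rewards_sol - validator_cut

rewards = 100_000 * 0.055                       # yearly inflation rewards on the delegated stake
print(split_rewards(rewards, commission=0.10))  # (550.0, 4950.0): 550 SOL to the validator
```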

Amount of SOL staked versus commission rate for each validator.
Stake status as of August 2nd, 2022.

Transaction Fees

The block reward from transaction fees varies with network activity; the recent average is around 0.01 SOL per slot. The total per epoch increases with voting power, as the number of leader slots assigned to a validator is proportional to its stake.

Rewards from transaction fees per epoch as a proportion of inflation rewards.

Theoretically, as inflation decreases over time, validators' rewards would be supplemented by an increase in transaction fees. That assumption may eventually hold true as the network matures, but the plots below show that it does not hold today:

1- The market's cyclic nature: the number of non-vote transactions will not necessarily grow over time. Total transactions (vote + non-vote) peaked in October 2021 at around 180 thousand in one day, and fell to fewer than 100 thousand transactions in April 2022.

The number of transactions (orange) and Rewards from transaction fees (gray). Inflation activation is shown in blue
Source: Solana Validator Dashboard

2- The Solana network has invested in growing its set of validators. The plot below shows the number of unique reward recipients (addresses).

Number of validators (orange) and Rewards from transaction fees (gray) since inflation activation (blue)
Source: Solana Validator Dashboard

3- As a consequence of voting power dilution and lower network activity, the transaction fee rewards earned by each individual validator have decreased.

Number of slots under C1 leadership (green) and Rewards from transaction fees (orange)
Source: Solana Validator Dashboard

Validator Costs

Hardware and Personnel

We split the cost into i) hardware, colocation, and bandwidth to host the validator, and ii) personnel, which can vary significantly. The official recommendations can be found in the Solana Documentation.

Small Validator
  • Hardware: a single node on the most budget hardware that can still run Solana.
  • Personnel: hobbyist who spends a few hours/week.
Medium Validator
  • Hardware: a pair of nodes with an average provider and 1 Gbps traffic.
  • Personnel: shared site reliability engineering team, equivalent to 0.25 full-time employees focused on Solana.
Professional Validator
  • Hardware: a pair of nodes with a specialized provider and >10 Gbps traffic.
  • Personnel: dedicated site reliability engineering team, equivalent to 1.5 full-time employees focused on Solana.

Voting Costs

The vote is an affirmation that a block it has received has been verified, as well as a promise not to vote for a conflicting block. — Solana Docs

Validators are expected to vote on the validity of the state proposed by the slot leader. A validator node, at startup, creates a new vote account and registers it in the network. On every new block, the validator submits a new vote transaction and pays the transaction fee (0.000005 SOL).
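A quick back-of-the-envelope estimate of that cost, assuming roughly one vote transaction per slot (a simplification):

```python
# Back-of-the-envelope yearly voting cost: roughly one vote transaction per slot.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
VOTE_FEE_SOL = 0.000005                        # 5,000 lamports per vote transaction

for block_time in (0.400, 0.650):              # seconds per slot
    slots_per_year = SECONDS_PER_YEAR / block_time
    print(f"{block_time * 1000:.0f} ms blocks: ~{slots_per_year * VOTE_FEE_SOL:.0f} SOL/year in vote fees")
# ~394 SOL/year at 400 ms, ~243 SOL/year at 650 ms; the estimate used later in
# this article is around 200 SOL/year.
```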

SOL Token Acquisition

Validators usually own a portion (or all) of the staked tokens, a.k.a. self-staking. In this case, the cost of the tokens depends on the average acquisition price. For the purpose of the current analysis, we assume validators own only 100 SOL, acquired at a price of US$50.

The Solana Foundation promotes the growth of the validator set through the Solana Delegation Program. To receive 25,000 SOL in delegation, applicants must meet the "baseline" criteria, which include running a node on Testnet as well. Those who meet the baseline criteria and also the "bonus" criteria can receive an extra (dynamic) amount of delegation. A recent post on stake delegation strategies, why delegation programs are needed, and their goals and criteria can be found in How can networks nurture decentralization?

Solana Delegation Program, baseline criteria example
Solana Delegation Program, bonus criteria example

Break-even

In summary, a Solana validator's profitability depends on the current inflation rate, block times (reflected in the number of epochs per year), voting power, the total supply, the number of transactions, the cost structure, and the SOL market price.

For the three operational levels stated above, we will look at three different economic scenarios: optimistic, average, and pessimistic, with the average scenario being the closest to the current values.

The average market price over one year is fixed at $50 for the purpose of the break-even analysis. Different price scenarios are evaluated in a later section.

Break-Even Third Party Stake (thousands of SOL)

We found that 40,000 SOL would be a realistic break-even amount for a small validator in the average scenario, which is the closest to current levels. The number grows to 253,000 SOL for the medium setup. A professional validator would need more than 1.3 million SOL staked.

Minimum thousands of SOL in third party stake to break-even

For a validator with a 0.01% stake, we estimate a 25 SOL reward from transaction fees in one year. The voting process costs around 200 SOL per year for each node operator. Therefore, small validators depend on inflation rewards to break even and, ideally, become profitable. Around 350 thousand SOL staked would be needed to fully cover the voting cost when considering rewards from transaction fees only.
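The transaction-fee side of these estimates can be sketched as follows, again with the article's round numbers (0.01 SOL of fees per slot, 50% burned, ~650 ms blocks, ~365M SOL of total stake); the figures land close to, but not exactly on, the numbers quoted above:

```python
# Sketch of the transaction-fee estimates above, using this article's round numbers.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
BLOCK_TIME = 0.650                   # seconds
FEES_PER_SLOT = 0.01                 # SOL, recent average, before burning
BURN_SHARE = 0.5                     # 50% of each fee is burned
VOTE_FEE_SOL = 0.000005              # fixed fee per vote transaction
TOTAL_STAKE_SOL = 365_000_000        # total stake as of August 2022

slots_per_year = SECONDS_PER_YEAR / BLOCK_TIME
reward_per_leader_slot = FEES_PER_SLOT * (1 - BURN_SHARE)

def annual_fee_reward(stake_share: float) -> float:
    """Leader slots (and thus fee income) are proportional to stake share."""
    return slots_per_year * stake_share * reward_per_leader_slot

print(annual_fee_reward(0.0001))     # ~24 SOL for a 0.01% stake (article: ~25 SOL)

# Stake share at which fee rewards alone cover the yearly voting cost:
voting_cost = slots_per_year * VOTE_FEE_SOL                           # ~243 SOL/year
share_to_cover_votes = voting_cost / (slots_per_year * reward_per_leader_slot)
print(share_to_cover_votes * TOTAL_STAKE_SOL)                         # 365,000 SOL (article: ~350k SOL)
```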

Considering active stake on August 2nd:
  • 89.5% of validators control less than 115 thousand SOL, and;
  • 3.6% of validators control more than 1 million SOL each.
The number of validators by the amount of stake. Stake status as of August 2022.

Although the number of validators may be considered high compared to other Proof of Stake networks, 71 accounts are responsible for 57% of the total 365 million SOL staked.

The number of validators by the amount of stake. Stake status as of August 2022.

The majority of validators currently stake between 80 and 90 thousand SOL, as seen in the plot below. There are at least 138 validators (7%) with stake amounts smaller than 40 thousand SOL, the estimated break-even level for a small validator.

The number of validators by the amount of stake. Stake status as of August 2nd, 2022.

Market Price Exposure

Simulation shows that medium and professional validators are more sensitive to fluctuations in the SOL market price than small-size validators. If the average SOL price over a year is $75, the break-even level decreases by more than 30% for the medium and professional tiers, but only by 7% for small validators. A similar effect is found if the average price drops to $25.

Adjusting Inflation Model

In PoS networks, adopting an accurate inflation model in conjunction with direct incentives in the form of delegation is important to:
  • attract new, independent validators, promoting decentralization and censorship resistance;
  • increase staking levels and interest for SOL by long-term holders;
  • guarantee the incentive to existing validators, given the current market price and network activity level.

Solana validators and stakers have seen rewards decrease with higher block times, compared to the rewards projected by the initial inflation model. On top of this, the network has experienced a contraction in non-vote transactions in recent months, alongside an expansion of the validator set.

According to the break-even levels discussed above, an 8.85% inflation target would be the rate needed to produce an effective 5.5% emission in one year with 650 ms block times (6.3% with 550 ms block times). Assuming 75% of the total supply is delegated to validators, the staking yield would become 7.1% per year and the minimum stake needed to break even would drop by 24%, to 32 thousand SOL.

The inflation rate matters even more for small validators' profitability than transaction fee rewards do. Adjusting the inflation model to the actual network configuration would reinforce the interest of those validators staking less than 40 thousand SOL. Under the 8.85% rate simulated above, approximately 21 more validators (1.12%) would reach the break-even level; that is the number of validators currently in the 30-40 thousand SOL stake range.

Conclusion

In this study, we explored the variables behind Solana validator economics, estimating profitability levels for different market scenarios.

We found that:
  • The actual inflation emission rate is around 4.1% per year, instead of the 6.8% theoretical target, because of increased block times;
  • 40 thousand SOL would be a realistic amount for a small validator to break even;
  • 7% of validators control stake amounts smaller than the break-even level;
  • Voting cost averages 200 SOL per year. Validators controlling less than 350 thousand SOL depend on inflation rewards to fully cover the voting cost;
  • 71 validators control more than 1 million SOL each, accounting for 57% of the total stake;
  • Medium and professional validators are more sensitive to fluctuations in the SOL market price. Small-size validators are more sensitive to inflation parameters;
  • A 30% adjustment in the inflation target would bring the effective rate to 5.5% per year, better reflecting the initial model, and reduce the minimum break-even stake by 24% for all validator sizes.

Fee markets are now live on Solana, but adoption of the priority fee by dApps and general users is still low, with around 4% of transactions paying a higher fee than the fixed rate. Adoption has been trending up since the launch in late July.

Go to the Solana Validator Cost estimator on getguesstimate to explore the relevant variables, their interactions, and correlations. Thanks to Ruud, Chorus One engineer, for building it.

“Look below the surface and you will find that all seemingly solo acts are really team efforts.” —John C. Maxwell

This article was brought to you by Chorus One. With meticulous review by Felix Lutsch and Ruud.

About Chorus One

Chorus One is one of the largest staking providers globally. We provide node infrastructure and closely work with over 30 Proof-of-Stake networks.

Website: https://chorus.one
Twitter: https://twitter.com/chorusone
Telegram: https://t.me/chorusone
Newsletter: https://substack.chorusone.com
YouTube: https://www.youtube.com/c/ChorusOne

Core Research
MEV
Networks
Analyzing MEV Instances on Solana — Part 3
We look at on-chain data, challenges, and implications of current MEV strategies.
August 8, 2022
5 min read

Introduction

Maximum Extractable Value (MEV) represents a fundamental concept in cryptoeconomics, highly affecting permissionless blockchains. MEV is the consequence of the design of protocols and brings with it bad and good externalities. Indeed, not all MEV can be considered benign as some represent an invisible tax on the user, e.g. check out one of our previous articles — Solana MEV Outlook. In general, MEV can also be an incentive for consensus instability, see e.g. the time bandit attack. However, considering all types of MEV as bad externalities is wrong. There exist benign forms of MEV that ensure protocol efficiency, and one prominent example is arbitrage. Let’s imagine that some user swaps a huge amount of token A on a specific AMM (huge with respect to the total amount in the pool) and that this transaction creates a $5,000 arbitrage opportunity. All users that swap tokens in the same pool and same direction will see their output lowered with respect to the actual value. Thus, whoever exploits this MEV opportunity will also bring the market back to parity with the true price. This will make the AMM more efficient without harming its users in the process.

On Solana, MEV still represents a dark forest, since no one has pointed a flashlight at it. This is because Solana is a much younger blockchain compared to Ethereum, which can be seen in the lack of products like Flashbots. One project that is moving in this direction is Jito Labs, which recently delivered the first MEV Dashboard for Solana, an explorer aimed at illuminating MEV — see here for an introduction. However, it is not the only one trying to fulfill this duty. Shining a light on some Solana Decentralized Exchanges (DEXs) in order to illuminate the dark forest is one of the key objectives at Chorus One. MEV will be a crucial factor for the future of PoS networks, and we are continually looking for the best way to ride it. You can explore our Solana MEV dashboard here.

It is important to understand that a simple copy of Flashbots may not be good for Solana, since Solana is a drastically different network from Ethereum (and Jito seems to be something intrinsically different). In this article, we are going to assess the MEV challenges Solana faces. We'll also review the status of our internal research regarding MEV.

In Section 2, we’ll analyze the current and future status of MEV on Solana, with a detailed analysis of what we found on-chain in Section 2.1.

In Section 3, we’ll discuss some implications of the current MEV strategies and how these can affect the functionality of a PoS network.

Section 2: Current and Future Status of MEV on Solana

MEV has a specific supply chain, which "describes the chain of activity which helps users transform intentions into finalized state transitions in the presence of MEV". However, despite this "universal" definition, MEV on Proof of Stake (PoS) networks is drastically different from what it represents on Proof of Work (PoW) networks, for several reasons. The most important difference lies in the certainty that a given validator will propose a block at some point. Further, validators have delegators and can offer them a portion of the MEV revenue (e.g. by lowering commission), attracting users to delegate with them. This makes MEV on PoS networks a growing business model, which constitutes one of the building blocks for cryptoeconomic incentives. On one side, validators can use MEV revenue to reduce their commission rate, even to negative values, by returning all income to delegators. On the other side, Layer-1 (L1) blockchains have an incentive to improve network performance, because if the "scaling problem" is solved by the introduction of L2s, MEV and transaction (tx) fees also move away from the main chain, weakening the L1 business model.

Fig. 2.1: Source here.

This is exactly what blockchains like Ethereum are facing right now, representing one of the great risks over the next few years. See this Twitter thread for a better understanding of the topic.

But what is the current status of MEV on Solana? Let's start from the beginning. Solana does not have a public mempool, meaning that some bad externalities of MEV are very difficult to achieve. However, Solana is not free from them, since MEV extraction can degrade network performance, e.g. through spam transactions and dropped transactions. Indeed, some MEV opportunities only exist if searchers run their own validator, inspect the transactions that come to them, and run MEV-extraction code on top. Having a high stake and getting access to more MEV opportunities is not an easy task. This dramatically reduces the likelihood of being highly profitable, as the distribution of MEV revenues averages around zero, with a tail towards higher values (see Fig. 2.2).

Note that this is obtained in a specific time window, so it is only representative of the shape of the actual distribution.

Fig. 2.2: Survival probability distribution of arbitrage revenues. The left panel shows a zoom of the right panel for profit between 10 USDC and 500 USDC.

Since transaction fees on Solana are low and MEV opportunities can bring validators more profit, validators are incentivized to auction off their block space to searchers; at least, some rumors point towards this possibility.

Fig. 2.3: Source here.

Further, on Solana, fees are currently fixed and cheap, meaning that if there is high competition in a specific market, users face the risk of not getting their transactions executed. Since a gas-fee auction is still missing, MEV searchers currently spam transactions to the leader (and to the following validators in the leader schedule) in the hope of "winning the battle".

Fig. 2.4: Source here.

Lastly, on Solana, MEV competition may incentivize validators to perform denial of service (DoS) attacks on other validators in order to leave spotted MEV opportunities sitting on the table until the attacker can extract them.

The current status of MEV indicates how severe the problem of wasted blockspace is, which results in degraded performance for normal users. At the time of writing, according to Jito's MEV dashboard, there have been 12,072,328 successful arbitrages against 350,179,786 unsuccessful ones over 6 months (i.e. a 3.3% success rate). If we also include liquidations, the success rate goes down to roughly 3%. The total extracted "good" MEV is around $33M. Of course, this is only a lower bound, since MEV can be created any time a user interacts with a blockchain, and smart contracts enable a functionally infinite number of potential interactions; it is therefore computationally infeasible to calculate a blockchain's total potential MEV by brute force. Further, previous analyses show that a huge amount was extracted during periods of stressful market conditions, e.g. $13M MEV during Wormhole Incident and $43M Total MEV from Luna/UST Collapse on Solana.

Future Solana improvements aim to introduce several features, forcing current MEV strategies to change. Introducing these new features represents a two-sided coin for MEV searchers. Indeed, some spamming bots would be forced to shut down since the local fee market will make it unprofitable to massively spam txs. However, improving the network means more and more users are attracted to use it. This has the immediate consequence of also increasing the total amount of MEV, allowing the chain of implications to continue by incentivizing competition around MEV and “inviting” new searchers to step in.

Section 2.1: Pool congestion assessment

One of the main problems that can worsen an AMM's functionality is pool congestion: if too many transactions hit a specific pool, users may get a worse trade due to pool unbalancing. This is why arbitraging is a sort of service that normalizes DEX functionality. But, despite the fact that we know MEV is happening on Solana, where are the greatest opportunities? In other words, which DEXs have the highest pool congestion, and who is "solving" it? To answer these questions, we built an MEV dashboard on Dune Analytics: looking at exchanged volume (using Solscan) gives an idea of where the congestion is, but it says little about whether searchers are actually solving it.

Our preliminary research shows that over 10 days (from July 16th to July 26th), the paths with the highest extracted MEV on Solana ran across Orca and Raydium, with a lower bound of 20,775 USD extracted, see Fig. 2.5. There were 68 MEV extractors on these cross-DEX paths during the analyzed period, so not a great number in terms of competition. Fig. 2.6 shows how the extracted revenues are concentrated among a few searchers: precisely, 5 different accounts extracted 80.1% of the total MEV.

Fig. 2.5: Extracted MEV on Orca x Raydium bi-cycles. Precisely, the transactions under scrutiny happen between 2 “identical” pools on the two DEXs.

It is worth mentioning that none of the studied DEX combinations shows a uniform distribution of MEV opportunities, in line with what we show in Fig. 2.2.

Fig. 2.6: Extracted MEV on Orca x Raydium bi-cycles divided by accounts. Precisely, the transactions under scrutiny happen between 2 “identical” pools on the two DEXs.

If we extend the analysis by looking at the USDC Token Accounts belonging to the most profitable MEV searchers, we have that 7 accounts were able to extract 95.6% of total extracted MEV, see Fig. 2.7. Two of them, GjT…m2P and G9D…y2m, interact with the same smart contract, which may indicate that these two accounts belong to the same user. Since these accounts are in the top 7 accounts, this means that it is likely that only 6 users were able to extract 95.6% of the total extracted MEV.

Fig. 2.7: Extracted MEV by most profitable accounts. Here you have the amount of USDC extracted, independent from the path and DEXs.

By deep diving, we also found two accounts interacting with a smart contract with clear reference to Jito, Jito…HoMA, with a total extracted MEV in 10 days of 3,342.30 USDC (at time of writing), over a total of 158,132 USDC extracted — i.e. 2.1% of the total amount.

Section 3: Challenges for securing MEV

We already stated that, on PoS networks, MEV can be seen as a business model, since validators can share a portion of the extracted amount with their delegators. However, as shown in Sec. 2.1, this deal does not always translate into high returns. MEV revenues are strongly correlated with market conditions and DEX usage, meaning that we are unable to estimate a fixed income to share with delegators. Further, if competition does not grow fast, the promise of sharing revenue with delegators may create a centralization problem.

To assess this statement, let's try a "gedanken-experiment". Imagine that the volume exchanged by DEXs on Solana grows by a factor of 30, and assume that there is only one validator extracting MEV and redistributing the revenues to delegators. The implication of the increased volume is that MEV also increases. Indeed, a factor of 30 means that over 30 days the DEX volume on Solana is greater than $30B, and assuming that 0.04% of it is MEV, as happens on Ethereum, this means more than $144M yearly. The implication of having only one validator playing this game is that the amount it extracts also increases, making delegating to it an appealing deal: a validator with ~2% of the total stake could extract ~$2.9M of MEV yearly. Once the delegation starts to concentrate around this single validator, the sole player, MEV revenues get a further boost, since the leader schedule on Solana is stake-dependent. Because the revenue per block is not uniformly distributed, a higher stake means an increased likelihood of capturing a rare, juicy opportunity, pushing up the median of the extracted MEV. If there is no competition, this gedanken-experiment has a single outcome: concentration of stake, i.e. centralization.
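The arithmetic behind this thought experiment, as a quick sketch (the 30x volume growth and the 0.04% MEV-to-volume ratio are the assumptions stated above, not measurements):

```python
# The numbers behind the thought experiment above.
monthly_dex_volume_usd = 30e9          # assumed 30x growth: > $30B per 30 days
mev_share_of_volume = 0.0004           # ~0.04% of volume, as observed on Ethereum

yearly_mev_usd = monthly_dex_volume_usd * 12 * mev_share_of_volume
print(f"network-wide MEV:   ~${yearly_mev_usd / 1e6:.0f}M per year")                          # ~$144M

validator_stake_share = 0.02           # a validator holding ~2% of total stake
print(f"for that validator: ~${yearly_mev_usd * validator_stake_share / 1e6:.1f}M per year")  # ~$2.9M
```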

Risks become higher if one considers that at the moment Solana is one of the fastest blockchains and that future development aims to improve this even further. The high number of processed txs per second could pave the way for prop firms to enter the market, meaning that more SOL can be delegated to a single validator — the winner of the MEV war.

This points clearly toward the necessity of building validators that are competitive at MEV extraction. Once Jito delivers its third-party client for Solana that has been optimized for efficient MEV extraction (plus its bundles), the risk of centralization can be mitigated. However, even with decentralized block building, as Flashbots aims to achieve with MEV-boost, we are still far from a definitive solution. Indeed, such an environment makes it easier for builders to buy the blockspace of all validators and thereby isolate the centralization to the builder layer, see e.g. here. At the moment, MEV that is decentralized from top to bottom is a chimera. The first step in this direction would be open-sourcing the MEV-extracting validator and starting collaboration between many validators, in the true spirit of open source. Indeed, it is worth noting that adopting validator products developed by, and belonging to, a single entity reduces the problem of stake concentration, but can decrease the network's censorship resistance. If block production is centralized in a single entity, that represents an enormous censorship risk, regardless of how many validators participate.

For example, let's assume that this entity gets adopted by 50% of the stake. Suppose now that this entity is regulated by a specific government, which demands that certain (sanctioned) transactions be blocked. Then, at best, users would need to get their transactions into the other blocks; in the worst case, this entity could refuse to include vote transactions that vote on blocks containing sanctioned transactions. This simple example shows how some MEV strategy outcomes could pave the way for censorship risks.

Before concluding, it is worth mentioning that other possibilities do exist. One of them is to frame MEV extraction as a service, where the protocol itself captures the MEV and shares the corresponding revenue with protocol-token stakers, see e.g. recent rumors around Osmosis development. Although this approach seems less prone to centralization risk, it remains unclear whether MEV would be extracted quickly enough to keep AMMs functioning efficiently; remember that poor competition means some opportunities may sit there for a "long" time. As a result, it is difficult to assess in detail how this will affect the future of the chain.

This article aims to collect some thoughts on how framing MEV may affect the future of PoS ecosystems, focusing on some of its "bad" consequences. Despite the fast development around this huge and complex subject, we at Chorus One are continuously researching it with an eye to the future: the health of all networks is always our first priority.

If you’re interested in framing the topic and require research/advisory services on MEV, you can contact our Research Team at research@chorus.one

Core Research
Networks
What are Avalanche subnets and why are they a big deal?
We analyze its technical setup and talk about some of our favorite subnets.
July 7, 2022
5 min read

Avalanche has a thriving, friendly, and engaging community. On top of that, it also has the quickest and most valuable bridge solution to and from Ethereum, with BTC onboarding shortly. Avalanche is fortunate to have a team that consistently produces and executes at the top level. It’s great for validators like us too. There’s no slashing and rewards are dependent only on uptime. Currently, the annual staking rewards are at 9.1%. This makes locking AVAX to stake appealing. The thriving ecosystem is already on display, with liquid-staking now accessible via BenQi (sAVAX, $179M in TVL) and two additional solutions on the way: LAVA and Eden Network + YieldYak. Lido is also building its liquid staking implementation for AVAX. A competitive DeFi landscape is also in operation, including TraderJoe (DEX, $179M in TVL), Platypus (stable swap, $155M in TVL), Aave (lending, $4.64Bn in TVL), and many more. Subnets now allow innovative technologies in both consensus and horizontal scalability architecture to join the network. To make the experience complete they even provide VMs as free open source code ready to be picked up by companies wishing to join the subnet movement.

What are subnets?

The Avalanche mainnet is made up of two blockchains (C-Chain and P-Chain) and one DAG (X-Chain for ultra-high TPS), two types of distributed ledger technologies (DLTs). The P-Chain is responsible not only for managing subnet and validator information but also for creating new subnets and blockchains.

Avalanche and its multiple chains
https://docs.avax.network/subnets

Although the term “subnet” is used interchangeably and synonymously with blockchains, subnets are a bit more complex than that. The technical definition of a subnet is as follows:

A Subnet is a dynamic set of validators working together to achieve consensus on the state of a set of blockchains, according to Avalanche’s FAQ page.

The unleashing and unlocking of subnets is an event of great importance in the wider web 3 ecosystem. It brings value from its extensive use cases and benefits:
  • Horizontal scaling capabilities for the primary network or any project that wants to scale beyond one blockchain or include multi-blockchain functionality.
  • Virtual Machine (VM) flexibility: virtual machines using EVM, WASM, B-Script, and other cross-ecosystem technologies can be used for a subnet. Also, any native blockchain token can be used for gas fees, selected by the developer.
  • Highly customizable and flexible in design so they can be compliant with regulatory and jurisdiction laws.
  • A marketplace can emerge where validators offer their services to validate subnets.
  • Virtualize entire ecosystems such as Ethereum, Solana etc. on Avalanche.
  • Because there is no competition for block space, TPS is higher: transactions on one chain are not hindered by dApps on other chains.

So if devs can decide their own token and VM, how does this help AVAX?

Validating a subnet requires validating the mainnet (C-Chain, X-Chain, and P-Chain) too, which requires staking AVAX. Hence, when new subnets form, more AVAX will be staked. There is a limit to how many subnets a validator may operate (due to hardware constraints); however, the number of validators inside a subnet is unbounded, with a minimum of 5. Each validator can realistically operate the C-Chain plus a few more subnets at most, so validators should attempt to choose the finest subnet games out there, supporting competition and the production of competitive products. The mechanism is as follows:
  • A new subnet is constructed, and existing nodes cannot handle the additional subnet validation, so new nodes are built; as a result, more AVAX is staked and the mainnet is secured further.
  • Staking is limited to a minimum of 2000 AVAX and a maximum of 3,000,000 AVAX (likely to decline) per validator. This implies that validators cannot operate a single, highly concentrated node; rather, they must operate new nodes.
  • Delegation to a validator is capped at four times its own stake. This means you cannot operate a validator with 2K AVAX and simultaneously have 1M AVAX delegated to you; you must keep staking additional nodes, further strengthening the mainnet (see the sketch after this list).
  • As such, decentralisation is highly incentivised.
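Put as a small sketch, using only the limits quoted in this list (the exact on-chain parameters may differ and are expected to change):

```python
# Sketch of the per-validator staking limits described above (in AVAX).
# Figures are the ones quoted in this article; actual parameters may differ.
MIN_SELF_STAKE = 2_000
MAX_SELF_STAKE = 3_000_000
DELEGATION_MULTIPLE = 4                # delegation capped at 4x the validator's own stake

def max_delegation(self_stake: int) -> int:
    if not MIN_SELF_STAKE <= self_stake <= MAX_SELF_STAKE:
        raise ValueError("self-stake outside the allowed validator range")
    return DELEGATION_MULTIPLE * self_stake

print(max_delegation(2_000))           # 8,000: a 2K-AVAX node cannot host a 1M-AVAX delegation
print(max_delegation(250_000))         # 1,000,000: roughly the self-stake needed to host 1M delegated
```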

Subnets allow anybody to quickly establish permissioned or permissionless networks with unique implementations that are powerful, dependable, and secure. Developers can use AvalancheGo or AvalancheJS, and Ethereum developers can seamlessly use Solidity to launch dApps as it is fully compatible. Avalanche includes features not seen on other chains, such as the ability to choose which validators secure their Subnet activity, which token is utilized for gas costs, bespoke economic models, and more. Subnets, crucially, stay naturally linked with the larger Avalanche ecosystem, do not compete for network resources with other projects, and are accessible in an infinite supply. With standard rules underlying all apps on a smart contract network, Web3 applications may distinguish on user experience like never before. A similar approach can be found in Cosmos with Saga and their “chainlets” approach and in Ethereum with Skale.

Why are Avalanche subnets a big deal? — Enter GameFi

GameFi, a common phrase in the crypto-verse, is a combination of the words "Gaming" and "Finance." It covers the gamification of the working system in order to generate profit via play-to-earn crypto games. In GameFi games, items are represented by NFTs. Users may boost their earning potential by levelling up and upgrading their characters, as well as participating in tournaments. As an example, players in Axie Infinity (arguably the biggest GameFi game in 2021) earned more than $1000 worth of $SLP a month before it suffered a hack. Many of these blockchain games are communities where players may earn tokens to swap for money. It's remarkable to watch blockchain games with a few hundred players in 2013 turn into top-grossing games like Axie Infinity with hundreds of thousands of dollars in daily trade volume. And this is just the first generation of games on blockchains.

Adoption has skyrocketed over the past years. With a large number of retail investors as well as big companies like Microsoft, Nike, Meta and many more already involved, the metaverse market is expected to grow significantly. Major investors such as Gala Games and C2 Ventures formed a $100 million venture fund for GameFi. Solana Ventures and others also launched a $150 million fund by the end of 2021. More recently, Framework Ventures has allocated half of its $400M fund to Web3 gaming. As evidence of the blockchain gaming industry's expansion, the blockchain games and infrastructure business attracted over $4 billion in venture capital financing in 2021 alone. Blockchain gaming has grown by 2,000 percent in a year, according to the conclusions of a joint report by DappRadar and the Blockchain Game Alliance (BGA), although this was prior to the latest crypto meltdown, and the scenario might be very different right now. Still, the crypto gaming business has already received $2.5 billion in investment this year; if this trend continues, it might reach $10 billion by the end of 2022. The report also states that blockchain games drew 1.22 million unique active wallets (UAW) in March, representing 52% of industry activity. With all of the various technologies collaborating to build a self-sustaining ecosystem, the blockchain gaming sector is poised to become a significant income source and probably the first real utility for blockchains outside payments.

What is required for crypto games to become mainstream?

GameFi might expose a big market to crypto, but its games aren't there yet. At least initially, players shouldn't need to realize the game uses NFTs and tokens. Gamers shouldn't have to learn about wallets or pay large amounts until they're hooked. For the optimal user and developer experience, games require application-specific blockchains (ASBs), which are the best way to scale block space for the next billion users. Cosmos, Avalanche Subnets, Polygon Supernets, and StarkNet Layer 3s all sell block space. Application-specific blockchains provide cheaper costs, fine-tuned performance, transaction isolation, and developer control. Other requirements are:
  • Transactions per second (TPS) — A single popular game will need 1000s of transactions per second.
  • Time to finality — It is critical. No one wants to take 5 minutes to kick a football.
  • Free gas fees for users — Users will not kick the football if the cost is more than the worth of the action. Ideally, consumers have no idea what gas or transactions are.
  • Strong financial incentives for validators — Gas costs should be used to motivate validators; otherwise, no one will operate a validator node. This is a tricky balance since it contradicts the goal of keeping gas prices low for customers.
  • Ease of development — Game designers should not be required to create their own chain. Distributed consensus is a really difficult problem. The majority of Web2 developers no longer build their own software infrastructure and instead rely on cloud companies.

Why Avalanche for games?

The key advantage of using AVAX for GameFi is the three-pronged structure, which comprises validators and subnets using the P-Chain. Subnets let projects create their own application-specific blockchains (ASBs) that do not disrupt the rest of the chain. As a result, no single game utilizes the whole network bandwidth. GameFi on Avalanche offers the best chance for blockchain games to thrive in their intended setting. Avalanche is also great for creating NFTs, making digital assets easily available for P2E games or the metaverse. Users can utilize Avalanche to establish their own localized chains that run independently of other chains, allowing them to sandbox their own knowledge and technology for the benefit of their own efforts. Most developers use their own token for gas on their subnet; however, a subsidised gas fee is also an option. Avalanche allows network developers to utilize whatever virtual machine they want or to create their own. You may use the EVM or any other VM you like. Aside from the EVM and AvalancheVM, Avalanche now provides SpacesVM (key/value storage), BlobVM (binary storage), and TimestampVM (a minimum viable VM), with others in the works. Modularity rules the roost. Observing web2 games moving into web3 through subnets is a great place to start.

Some of our favourite emerging Subnets

It is worth noting that Avalanche gaming developers are taking a Play-and-Earn approach rather than a Play-to-Earn one. This emphasizes the necessity for the game to be enjoyable and long-lasting.

  • Shrapnel, the world’s first blockchain-enabled AAA first-person shooter game, has announced that it would use the Avalanche network as its foundation for its impending release. They want to establish a subnet devoted to the game using the Avalanche Subnet capabilities. Shrapnel is creating a novel AAA experience for gamers that puts competitive multiplayer, creative tools, and genuine digital ownership front and centre.
  • TimeShuffle is a play-and-earn turn-based strategy game in which warriors from across history battle in randomly created battlefield settings. Each player may begin their conquest with a free-to-play hero and advance their heroes as they play, unleashing the full potential of cryptocurrency gaming and the play-and-earn paradigm.
  • Ascenders is a sci-fantasy, open-world action RPG powered by Avalanche with a fully decentralized, player-driven economy. Players may participate in daily tasks for AGC tokens while also producing NFT products and land plots. The game’s development team has concentrated on developing a truly player-centric experience. The first alpha release of the gameplay is scheduled in the coming months.
  • Ragnarok is one of the most hyped initiatives in both the NFT and blockchain gaming sectors. Earlier this year, the official NFT collection debuted on Ethereum. The team is now working on creating one of the most thrilling gaming experiences to hit the Avalanche blockchain, positioning subnets as an alternative to Ethereum. The game will unveil the first 77 in-game playable characters this month. Find many more subnet projects here.

Subnet Disclaimers

  • Games on the blockchain still need to prove themselves as the next big thing. We have yet to see real adoption of GameFi, which will be the true test of the technology.
  • The biggest drawback of subnets is that there is no Inter-Blockchain Communication (IBC) protocol yet. This means that subnets need to bridge to one another, which is less secure than IBC. Ava Labs announced at their first Avalanche Summit that this is being considered, but it is still in the early stages. For the time being, only projects within the same subnet are able to benefit from shared security.

Conclusion

Overall, blockchain games continue to be one of the most appealing parts of the dApp market. Although demand for blockchain games looks to have peaked, gaming dApps continue to drive most of the industry’s on-chain activities. Notably, subnet games like Crabada and Defi Kingdoms are still drawing players even in a difficult 2022.

VCs and investors are pouring money into Web3 gaming ventures at an all-time high pace. Furthermore, financial firms like Morgan Stanley have assessed the metaverse's economic potential to be at least an $8 trillion market. The Sandbox's second Alpha season, Decentraland's Fashion Week, and the overwhelming demand for NFT Worlds indicate a positive future for GameFi. However, security risks such as the Ronin bridge vulnerability and the difficulties of attaining full interoperability remind everyone interested that widespread adoption is not here yet. The Avalanche Foundation believes that subnets like Shrapnel and TimeShuffle are the solution for the next generation of gaming, and thus launched Avalanche Multiverse last March, a $290 million incentive program to accelerate the growth of the new Internet of Subnets.

Core Research
Networks
Upcoming Upgrades on Solana That Will Improve Network Performance
Solana has announced three main changes in its mitigation plan to address the stability and resilience of the network: QUIC, Stake Weighted QoS, and Fee Markets
June 15, 2022
5 min read

Solana has announced three main changes in its mitigation plan to address the stability and resilience of the network:

  1. QUIC
  2. Stake Weighted QoS
  3. Fee Markets

The measures target the intense traffic responsible for two of the three recent incidents. Although the changes proposed by Solana developers may seem abstract or deeply technical to much of the community, the concepts are not completely new, being imported from other, already mature systems. In this article, we will try to break down the technicalities and explain them in simple terms.

The current Solana client version for validator nodes (v1.10) already paves the way for some of these improvements to be iterated on until optimal market fit. Fee prioritization is targeted for the v1.11 release, according to the official announcement.

Some Context on Network Communication Protocols

Solana has so far used the User Datagram Protocol (UDP) for transmitting transactions between nodes in the network. Nodes send transactions through UDP directly to the leader (the staked node responsible for proposing the block in that particular slot) without a connection being established beforehand. UDP does not handle traffic congestion or delivery confirmation. In situations of network congestion, the leader is unable to handle the volume of incoming traffic, which means some packets get dropped. Even at quiet times, some level of packet loss is normal. By sending the same transaction multiple times, users have a greater chance that at least one of their attempts will arrive.

Fig. 1: Illustration of UDP protocol, featuring multiple data transfers that can burden the receiver and lose packets in the way.

In contrast to UDP stands the Transmission Control Protocol (TCP). TCP includes more sophisticated features, but for these to work it requires a session (i.e. a known connection previously established between the client and the server). The receiver acknowledges ("acks") packets, and the sender knows when to stop sending packets in case of intense traffic. TCP also allows lost packets to be re-transmitted: once the sender stops receiving acks, the interpretation is that something was lost, so the sender should slow down.

TCP is not ideal for some use cases though. In particular, it sequences all traffic. If one portion of the data is lost, everything after it needs to wait. That is not great for Solana transactions, which are independent.

Fig. 2: Illustration of TCP protocol, featuring serial data transfer. Lost packets affect subsequent ones since the sender waits for server acknowledgement.

1. QUIC

QUIC is a general-purpose protocol which is used by more than half of all connections from the Chrome web browser to Google’s servers. QUIC is the name of the protocol, not an acronym.

QUIC is an alternative to TCP with similar features: it establishes a session, which enables backpressure to slow the sender down. But it also has a concept of separate streams, so if one transaction gets dropped, it does not need to block the remaining ones.

Fig. 3: Illustration of the QUIC protocol, representing a mix of features from TCP and UDP.

Solana is a permissionless network. Anyone running a Solana client is a “node” in the network and is able to send messages to the leader. Nodes can operate as validators, signing and sending votes, and/or they can expose their RPC (Remote Procedure Call) interface to receive messages from applications such as wallets and DEXs and forward them to the leader.

The leader listens on a UDP port and RPCs listen on a TCP port. Given the leader schedule is public, sophisticated players with algorithmic strategies (“bots”) are able to send transactions to the leader directly, bypassing any additional RPC nodes that would only increase latency. With the leader being spammed, the network gets congested and that deteriorates performance. The UDP port used by the leader will be replaced by a QUIC port.

2. Stake Weighted QoS

Quality of Service (“QoS”) is the practice of prioritizing certain types of traffic when there is more traffic than the network can handle.

Last January, after Solana faced performance issues as automated trading strategies (aka “liquidator bots”) spammed the network with more than 2 million packets per second, mostly duplicate messages, Anatoly Yakovenko mentioned in a tweet that they would bring the QoS concept to Solana.

The leader currently tries to process transactions as soon as they arrive. Because sender IPs are verifiable through QUIC, validators will be able to prioritize and limit the traffic for specific connections. Instead of validators and RPCs blasting transactions at the leader as fast as they can, effectively DoS’ing the leader, they would maintain a persistent QUIC connection. If a particular connection (IP) congests the network, it will be possible to identify it and apply policies to high-traffic connections, limiting the number of messages that node can send (“throttling”). These policies are known as QoS.

Internally, stake-weighted QoS means queuing transactions in different channels depending on the sender, weighted by the amount of SOL staked. Non-staked nodes will then be incentivized to send transactions to staked nodes first, instead of sending directly to the leader, for a better chance of finding execution, since excess messages from non-staked nodes will most likely be dropped by the leader.
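As a rough sketch of what stake-weighting could look like (the names, stakes, and packet budget below are invented for illustration; this is not Solana’s actual implementation), a validator could split a fixed packet budget among its peers in proportion to their stake:

```rust
use std::collections::HashMap;

/// Hypothetical per-connection quota: split a fixed packet budget among peers
/// in proportion to their stake, so heavily staked nodes get more capacity.
fn packet_quotas(stakes: &HashMap<&str, u64>, total_budget: u64) -> HashMap<String, u64> {
    let total_stake: u64 = stakes.values().sum();
    stakes
        .iter()
        .map(|(peer, stake)| {
            let share = if total_stake == 0 {
                0
            } else {
                total_budget * stake / total_stake
            };
            (peer.to_string(), share)
        })
        .collect()
}

fn main() {
    // Made-up stakes in SOL; an unstaked node gets (close to) nothing.
    let stakes = HashMap::from([
        ("validator-a", 2_000_000),
        ("validator-b", 500_000),
        ("unstaked-rpc", 0),
    ]);
    for (peer, quota) in packet_quotas(&stakes, 10_000) {
        println!("{peer}: may keep {quota} packets in flight per scheduling window");
    }
}
```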

According to Anatoly, validators will be responsible for shaping their own traffic and applying policies that guard against abuse. For example, if a particular node sends huge amounts of transactions, even if it is staked, validators can take action and ignore the connections established with this node in order to protect network performance.

3. Fee Markets

Solana fees are currently fixed and charged for each signature required in a transaction (5,000 lamports = 0.000005 SOL). If there is high competition in a specific market, users face the risk of not getting their transactions executed. With a fixed transaction fee, there is no way for users to communicate priority or compete by paying more to get their transaction prioritized. Without alternatives, users (usually bots) spam transactions to the leader (and soon-to-be leaders) in the hope that at least one of them is successful. In many situations, this behavior generates more traffic than the network can process.

A priority fee is soon to be included in Solana, allowing users to specify an arbitrary “additional fee” to be collected upon execution of the transaction and its inclusion in a block. This mechanism would not only help the network prioritize time-sensitive transactions but also tends to reduce the number of invalid or duplicated messages sent by algorithms, since speculative operations can become unprofitable as the total cost increases.

The ratio of this fee to the requested compute units (the computational cost for the program to perform all of its operations) will serve as a transaction’s execution priority weight. This ratio will be used by nodes to prioritize the transactions they send to the leader. Additional fees will be treated identically to the base fee today: 50% of the fees paid will be collected by the leader and 50% will be burned.
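A hedged sketch of this weighting, with invented transaction names and figures rather than anything taken from the Solana codebase, would order pending transactions by their lamports-per-compute-unit ratio and split any fee paid evenly between the leader and the burn:

```rust
/// Illustrative pending transaction: an additional (priority) fee in lamports
/// and the compute units it requests. The values are made up.
struct PendingTx {
    id: &'static str,
    additional_fee: u64,
    compute_units: u64,
}

impl PendingTx {
    /// Lamports offered per requested compute unit: the execution priority weight.
    fn priority(&self) -> f64 {
        self.additional_fee as f64 / self.compute_units as f64
    }
}

fn main() {
    let mut pending = vec![
        PendingTx { id: "nft-mint", additional_fee: 50_000, compute_units: 200_000 },
        PendingTx { id: "swap", additional_fee: 10_000, compute_units: 20_000 },
        PendingTx { id: "transfer", additional_fee: 0, compute_units: 5_000 },
    ];

    // Highest lamports-per-compute-unit first.
    pending.sort_by(|a, b| b.priority().partial_cmp(&a.priority()).unwrap());

    for tx in &pending {
        // Like the base fee, half of what is paid goes to the leader, half is burned.
        let to_leader = tx.additional_fee / 2;
        let burned = tx.additional_fee - to_leader;
        println!(
            "{}: priority {:.3} lamports/CU, {} lamports to leader, {} burned",
            tx.id, tx.priority(), to_leader, burned
        );
    }
}
```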

At this point, you might imagine several blocks being filled only with transactions targeting an NFT mint. However, there is a limit on how long each account can be locked for writing within a single slot (600 to 800 milliseconds). The remaining block space can be filled with transactions writing to different accounts, even if they offer a smaller priority fee. High-priority transactions trying to write to an account that has already reached its limit will be included in the next block.

Each Solana transaction specifies its writable accounts — the portion of the state that will be modified. This allows transactions to be executed in parallel, as long as they are independent, i.e. do not access the same accounts. If two transactions touch the same account and at least one of them writes to it, they cannot be processed in parallel, because they affect the same state.
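A toy way to picture this rule, with made-up account names and a deliberately simplified lock model, is to treat two transactions as conflicting whenever one of them writes an account that the other touches:

```rust
use std::collections::HashSet;

/// Simplified view of a transaction: the accounts it writes and the accounts it only reads.
struct Tx<'a> {
    writable: HashSet<&'a str>,
    readonly: HashSet<&'a str>,
}

/// Two transactions conflict if either one writes an account the other touches.
fn conflicts(a: &Tx, b: &Tx) -> bool {
    a.writable.iter().any(|acc| b.writable.contains(acc) || b.readonly.contains(acc))
        || b.writable.iter().any(|acc| a.readonly.contains(acc))
}

fn main() {
    let mint = Tx {
        writable: HashSet::from(["nft_mint_state", "buyer_1"]),
        readonly: HashSet::from(["token_program"]),
    };
    let swap = Tx {
        writable: HashSet::from(["amm_pool", "trader_7"]),
        readonly: HashSet::from(["token_program"]),
    };

    // Both only read `token_program`, and their writable sets are disjoint,
    // so these two transactions can be executed in parallel.
    println!("mint and swap conflict: {}", conflicts(&mint, &swap));
}
```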

The Solana team argues that the priority fee will then behave as parallel auctions, affecting only the “hot market” instead of the global price, allowing the fee to grow for a specific queue of transactions trying to write in that account only.

How does the user know which fee to adopt to get their mint through? RPC nodes will need to estimate an adequate fee, most likely using a simple statistical method, for example averaging the actual cost of similar transactions in the previous N blocks, or taking a quantile. The optimal method will depend on the market, and on whether fees for similar transactions are volatile (high demand) or stable (low demand).
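For instance, a minimal estimator along those lines (an assumption about how an RPC node might proceed, not a documented Solana API) could take the mean or a percentile of the priority fees paid by similar transactions in recent blocks:

```rust
fn main() {
    // Priority fees (in lamports) paid by similar transactions over the last N blocks.
    // The numbers are invented for illustration.
    let mut recent_fees: Vec<u64> = vec![0, 500, 1_000, 1_500, 2_000, 2_500, 3_000, 200_000];
    recent_fees.sort_unstable();

    let mean = recent_fees.iter().sum::<u64>() / recent_fees.len() as u64;

    // Nearest-rank 75th percentile: less sensitive to the single huge outlier than the mean.
    let p75 = recent_fees[recent_fees.len() * 75 / 100];

    println!("suggested fee (mean): {mean} lamports");
    println!("suggested fee (75th percentile): {p75} lamports");
}
```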

In practice, the priority fee can have a global effect if the parallel auctions are not implemented on the validator client. With RPCs and users being responsible for setting it arbitrarily, during intense traffic applications will likely try to get priority even though they do not interact with the “hot market”, causing an increase in the fee price for other, lower-demand dApps.

In Short

This article covered the three pieces Solana is actively working on to deal with congestion issues: changing the communication protocol from UDP to QUIC, adding stake-weighted QoS for transaction prioritization, and introducing a fee market that raises fees under high demand. All three improvements aim to boost the performance of Solana, which has been experiencing degraded performance quite often.

We hope this helped clarify these concepts and the motivations behind the choices being made. Exploring the Solana source code would be a natural next step to investigate the exact metrics implemented in QoS to select or drop transactions, the mechanism behind the increase (and decrease) of fees, and other questions that remain unanswered.

I would like to thank the Chorus One team for the enlightening discussions and knowledge sharing, especially Ruud van Asseldonk for the technical review, and Xavier Meegan for the support.
