Blog

Guides
Networks
How to stake ETH on OPUS via Ledger/MetaMask
A step-by-step guide to staking ETH on OPUS via Ledger/MetaMask
July 26, 2023
5 min read

This step-by-step guide is designed to help you stake Ethereum on OPUS. Throughout this guide, we will break down the process into simple, manageable steps.

1. Connect Ledger to MetaMask

  • We assume that you have a Ledger with some ETH and are familiar with using it. Please reach out to our team for any support here.
  • Please follow the instructions found in this link to connect Ledger with Metamask: https://support.ledger.com/hc/en-us/articles/4404366864657-Connect-your-Ledger-to-MetaMask?docs=true

💡 Tip: If you face the error 0x650f, please follow this link to resolve it.

💡 Tip: After this configuration, MetaMask does not have access to Ledger private keys. This configuration allows Ledger to use MetaMask as a visual interface.

2. Enable Blind Signing on Ledger by following the steps shown in this link: https://support.ledger.com/hc/en-us/articles/4405481324433-Enable-blind-signing-in-the-Ethereum-ETH-app?support=true

You have now successfully connected Ledger to MetaMask. The next step is to log in to the OPUS Portal.

3. Our sales team should have sent you your login credentials. If not, please reach out to them here.

4. Now, enter your organisation name and log in with SSO.

5. Connect MetaMask to OPUS.

6. Select Amount of ETH using the Slider

💡 Tip: The OPUS staking slider lets you stake up to 800 ETH (25 validators) in one transaction.

💡 Tip: The OPUS staking screen shows the backward-looking APR and projected rewards.
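
If you want to sanity-check an amount before confirming, the slider maps stakes to whole validators at 32 ETH each. A tiny illustrative check (our own sketch, not part of OPUS):

```typescript
// Illustrative sanity check (not part of OPUS): stakes map to whole validators of 32 ETH each.
function validatorsFor(amountEth: number): number {
  if (amountEth % 32 !== 0 || amountEth < 32 || amountEth > 800) {
    throw new Error("Amount must be a multiple of 32 ETH, between 32 and 800 ETH per transaction.");
  }
  return amountEth / 32;
}

console.log(validatorsFor(800)); // 25 validators
```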

7. Confirm the stake transaction on MetaMask.

8. Approve the transaction on Ledger

  • Steps involved: Review Transaction > Blind Signing > Amount 800 ETH > Address > Network > Max Fees > Accept and Send

You have now staked Ethereum on OPUS! To stake more, follow this guide from step 6.

If you are facing any issues, please reach out to us at Chorus One support.

About Chorus One

Chorus One is one of the biggest institutional staking providers globally, operating infrastructure for 40+ Proof-of-Stake networks including Ethereum, Cosmos, Solana, Avalanche, and Near, amongst others. Since 2018, we have been at the forefront of the PoS industry and now offer easy enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge protocols through Chorus Ventures.

Opinion
Networks
Others
Ethereum Stake and Unstake 101
We take a look at expected times to participate in Ethereum staking.
July 3, 2023
5 min read

Ethereum protocol times are measured in epochs, with 1 epoch being 384 seconds, or about 6.4 minutes. For ease of understanding, times based on these measurements have been translated roughly into minutes, hours and days.

Staking

  • Every signed transaction visits the Ethereum mempool first, which can be referred to as the waiting room for transactions. Here the pending time is unknown, and will depend on network status, chosen gas fee and priority fee.
  • Deposited - Once the transaction reaches the deposit contract (assuming it’s correct), the validator has a status of Deposited, meaning it has been accepted by the Execution Layer. At this point, the Consensus side is unaware of this deposit. Here there’s a waiting mechanism that is a legacy of our pre-Merge past, used to avoid the minuscule possibility of a chain reorg (might be removed in the future). The validator sits here for ~7 and a half hours.
  • Pending - Once the deposit has finally been accepted by the Consensus Layer, the validator moves to the Pending state, meaning it has been added to the “staking queue”. Ethereum only allows a small number of validators to start or stop validating at a time (~2,025 a day*) to maintain the stability of the validator set. This takes at least 25 minutes but can extend depending on the queue; right now it’s ~6 weeks. Timing can be impacted by churn rate changes, so this exact figure might be slightly lower.
Pending validators: 88,885 (activation queue) / 25 (exit queue). Source: https://beaconcha.in/

  • Active - The validator is attesting to and proposing blocks. This is the state where you earn rewards!

Conclusion: Staking takes at least 8 hours, but it is very likely to take a lot longer as the demand to stake grows and more validators are added to the queue (the queue at the time of writing is 88,885 validators waiting). The waiting time right now is about a month and a half.
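
For a rough sense of where that month-and-a-half figure comes from, you can estimate the wait from the queue length and the daily activation rate. A back-of-the-envelope sketch using the figures quoted in this post (real timing moves with the queue and the churn rate):

```typescript
// Back-of-the-envelope estimate of the activation wait, using the figures quoted above.
const pendingValidators = 88_885;   // validators currently waiting in the activation queue
const activationsPerEpoch = 9;      // current churn rate (see the footnote below)
const epochsPerDay = 86_400 / 384;  // 225 epochs per day

const activationsPerDay = activationsPerEpoch * epochsPerDay;  // 2,025 per day
const waitDays = pendingValidators / activationsPerDay;        // ~43.9 days

console.log(`~${waitDays.toFixed(0)} days (~${(waitDays / 7).toFixed(1)} weeks) until activation`);
```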


Unstaking

  • Exiting - The validator is in the process of halting attesting to and proposing blocks. This means you have signed a valid voluntary exit or you’ve been slashed. Either way, you need to keep up your duties for now. Being in the exiting state means you’ve been added to the “withdrawal queue”. Moving through this queue takes at least 25 minutes but can extend depending on the queue; right now it’s 25 validators.
Pending validators: 88,885 (activation queue) / 25 (exit queue). Source: https://beaconcha.in/

  • Slashing - Slashing is a very small risk in Ethereum, but for informational purposes, let’s talk about this state. The validator is in the process of halting attesting to and proposing blocks, having been caught violating some consensus rule. You still have to perform duties, but you will get kicked out of the set. You also move through the queue but you’ll need 36 days to access the funds to account for extra penalties.
  • Exited - The validator is no longer attesting or proposing, you are safe to disconnect the node clients. After exit there is one final delay of approximately 1 day before funds can be transferred to the withdrawal address.

Conclusion: Unstaking takes at least 25 minutes, but can vary depending on the withdrawal queue with a similar model as staking (the queue right now is 25 validators waiting and is expected to clear quite quickly). You also have a 1 day delay to access funds. So, all in all the waiting time right now is about 1 day.

* This number corresponds to the churn rate applied to the staking and withdrawal queues. For every 65,536 additional validators that are active on the Ethereum network, the number of new validators that can be activated per epoch increases by one, and the number of validator exits that can be processed per epoch also increases by one. Right now, the churn rate is 9.
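
For reference, a small sketch of the churn-limit arithmetic described above; the floor of 4 comes from the consensus spec's MIN_PER_EPOCH_CHURN_LIMIT, and the figures are illustrative:

```typescript
// Per-epoch validator churn limit: max(4, activeValidators / 65,536), per the consensus spec.
const MIN_PER_EPOCH_CHURN_LIMIT = 4;
const CHURN_LIMIT_QUOTIENT = 65_536;

function churnLimit(activeValidators: number): number {
  return Math.max(MIN_PER_EPOCH_CHURN_LIMIT, Math.floor(activeValidators / CHURN_LIMIT_QUOTIENT));
}

// With roughly 600k active validators the limit is 9 per epoch,
// i.e. 9 * 225 epochs/day = 2,025 activations (and exits) per day.
console.log(churnLimit(600_000)); // 9
```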

News
Networks
Network 101 - Archway: The blockchain built for developers
Chorus One announces staking support for Archway Network
July 3, 2023
5 min read

Amidst the dynamic blockchain landscape, Archway Network stands out as a platform that has captured the attention of developers and enthusiasts alike. This blog delves into the unique features and opportunities that Archway offers, shedding light on its architecture, tokenomics, use cases, developer rewards, and recent developments that establish it as a prominent player in the ecosystem.

Archway’s Architecture: Where Innovation Thrives

Archway Network is a testament to visionary architecture. By leveraging the Cosmos SDK, Tendermint, and CosmWasm, the Archway team have built an infrastructure that excels in speed, scalability, and security. What truly sets Archway apart is its seamless interoperability through the Inter-Blockchain Communication (IBC) protocol, which fosters a cohesive ecosystem where data and value can flow freely between different blockchains.

Unlike L1 blockchains that primarily focus on token distribution to early participants, Archway takes a different approach. It recognizes the value and impact of developers and builders by incentivizing them based on their contributions to the network. This unique model aims to level the playing field among developers, providing equal access to capital and support, regardless of their connections or associations.

Tokenomics: Incentivizing Developer Contributions

Archway introduces a novel approach to developer rewards by exploring revenue distribution alternatives. Beyond gas fees, developers building on the network are incentivized through a meticulously designed combination of gas rebates, inflation, and premiums. This multifaceted reward system ensures that developers are recognized and rewarded for their invaluable contributions to the network.

Gas fees are not just divided up among validators and dApps, but split evenly between them, ensuring a fair distribution of rewards. But Archway doesn't stop there. It pushes the boundaries further: with the inflation rate at 10% and an expected APR of around 21% at launch, developers have a stake in shaping the network's future.

A quarter of the inflation is allocated to the dApps rewards pool, a vibrant ecosystem where developers are rewarded based on the gas generated by their applications within a given epoch. Additionally, developers have the freedom to set custom fees for interactions with their smart contracts, enabling them to earn 100% of the charges and fostering a direct stake in their application's success. By seamlessly embedding these fees within the network fee, Archway Network delivers a user-friendly experience, sparing users from the complexity of multiple fees.
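
To make the split concrete, here is a simplified sketch of how a dApp's rewards for one epoch could be tallied under the model described above. It is an illustration only, not Archway's actual reward module; the names and exact accounting are our own assumptions:

```typescript
// Simplified illustration of Archway's developer reward streams (assumptions, not the real module).
interface EpochInputs {
  gasFeesFromDapp: number;     // gas fees generated by this dApp's contracts in the epoch
  totalGasAllDapps: number;    // gas generated by all dApps in the epoch
  epochInflation: number;      // ARCH minted through inflation in the epoch
  customContractFees: number;  // custom fees the developer set on their contracts
}

function estimateDeveloperRewards(e: EpochInputs): number {
  const gasRebate = e.gasFeesFromDapp * 0.5;                 // gas fees split evenly with validators
  const dappPool = e.epochInflation * 0.25;                  // a quarter of inflation goes to the dApp pool
  const poolShare = dappPool * (e.gasFeesFromDapp / e.totalGasAllDapps); // pro rata by gas generated
  const customFees = e.customContractFees;                   // developers keep 100% of custom fees
  return gasRebate + poolShare + customFees;
}
```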


Use Cases

Archway Network opens up a wide range of possibilities for developers and entrepreneurs. By rerouting their rewards to a shared pool, developers can subsidize gas fees for users, creating a more familiar and accessible experience akin to traditional web applications. This user-centric approach revolutionizes the way people interact with blockchain-powered applications, removing the burden of high transaction costs and propelling mainstream adoption.

Moreover, Archway Network empowers developers to swiftly launch their dApps without the need to bootstrap a standalone chain. For early-stage projects struggling to secure funding or establish a secure blockchain, Archway provides a springboard for testing product-market fit and scaling ambitions. Developers can prototype and iterate on Archway before transitioning to their own appchain or rollup, amplifying their chances of success.

Archway isn't solely focused on providing a versatile blockchain infrastructure—it also fosters a vibrant and supportive developer community. By offering a plethora of hackathons, workshops, grants, bug bounties, and developer-focused initiatives, Archway stimulates a culture of innovation and collaboration. Developers are incentivized to create new modules, tooling, and applications that enrich the ecosystem and unlock new possibilities.

Check out some of Archway’s key initiatives here:

Hackathons: https://blog.archway.io/tagged/hackathons

Workshops: Archway Workshops

Grants: https://blog.archway.io/accelerating-value-capture-the-archway-foundation-grants-program-40f3edbdf9

Governance

Archway Network implements a governance model that allows participants and token holders to actively shape the protocol's future. Through proposals and on-chain voting, Archway's decentralized community can influence the direction of the platform. Governance is facilitated by their native token, $ARCH, which ensures fair and transparent participation. Holders of the token can propose changes and vote on active proposals, with consensus being reached through a defined threshold. Engaging with the Archway community involves actively participating in governance by submitting proposals or casting votes on existing ones.

Recent Developments

In its relentless pursuit of excellence, Archway Network has achieved several milestones that highlight its potential as a catalyst for change. Notably, the launch of its incentivized testnet, the successful completion of multiple security audits, and the adoption of WebAssembly (Wasm) have garnered attention from developers and blockchain enthusiasts alike. Now with the mainnet launch, Archway is poised to reshape the blockchain landscape, offering an unprecedented level of developer empowerment and accessibility.

Deep Dive into Archway network: https://www.youtube.com/watch?v=TCoTNlzohIo

Useful resources:

Website: https://archway.io

Twitter: https://twitter.com/archwayHQ

Medium: https://medium.com/@archwayHQ

Github: https://github.com/archway-network

Docs: https://docs.archway.io

Discord: https://discord.com/invite/5FVvx3WGfa

Reddit: https://www.reddit.com/r/Archway/

Telegram: https://t.me/archway_hq

Staking $ARCH with Chorus One

Inflation rate: 10%

Staking APR: expected ~21% at launch (with 35% of ARCH staked)
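
As a rough cross-check of that ~21% figure, assuming stakers receive the inflation that is not allocated to the dApp rewards pool (our own reading, not official Archway math):

```typescript
// Back-of-the-envelope staking APR estimate; the staker share is an assumption, not official math.
const inflation = 0.10;       // 10% annual inflation
const dappPoolShare = 0.25;   // a quarter of inflation assumed to go to the dApp rewards pool
const stakedRatio = 0.35;     // 35% of ARCH staked at launch

const stakingApr = (inflation * (1 - dappPoolShare)) / stakedRatio;
console.log(`${(stakingApr * 100).toFixed(1)}%`); // ~21.4%, before validator commission
```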

To start staking with Chorus One, reach out to staking@chorus.one.

About Chorus One

Chorus One is one of the biggest institutional staking providers globally, operating infrastructure for 40+ Proof-of-Stake networks including Ethereum, Cosmos, Solana, Avalanche, and Near, amongst others. Since 2018, we have been at the forefront of the PoS industry and now offer easy enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge protocols through Chorus Ventures.

Guides
Networks
How to stake MATIC (Polygon)
A step-by-step guide to staking MATIC with Chorus One
June 5, 2023
5 min read

Polygon is a Layer 2 scaling solution built on Ethereum that aims to provide multiple tools to improve the speed and reduce the cost and complexities of transactions on blockchain networks.

Overview

  1. To start staking $MATIC, first log in to https://staking.polygon.technology/ on the browser of your choice.

Ensure that the browser has integrated any of the wallets supported by Polygon.

  2. Then, click on Login and connect to the wallet of your choice. Click ‘View all’ to see all the wallets supported by Polygon. We have chosen MetaMask.
  3. Once you have connected your wallet, click on ‘Become a Delegator’, and search for ‘Chorus One’ amongst the list of available validators.

Click on ‘Chorus One’ to verify all the details. Ensure that the Validator address (shown as ‘Owner’) is 0xbbd83024be631bb6f3dd3c0363b3d43b5d91c35f.

Note: The commission rate to stake $MATIC with Chorus One is 5%.

  4. Once you have verified all the details, click ‘Become a Delegator’.
  5. Next, enter the amount of $MATIC you would like to stake. Then, click ‘Continue’.
  6. You will be redirected to your wallet to approve the transaction, which will take a few minutes.
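
If you prefer to interact with the staking contracts programmatically instead of through the dashboard, the flow is: approve MATIC to Polygon's StakeManager on Ethereum, then delegate through the validator's ValidatorShare contract. The sketch below is illustrative only; the addresses are placeholders and the buyVoucher ABI is an assumption, so verify everything against Polygon's official staking documentation before sending funds.

```typescript
import { ethers } from "ethers";

// Illustrative sketch only: all addresses are placeholders and the ValidatorShare ABI is an
// assumption. Verify against Polygon's official staking docs before sending any funds.
const provider = new ethers.JsonRpcProvider("https://ethereum.example-rpc.com"); // placeholder RPC
const signer = new ethers.Wallet(process.env.PRIVATE_KEY!, provider);

const MATIC_TOKEN = "0x...";      // placeholder: MATIC ERC-20 token on Ethereum
const STAKE_MANAGER = "0x...";    // placeholder: Polygon StakeManager proxy
const VALIDATOR_SHARE = "0x...";  // placeholder: Chorus One's ValidatorShare contract

const matic = new ethers.Contract(
  MATIC_TOKEN,
  ["function approve(address spender, uint256 amount) returns (bool)"],
  signer,
);
const validatorShare = new ethers.Contract(
  VALIDATOR_SHARE,
  ["function buyVoucher(uint256 amount, uint256 minSharesToMint) returns (uint256)"], // assumed ABI
  signer,
);

async function delegate(amountMatic: string) {
  const amount = ethers.parseUnits(amountMatic, 18);
  await (await matic.approve(STAKE_MANAGER, amount)).wait();   // allow the StakeManager to pull MATIC
  await (await validatorShare.buyVoucher(amount, 0n)).wait();  // delegate to the validator
}

delegate("100").catch(console.error);
```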

You have now completed the process and staked your $MATIC with Chorus One!  

About Chorus One

Chorus One is one of the biggest institutional staking providers globally, operating infrastructure for 40+ Proof-of-Stake networks including Ethereum, Cosmos, Solana, Avalanche, and Near, amongst others. Since 2018, we have been at the forefront of the PoS industry and now offer easy enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge protocols through Chorus Ventures. We are a team of over 50 passionate individuals spread throughout the globe who believe in the transformative power of blockchain technology.

Networks
News
Chorus One announces staking support for Sui Network
Sui is a Layer 1 blockchain and smart contract platform designed to make digital asset ownership fast, private, secure, and accessible to everyone.
May 4, 2023
5 min read

After three rounds of rigorous testnets, the Sui Network Mainnet is live, and Chorus One is proud to support the network as a genesis staking provider and validator.

What is SUI?

Sui Network is a permissionless Layer-1 blockchain and smart contract platform designed from the ground up to make digital asset ownership fast, secure, and accessible to the next generation of Web3 users. Its pioneering architecture is built to create a world-class developer experience, in addition to vastly improving the performance and user experience of L1 blockchains.

Sui Move

Sui uses Rust and supports smart contracts written in Sui Move -  a customized version of the Move programming language that enables the definition and management of assets with owners. These assets can be created, transferred, and mutated through custom rules defined in the smart contract, offering a flexible way to manage digital assets on the blockchain. This enables a vast range of use-cases such as tokens, virtual real estate, and more.

SUI’s unique design features

  1. Parallel agreement

Sui has a unique system design that allows it to scale horizontally and handle a high volume of transactions at low operating costs. Unlike other blockchains that require global consensus on all transactions, Sui enables parallel agreement on independent transactions through a novel data model and Byzantine Consistent Broadcast. This approach eliminates the need for global consensus and enhances scalability without compromising safety and liveness guarantees.

The object-centric view and Move's strong ownership types enable parallel execution of transactions that affect different objects while transactions that affect shared state are ordered through Byzantine Fault Tolerant consensus and executed in parallel.
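
Conceptually, the scheduling rule is simple: transactions that only touch disjoint owned objects never conflict and can run in parallel, while transactions touching shared objects are first sequenced by consensus. A toy sketch of that rule (our illustration, not Sui's actual executor):

```typescript
// Toy scheduler illustrating Sui's object-centric rule (not Sui's real executor).
type Tx = { id: string; ownedObjects: string[]; sharedObjects: string[] };

function schedule(txs: Tx[]): { parallelBatches: Tx[][]; consensusOrdered: Tx[] } {
  // Shared-object transactions go through BFT consensus ordering first.
  const consensusOrdered = txs.filter((t) => t.sharedObjects.length > 0);

  // Owned-object transactions can run in parallel whenever their object sets are disjoint.
  const parallelBatches: Tx[][] = [];
  for (const tx of txs.filter((t) => t.sharedObjects.length === 0)) {
    const batch = parallelBatches.find((b) =>
      b.every((other) => !other.ownedObjects.some((o) => tx.ownedObjects.includes(o))),
    );
    if (batch) batch.push(tx);
    else parallelBatches.push([tx]);
  }
  return { parallelBatches, consensusOrdered };
}
```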

  2. Scalability and Immediate Settlement

Sui’s scalability characteristic is highly innovative and distinct from existing blockchains that have bottlenecks. Currently, most blockchains have limited capacity to handle a high volume of transactions, resulting in slow processing times and expensive fees. This can lead to a poor user experience, particularly in gaming and financial applications. Sui addresses these issues by scaling horizontally to meet the demands of applications. It does this by adding more processing power through additional validators, resulting in lower fees and faster processing times even during periods of high network traffic.

  3. Novel Storage Ability

Sui allows developers to store complex assets directly on the blockchain, which makes it easier to create and execute smart contracts. This results in low-cost and horizontally scalable storage that enables developers to define rich assets and implement application logic. With this capability, new applications and economies can be created based on utility without relying solely on artificial scarcity.

SUI Tokens

Sui’s native token, SUI, has a fixed supply and is used to pay for gas fees. Additionally, users can earn rewards by staking their SUI tokens with validators like Chorus One. To learn more about how you can stake SUI with Chorus One, visit: https://chorus.one/articles/how-to-stake-sui-sui-network

Sui Use Cases

Sui enables developers to define and build:

  • On-chain DeFi and Traditional Finance (TradFi) primitives: enabling real-time, low latency on-chain trading
  • Reward and loyalty programs: deploying mass airdrops that reach millions of people through low-cost transactions
  • Complex games and business logic: implementing on-chain logic transparently, extending the functionality of assets, and delivering value beyond pure scarcity
  • Asset tokenization services: making ownership of everything from property deeds to collectibles to medical and educational records perform seamlessly at scale
  • Decentralized social media networks: empowering creator-owned media, posts, likes, and networks with privacy and interoperability in mind

Staking $SUI with Chorus One

SUI can be delegated to the Chorus One delegation pool.

Current Staking APR: 8.3%

For any other questions, reach out to staking@chorus.one

Useful links, tools, and resources

Website: https://sui.io

Twitter: https://twitter.com/SuiNetwork?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor

Docs: https://docs.sui.io/learn/about-sui

Explorer: https://suiscan.xyz/mainnet/home  

Discord: https://discord.com/invite/sui

GitHub: https://github.com/MystenLabs/sui

About Chorus One

Chorus One is one of the biggest institutional staking providers globally, operating infrastructure for 40+ Proof-of-Stake networks including Ethereum, Cosmos, Solana, Avalanche, and Near, amongst others. Since 2018, we have been at the forefront of the PoS industry and now offer easy enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge protocols through Chorus Ventures. We are a team of over 50 passionate individuals spread throughout the globe who believe in the transformative power of blockchain technology.

Opinion
Networks
Unstoppable games in Avalanche
Erwin Dassen explains how Avalanche's multichain architecture is ideal for developing an unstoppable blockchain game controlled by its users.
April 7, 2023
5 min read

What is this about

At Chorus One, we have a strong conviction in the potential of a multichain future. We believe that specialized blockchains play a crucial role in discovering and nurturing new use cases, and ultimately driving mainstream adoption. Since joining Chorus One about two years ago, I've been pushing for us to do the same in the Avalanche ecosystem as these two ecosystems have similar visions of what the multichain future can, should, and will look like.

Last year, we entered the Avalanche ecosystem. Our work will only intensify in the coming years with Chorus Ventures, our ventures arm, investing in native Avalanche projects. We also use our expertise in tokenomics and infrastructure to help projects launch their permissionless subnets. We will be presenting this topic both at the online Subnet Summit in mid-April and at Avalanche Summit II at the beginning of May.

In my view gaming is key to onboarding the next wave of users and a fundamental step in the road to mass adoption. This article aims to present the exciting future of blockchain gaming and demonstrate how the Avalanche architecture, particularly the multichain subnet architecture, is the ideal substrate for this vision. Through a two-part series, I will illustrate how one can develop an unstoppable game. By unstoppable, I mean a game that not even its creators can censor or stop if one day they move on to other projects. A game in control of its users.

So let's get to it.

Path of Exile

To make this exercise as clear as possible, I will look at a game I have plenty of experience with, having played around 2,000 hours. The game in question is Path of Exile, in my personal opinion the Diablo killer. This game needs no introduction, but:

  • It is consistently the number-two action-RPG (ARPG) on Steam in terms of concurrent players.
  • It has the most interesting in-game economy of any game I am aware of and competes with EVE Online in this regard. So much so that the community developed a variety of tools to track and facilitate the movement of goods.
  • It is completely free-to-play with no pay-to-win mechanics. Game profits come from cosmetic-only purchases.
  • It is fun! I've sunk over 2,000 hours into this game, and most hardcore gamers sink this amount of time into it per season.
  • The game is constantly refreshed with new mechanics, bosses and lore via the seasonal leagues which also boosts the revenue for the developers.
  • It is complex, with multiple mechanics and endless build options. Gamers evolve but stay entertained from noob level to YouTuber level.
  • Look at the passive skill tree!

Take a look here for some more gameplay videos.

I cannot emphasize enough how deep the economics goes in this game. The reason is that its economy is fundamentally tied to the crafting system for equipment, and to the simple fact that you need to craft gear yourself or buy it from someone if you want to reach the endgame. Purely random drops cannot take you there.

Every season a "league patch" is released with new content and the economy is reset. Characters and loot from previous seasons are still available to play in the "standard" league, and the standard league's economics are interesting in their own right. But as a driver of innovation, and to give new players the ability to compete on a more level playing field, these resets are very important.

The goal is thus to envision a version of PoE that is unstoppable and in the hands of gamers. You might ask: why would developers make such a game? To which I answer: the first one to do this becomes a first mover in a technology that soon will be expected from all games. And why will this be expected? Why do gamers want this? Well, this is a game you can continue playing and you can really own it. Like how it was in the dawn of console gaming. Be it real or game money, you can trade assets and no one can censor you. If you recall the anecdote, this was the reason Vitalik started his work in crypto.

The anatomy of an ARPG

In the centralized world, an ARPG like Path of Exile consists of a client/server platform where the server infrastructure is run by the game developer, and where the client is freely available or purchased in a marketplace. Next, we will look at the features and responsibilities of the server side, as this will be where our decentralization efforts mostly focus.

Anti-cheat

Client-side tampering with the game binary can enable all types of attacks and cheating. This is an arms race, but currently it is tackled via lock-step state validation. More on this later.

Randomness

Most games need randomness. For anti-cheat reasons, this is taken care of at the server level. In the case of PoE (and ARPG more broadly) this is even more important as loot, damage, map layout, and even AI are parameterized by random inputs.

Loot generation

Of fundamental importance for a healthy in-game economy is that the more powerful items are, the rarer they should be. That is, their drop rate should be lower. This is accomplished by drop-rate lookup tables that are set and maintained by the server. Again due to anti-cheat measures, it is the server that, when appropriate, generates a random drop.

Match-making

Even in PoE which tends to be dominated by PvE (player versus environment), there are situations where players interact: regions where PvP (player versus player) are allowed and sanctuary environments also called player hubs. These interactions need to be facilitated by the game server.

Trading

Special trading windows and functionalities are implemented so that players can exchange goods in a safe way.

The backends

Looking at the above set of functionalities that the server must provide, we can identify three different types of backends that the server infrastructure needs to maintain. These are the components we will need to "permissionlessly" decentralize. The following figure gives an idea of how the server-side interacts with these backends and the client (overlap indicates communication).

Queryable databases

There is a need for queryable databases, with loot tables clearly being one such need. But many more are present: leaderboards, player info, skill table, effect mechanics, and many more.

Content delivery

A key-value store that can deliver monolithic "chunks of bytes" is also a necessary backend. The game needs to ship itself and its updates with a big proportion being graphical assets. For this dedicated content-delivery networks are employed.

Anti-cheat logic

As mentioned before the server infrastructure needs to be able to keep clients in sync across PvE and PvP both for anti-cheat purposes and for facilitation of user interactions.

A short digression into Subnets

So why is Avalanche especially suited for this exercise? How will the architecture of such a game change and what technologies do we need to leverage to accomplish our goal of a decentralized, unstoppable ARPG game?

Avalanche has two genius breakthroughs in its design: its consensus being the first and the subnet architecture being the second. The latter is highly dependent on the former. Let's see why.

Avalanche consensus is without a doubt the most advanced consensus out there and is correctly categorized as a third type of Byzantine fault tolerant consensus following the discovery of signature accrual and Nakamoto consensus. It is the first meta-stable type of consensus algorithm. This consensus enjoys enviable properties: it scales easily with the number of validators, it is leaderless, and it is single-slot final. I won't go into much detail, but suffice it to say it accomplishes all of this by being a consensus algorithm based on a statement about an emergent property of the system. Let me explain what I mean. You can think of the network as having the property of being consistent (all validators agree on the current state). In Avalanche this property is emergent. Like the temperature of a gas, it exists as a property derived from the local interactions of its constituent “particles”. In the case of the gas, particles bouncing off each other, exchanging kinetic energy in their small neighborhood, give rise to the macroscopic property of temperature. In Avalanche, validators are the particles and, contrary to other consensus mechanisms, they interact only “locally”, that is, with a small number of validators that are randomly selected in each round. Somehow - and here there is a strong mathematical theorem behind it - this is enough for the network to have a well-defined sense of state history. Even in the presence of attackers.

It is the property of essentially limitless scaling in the number of validators that allows for the second genius move. You see, Cosmos is the originator of the concept of an app-chain. In this design, it is absolutely necessary that chains can "talk" to each other to really cover all the use cases one is interested in. For this reason, they developed the IBC framework. This is an elegant framework for trustless communication, but it imposes a significant requirement on a prospective chain: as a destination chain, you need to keep consensus information about any given source chain you want to communicate with, in the form of a light client. Wouldn't it be ideal if this information were globally available to all chains, from all chains? This is impossible with a limited set of validators.

So to have an unlimited set of app chains that can trustlessly communicate without having to keep light clients of every other chain they communicate with you need an unbounded set of validators in a global chain that keeps all this information. I hope you see where this is going: this is exactly the subnet design.

In Avalanche the main network that every validator must secure contains three chains: the P-chain (Platform chain), the X-chain (eXchange chain) and the C-chain (Contract chain). The X-chain - which is currently a DAG but will become a linear chain in the near future - is a chain made for high-throughput exchanges of assets, much like a blazing-fast Bitcoin network. The C-chain is what most users are most familiar with and is an EVM-based chain. It works just like Ethereum but faster and with instant finality. Great. But the real genius comes from the P-chain. This chain tracks all validation-related transactions of the mainnet and all subnets. This is what will enable the unbounded, composable network of app chains. Since all validators have the P-chain at hand, any two subnets can communicate directly, provided they want to. In IBC, on the other hand, with its hub-and-spoke design you have the unaddressed issue of path dependence.[^1]

So, we will leverage an Avalanche subnet for our game. The main reasons are the excellent scaling properties of its consensus and the application-specific, isolated nature of the subnet approach. On top of that, it supports cross-subnet transactions, allowing valuable assets to move around freely in the ecosystem. Last but not least, there is also VM2VM message passing that allows the validators in a subnet to easily check the state of other connected VMs, be that within the same subnet, on the mainnet, or in another subnet running on the same validator (the latter has not even been explored yet).

An Avalanche subnet is essentially the following:

  • The specification of a subset of validating nodes from the overall set of Avalanche mainnet validators.
  • The specification of a set of blockchains these validators should validate and for which their performance is monitored.

The set of validators is dynamic but can be either permissionless or permissioned. The specification of a blockchain is comprised of a specification of a subnet this blockchain pertains to and the specification of a VM (i.e. virtual machine) that characterizes the valid state transitions in that blockchain.

Ava Labs recently announced HyperSDK, a toolkit not unlike the Cosmos SDK, to help developers easily build VMs to power their subnets. From now on, they can focus on the logic of the application and worry much less about synchronization, consensus, state storage and availability, and other blockchain-heavy topics. On the other hand, if you want to, you can customize these aspects, as the SDK was built with modularity in mind.

See Avalanche platform and Subnets sections in the Avalanche documentation for more information on subnets and visit the HyperSDK repository which is open for contributions.

The game architecture

As mentioned before, our intent is to decentralize the game. For this, we will need to decentralize the server infrastructure, mainly the three points named above: databases, content delivery, and anti-cheat logic. This will be done by defining specialized VMs and the corresponding blockchains for each of those game infrastructures. All of this is packaged in the game server binaries which will be run by the validators in the subnet.

A game client will essentially be submitting transactions to the server network. Clearly, the game client is responsible for client-side rendering which is something we do not need to bother with on the server side. In terms of execution hardware, the game server is much lighter than the client and we will exploit this.

Keep in mind that being a player does not mean you can’t be a validator as well or a delegator to a Chorus One validator ;). This is obvious but worth mentioning as this means that for the first time ever a game can actually be in the hands of the players. With governance, even the game features and roadmap can be decided, paid for, and rolled out completely in a decentralized fashion.

So the big question: what are the blockchains, VMs and technologies used for this purpose? We dive in.

Content delivery via BlobVM

The BlobVM already exists in an advanced prototype stage. It was developed by Ava Labs and is available as open source. What it does is provide dedicated, seamlessly integrated (at the subnet level) content-addressable storage with customizable parameters regarding read/write permissions and persistence.

We use BlobVM for storing all art, texture, and models, i.e., all game assets. Even the game client binary can be updated via this method. In a fresh install, an externally downloaded game client connects, and sends a transaction to download all necessary assets. Note that this transaction could be a way to monetize the game but this is optional of course. In other words, this transaction would give you a game license NFT.
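
The key property of content-addressable storage is that an asset's key is the hash of its bytes, so any client can verify that what it downloaded is exactly what was published. A minimal generic sketch of that idea (not the BlobVM API):

```typescript
import { createHash } from "crypto";

// Generic content-addressing sketch (not the BlobVM API): the key is the hash of the bytes,
// so a client can verify any downloaded asset against the address it requested.
function contentAddress(blob: Buffer): string {
  return createHash("sha256").update(blob).digest("hex");
}

function verifyDownload(expectedAddress: string, blob: Buffer): boolean {
  return contentAddress(blob) === expectedAddress;
}

const texture = Buffer.from("example game asset bytes");
const key = contentAddress(texture);        // published alongside the asset
console.log(verifyDownload(key, texture));  // true if the bytes arrived intact
```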

NFT and player asset tracking with AVM

Now, as mentioned before, we want to give power and value to the gamers. Path of Exile is famous for its rich economy and is a formidable laboratory for NFT tokenomics. By giving the gamers the option to mint any found loot item, we give this economy real value. There are plenty of opportunities and pitfalls here, enough to fill another article, but it is important to mention that PoE works by having multiple “leagues” which give an opportunity to regularly “reset” the economy and give new players a chance to “make it”. We think this is an important aspect to keep in the decentralized version of this game. As an example of how we could explore this, we can configure it so that minted NFTs only work in the current and previous leagues.

For tracking a gamer’s collection we use the AVM, the Avalanche VM, which is a DAG (directed acyclic graph) based on the UTXO model, capable of massive throughput. In fact, this is the underlying VM of the mainnet’s X-chain. Note that since the announcement of Cortina (the next dot release of the Avalanche validator client) the X-chain will move from being a DAG to a linear chain. Here we have the option of launching our own AVM chain for asset transfers, or using the X-chain directly, which would make all of the game’s NFTs directly available to the wider Avalanche community (NFT reuse in games is an under-explored area). The AVM supports ANTs, or Avalanche Native Tokens, which can easily be imported/exported across the majority of supported VMs, as it defines a unified API for cross-chain atomic swaps.

PoE is a free-to-play game that monetizes itself via cosmetic-only, user-purchasable content. This can be easily supported via the AVM chain as well. Simply put: an NFT in the user wallet ”unlocks” these assets to be delivered via the content delivery mechanism. This is essentially VM2VM communication, as it is desirable and quite probable that the X-chain will support account lookups via this mechanism.

Databases with SQLVM

As mentioned before, as with any modern application, the game needs to store global relational data: for example, loot tables, league-specific information, game metrics, user metrics, NFT market data, etc. The list goes on and on. For this specific use case, many web3 projects currently use The Graph: a sophisticated but complex decentralized solution. A few issues arise with this approach:

  • Your economy has to compete with external, global, economies to make the service persistently available.
  • The Graph only indexes preexisting block data. It is not actually a form of storage.

Because of these, we propose a new type of VM we dub SQLVM, and this will be the topic of our next article. In a nutshell, you should think of it as a hybrid between an app-specific indexer and a persistent relational data store.

It allows for specific types of transactions that query/write to a globally replicated ACID relational database. Here we automatically benefit from the fact that blockchain transactions are atomic at the consensus level, which makes designing the underlying database much simpler. For example, a suitable design can be done for a VM where the runtime state is an instance of any query engine: row-oriented like Postgres, column-oriented like BigTable, or document-based like MongoDB. Keep in mind that even this is overkill, as we don't need their replication features. What we need is their query engine and storage solution. Most of these databases have sophisticated query planners that can take the place of fee estimators. The beauty here is that Avalanche will take care of keeping this database eventually consistent, which suffices for our use case. More sophisticated designs are certainly possible. The job of the VM here is essentially to declare the types of transactions (writes/reads) and the fees, and to verify blocks by applying the transactions to the database and updating certain database hashes (these will be needed for anti-cheat below). For our game - or any other app chain using this backend - other VMs in the subnet should be able to read the database at will, which can easily be done with VM2VM.

Similar to how a non-validating Avalanche node has access to the mainnet state, a game client could be a node of this chain running in non-validating mode, so as to keep this database state at all times for easy synchronization.
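
Since the SQLVM is only a proposal at this point, here is one way its transaction surface could be sketched: transactions declare reads or writes, and block verification applies the writes and updates a database hash. All of the names below are hypothetical:

```typescript
// Hypothetical sketch of an SQLVM transaction surface; every name here is ours, not an existing API.
type SqlWriteTx = { kind: "write"; statement: string; params: unknown[]; maxFee: bigint };
type SqlReadTx = { kind: "read"; query: string; params: unknown[]; maxFee: bigint };

interface SqlVmBlock {
  height: number;
  writes: SqlWriteTx[];
  stateHash: string; // hash over the database state after applying this block's writes
}

interface SqlVm {
  // Verify a block by applying its writes to the underlying query engine and recomputing the hash.
  applyBlock(block: SqlVmBlock): Promise<string>;
  // Reads can be served by any node tracking the chain; eventual consistency suffices here.
  execute(read: SqlReadTx): Promise<unknown[]>;
}
```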

ZK anti-cheat with ZKVM

Now to the technologically most innovative piece of the puzzle: to run anti-cheat as a ZK verifier. This is such a breakthrough technology that it would be an improvement over existing anti-cheat technology on centralized games.

Anti-cheat works, as mentioned before, as lock-step game simulation. What this means is that the game client is essentially an input system and a rendering engine for a game that is actually run remotely on servers. This introduces latency, which is the reason why game server farms have to be deployed across internet “regions”. ZK changes the game, as it allows one to codify all game state transitions in a prover which we can run on the game client (remember that gamers tend to play on machines that are quite powerful) while the server is just a verifier! This has the added benefit that it even liberates the server from having to run in lock-step to begin with! Essentially we can use eventual consistency to catch the cheaters. Put differently, we don’t care to verify every little state transition that happened, but batches (or recursions) encoding all transitions that happened in a configurable time window: 1 second, 10 seconds, a minute, an hour…
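
In code terms, the server's job shrinks to checking one succinct proof per batch of client-side transitions instead of re-simulating the game. A minimal sketch, treating the prover and verifier as black boxes (hypothetical interfaces, no particular proof system assumed):

```typescript
// Hypothetical sketch of batched proof verification for anti-cheat; the interfaces are assumptions.
interface TransitionBatch {
  player: string;
  startStateRoot: string; // committed game state at the start of the window
  endStateRoot: string;   // claimed game state at the end of the window
  proof: Uint8Array;      // succinct proof that endStateRoot follows from startStateRoot
}

interface Verifier {
  verify(startRoot: string, endRoot: string, proof: Uint8Array): Promise<boolean>;
}

async function checkBatch(verifier: Verifier, batch: TransitionBatch): Promise<boolean> {
  const ok = await verifier.verify(batch.startStateRoot, batch.endStateRoot, batch.proof);
  if (!ok) {
    // The claimed transitions do not follow the game rules: flag (or penalize) the player.
    console.warn(`cheat detected for ${batch.player}`);
  }
  return ok;
}
```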

It is obvious what a powerful idea this is: no need to simulate full-blown games on the server. For example, we now can use more sophisticated AIs in the client. The fact that you have to run the game on the server is one of the reasons no modern AI is in use in games. Why not use GPT-4 for creating procedural quests??

We will have more to say about a ZKVM in a future article, but I would like to state a few things. Firstly, note that we are not even using the zero-knowledge aspect of this VM, and this gives more freedom in the exact construction of the protocol. In precise terms, we are interested in SNARKs, not necessarily zk-SNARKs. Nonetheless, we expect that applications that use this zero-knowledge aspect will also exist.

Secondly, we might not be at the stage yet where fast enough provers exist to prove the state transition for a game like PoE. I'm not an expert, but I expect that schemes leveraging the GPUs in the gamer's clients will be just a matter of time.

And finally, we are talking about a very specific VM - that of the game - and not a generic programmable one like the EVM. We need a prover for those exact transitions that happen in game. This is potentially another route for optimization.

Conclusion

We hope to have convinced you that the future of decentralized gaming and player-owned gaming is bright. When Vitalik joined the crypto movement I don't think he thought his dream would come true on another chain, but I think he will be satisfied nonetheless.

But more importantly, we hope the reader is also convinced that this is only possible in a clean, elegant, and reusable way via the subnet architecture. Sophisticated applications like this will only flourish when good reusable VMs are available, much like reusable contracts are right now. Multiple VMs demand multiple chains in a subnet architecture. Although it is technically possible to cram all of these backends into a single block to be serialized/deserialized and verified using a single chain, this would not only hurt code reuse but would also be impractical, since it is clear that these backends might need different block times.

Of course, there are a lot of unknowns to this as I am not a game developer. I just want this to jump-start the imagination of developers in general (not only game developers) to the reality that the future is app-specific multi-chain subnets. And so that someone develops an unstoppable ARPG like Path of Exile!!

Tune in for some follow-up articles where we attempt to detail the SQLVM and ZKVM further, and come talk to us at the summit. See you there!

About Chorus One

Chorus One is one of the biggest institutional staking providers globally, running infrastructure and validating over 40 blockchain networks. Since 2018, we have been at the forefront of the PoS industry and now offer enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge projects through Chorus Ventures. We also invest in subnets on Avalanche so if you’re building something interesting, reach out to us at ventures@chorus.one.

News
Networks
Chorus One announces staking support for Onomy
Chorus One is proud to announce staking support for Onomy Protocol, an on-chain fintech hub for DeFi.
April 6, 2023
5 min read

We’re very excited to announce that Chorus One is live on the Onomy Network! 

Onomy Protocol is pioneering a harmonious connection between traditional financial markets and the DeFi landscape - two worlds that have remained largely disjointed - by creating a vertically-integrated ecosystem that emulates the familiarity of centralised exchanges but retains the decentralised ethos of Web3. Onomy will be presented to end-users in a digestible, retail-friendly ‘fintech shell’ whose backroom engine smoothens the transition from CeFi to DeFi for retail and institutions alike.

Leveraging a Cosmos-based layer-1, a hybrid DEX, bridge hub, stablecoin issuance protocol, and additional contributions built on the ecosystem, Onomy is creating the perfect conditions for Forex markets to thrive on-chain. 

Introducing Onomy: An On-Chain Fintech Hub for DeFi

Onomy Network (ONET): A Fast and Secure Proof-of-Stake Blockchain

The Onomy Network is a Proof-of-Stake blockchain constructed using the Cosmos SDK framework, which enables it to achieve scalability by leveraging infrastructure supported by a network of institutional validators, like Chorus One.

With a block time of just five seconds, high throughput, low latency, and low fees, the Onomy Network is built to be ideal for financial transactions.

Onomy Exchange (ONEX)

Supporting various order types, including limit, market, conditional, and stop-loss orders, the Onomy Exchange (ONEX) stands out as a unique hybrid, multi-chain decentralised exchange (DEX) on which traders can buy and sell cross-chain through an order book with no trading fees incurred, whilst liquidity providers can get involved and earn rewards from the AMM running in the back-end. 

This empowers users to trade both crypto and Forex pairs effortlessly while also offering cross-chain trading, advanced charting, and more. 

“The Hybrid DEX combines the importance and familiarity of order books while retaining the flexibility and security of AMMs.” - Lalo Bazzi, Co-founder, Onomy Protocol

Essentially, ONEX aims to provide a high-volume trading experience similar to that of traditional centralised exchanges (CEX), but in a decentralised and non-custodial manner on the blockchain. 

Arc Bridge Hub

The Network powers the Onomy Arc Bridge Hub, a cross-chain transfer solution that integrates inter-blockchain communication (IBC) and allows users to easily traverse between prominent blockchains both within and beyond the Cosmos ecosystem, such as Near, Avalanche, Polygon, Ethereum, Neon, etc. Additionally, the Arc Bridge solves the issue of approving multiple cross-bridge transactions by reducing it to a single approval, making the user experience significantly simpler. 

Onomy Reserve (ORES)

Onomy Reserve (ORES) is the linchpin of the ecosystem and the fundamental driver behind Onomy’s core long term mission. A decentralised reserve bank, the ORES will provide on-chain minting of stablecoins, or denominations (Denoms) of fiat currencies. 

The goal is to create a trusted, decentralised system through which national currencies can be exchanged at speed on-chain, with broader integration into the wider DeFi ecosystem and the advantages and efficiencies that composable finance brings to this titanic, $7-trillion-per-day market. The ORES will function as a gateway for liquidity across all integrated blockchains and will support multiple national currencies, with the native $NOM coin playing a key role.

$NOM Utility

$NOM is Onomy’s native network and governance token. It’s used by validators (like Chorus One) and their delegators to secure the proof-of-stake blockchain, but also to cover transaction fees, and vote on governance proposals in the Onomy DAO which manages the on-chain treasury with no centralised control. $NOM will have a key role to play in the Onomy Reserve as highlighted in the Onomy Improvement Proposals, with additional utility to be voted on by the DAO. 

Onomy, Forex, and the New Economy

For crypto’s next great wave of adoption to occur, access to crypto needs to be easier, faster, and more intuitive - while also continuing to lay the scaffolding for a decentralised financial system that works entirely on-chain. Onomy is that convergence point. 

Powered by a strong team of crypto natives and backed by prominent crypto investors including Chorus One, Bitfinex, UDHC, GSR, DWF Labs, CMS Holdings LLC, and more, Onomy offers new possibilities for on-chain FX markets and broadens access to DeFi for individual and institutional investors.

$NOM is already live for trading on Kucoin, Bitfinex, Gate.io and MEXC.

Onomy will unlock DeFi for the masses, and Chorus One is thrilled to be part of the journey. 

Staking $NOM with Chorus One

Current Inflation Rate: approximately 90% 

Current Staking APR: approximately 114%

Staking $NOM with Chorus One is straightforward. Simply hold native $NOM on Cosmostation, Keplr or Leap, connect your wallet to the Onomy SuperApp, and stake $NOM with Chorus One.
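
For programmatic delegation, a CosmJS sketch along the following lines should also work. The RPC endpoint, the validator address, the `onomy` address prefix, and the `anom` base denom are placeholders to verify against Onomy's documentation before use:

```typescript
import { DirectSecp256k1HdWallet } from "@cosmjs/proto-signing";
import { GasPrice, SigningStargateClient } from "@cosmjs/stargate";

// Illustrative only: the RPC endpoint, address prefix, base denom and validator address
// are placeholders; confirm them against Onomy's documentation before use.
async function delegateNom(mnemonic: string, amountAnom: string) {
  const wallet = await DirectSecp256k1HdWallet.fromMnemonic(mnemonic, { prefix: "onomy" });
  const [account] = await wallet.getAccounts();

  const client = await SigningStargateClient.connectWithSigner(
    "https://rpc.onomy.example.com",                 // placeholder RPC endpoint
    wallet,
    { gasPrice: GasPrice.fromString("0.025anom") },  // assumed gas price and denom
  );

  const validator = "onomyvaloper1...";              // placeholder: Chorus One validator address
  const result = await client.delegateTokens(
    account.address,
    validator,
    { denom: "anom", amount: amountAnom },
    "auto",
    "Staking NOM with Chorus One",
  );
  console.log(result.transactionHash);
}
```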

For any other questions, reach out to staking@chorus.one.

About Chorus One

Chorus One is one of the biggest institutional staking providers globally, operating infrastructure for 40+ Proof-of-Stake networks including Ethereum, Cosmos, Solana, Avalanche, and Near, amongst others. Since 2018, we have been at the forefront of the PoS industry and now offer easy enterprise-grade staking solutions, industry-leading research, and also invest in some of the most cutting-edge protocols through Chorus Ventures. We are a team of over 50 passionate individuals spread throughout the globe who believe in the transformative power of blockchain technology.

Networks
Core Research
Cosmos ticks all the boxes in building the ultimate modular blockchain
We evaluate why Cosmos is the best solution for building a modular blockchain.
March 19, 2023
5 min read

Introduction

Cosmos is steadily becoming the place to create the ultimate modular blockchain. Cosmos SDK allows developers to effortlessly roll out tailored blockchains, resulting in a flood of new projects that provide specialized settings for novel products. The goal of modular blockchains is to divide Execution, Settlement, Consensus, and Data Availability. Refer to page 19 of this report to learn more about modular vs. monolithic blockchain designs (Ethereum). As a result, we see various teams tackling the issues of each layer and creating optimal solutions and developer environments. Ultimately, developers could use these optimizations to create an application that is highly performant using such an ultimate modular blockchain. Not to mention the greater decentralization that comes with spreading your product across numerous ecosystems.

Let’s go over the problems that current ecosystems face in each layer of the modular stack, and how various quality teams are solving these issues. Please bear in mind that there are other teams solving these issues too; we are just exploring some of them.

Issues with Data Availability

It is important to explain that when a block is appended to the blockchain, each block contains a header and all the transaction data. Full nodes download and verify both, whilst light clients only download the header to optimize for speed and scalability.

Full nodes (validators) cannot be deceived because they download and validate the header as well as all transaction data, whereas light clients only download the block header and presume valid transactions (optimistic). If a block includes malicious transactions, light clients depend on full nodes to give them a fraud proof. This is because light nodes verify blocks against consensus rules, but not against transaction validity proofs. This means that a 51% attack where consensus is altered can easily trick light nodes. As node runners scale, secure methods to operate light clients would be preferable because of their reduced operational costs. If nodes are cheaper to run, decentralization also becomes easier to achieve.

The DA problem refers to how nodes can be certain that when a new block is generated, all of the data in that block has truly been published to the network. The problem is that if a block producer does not disclose all of the data in a block, no one will be able to determine whether a malicious transaction is concealed within that block. What is required is a reliable source of truth as a data layer, one that orders transactions as they come in and checks their history. This is what Celestia does, solely optimizing the Consensus and the DA layer. This entails that Celestia is only responsible for ordering transactions and guaranteeing their data availability; this is similar to reducing consensus to atomic broadcast. This is the reason why Celestia was originally called ‘Lazy Ledger’; however, efficiently performing this job for a future with thousands of applications is no easy task. Celestia can also take care of consensus. See the different types of nodes in Celestia here.

Two key features of Celestia’s DA layer are data availability sampling (DAS) and Namespaced Merkle trees (NMTs). Both are innovative blockchain scalability solutions: DAS allows light nodes to validate data availability without downloading a complete block; NMTs allow Celestia’s execution and settlement layers to download only the transactions that are meaningful to them. In a nutshell, Celestia allows light nodes to verify just a small set of data that, when combined with the work of other light nodes, provides a high-security guarantee that the transactions are valid. Hence, Celestia assumes that there is a minimum number of light nodes sampling the data availability layer.

“This assumption is necessary so that a full node can reconstruct an entire block from the portions of data light nodes sampled and stored.”
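
The security of sampling comes from simple probability: if a block producer withholds a fraction of the erasure-coded shares, every random sample has that chance of hitting a missing share, so the probability of the withholding going unnoticed drops exponentially with the number of samples. A quick sketch of the arithmetic (illustrative numbers, not Celestia's exact parameters):

```typescript
// Probability that a light node misses withheld data after `samples` independent random samples,
// when a fraction `withheld` of the erasure-coded shares is unavailable (illustrative arithmetic).
function missProbability(withheld: number, samples: number): number {
  return Math.pow(1 - withheld, samples);
}

// Thanks to erasure coding, hiding even one transaction requires withholding a large fraction
// of shares, so a modest number of samples already gives high confidence.
console.log(missProbability(0.25, 20)); // ≈ 0.003, i.e. ~0.3% chance of missing it
```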

It is worth noting for later that these layers (DA & Consensus) are naturally decentralized and easier to have fully on-chain, as most of the work is taken on by the validators. Scaling here will ultimately depend on the consensus algorithm. ‘Rollapp’ developers will not need to assemble a validator set for their applications either.

Issues with Execution & Settlement layers

  • Execution refers to the computation needed for executing transactions that change the state machine accurately.
  • Settlement involves creating an environment in which execution levels can check evidence, settle fraud claims, and communicate with other execution layers.

The present web3 environment suffers from centralization in the execution and settlement layers. This is due to the fact that the on-chain tech stack severely limits an application’s functional capability. As a result, developers are forced to perform heavy computation off-chain, in a centralized manner. On-chain apps are not inherently interoperable with external systems, and they are also constrained by a particular blockchain’s storage and processing capability.

More than just a distributed blockchain database is required to create the ultimate decentralized apps. High-performance processing, data IO from/to IPFS, links to various blockchains, managed databases, and interaction with various Web2 and Web3 services are all common requirements for your application. Additionally, different types of applications require different types of execution environments that can optimize for their needs.

Blockless — Facilitating custom execution

Blockless can take advantage of Celestia’s data availability and focus on improving application development around the execution layer. Blockless provides a p2p execution framework for creating decentralized serverless apps. By offloading operations from the L1 to the performant, configurable execution layer offered by Blockless, dApps are not limited by on-chain capacity and throughput. With Blockless you can transfer intensive processing from a centralized cloud service platform or a blockchain to the Blockless decentralized node network using built-in functions. With the Blockless SDK, you can access any Web2 and Web3 application, as it currently supports IPFS, AWS S3, Ethereum, BNB Chain, and Cosmos.

Developers using Blockless will only need to provide the serverless functions they want to implement (in any language!), as well as a manifest file that specifies the minimum number of nodes required, hardware specifications, geolocation, and node layout. In no time, their services will be operating with ultra-high uptime and hands-free horizontal scaling. To learn more about the architecture of the Blockless network go here; yet again, its orchestration chain is a Cosmos-based blockchain responsible for function/app registration. The cherry on the cake is that you can use, incorporate, or sell community functions and extensions in your own application design in a plug-and-play manner using the Blockless Marketplace. In Cosmos, you can already do this through projects like Archway or Abstract.

SAGA — Rollups as a service and Settlement optimization

Popular L2s and Rollups today like Arbitrum, Optimism, and StarkNet use Ethereum for data availability and rely on single sequencers to execute their transactions. Such single sequencers are able to perform fast when submitting to Ethereum but evidently stand as a centralized point of failure. Saga has partnered with Celestia to provide roll-ups as a service to enable a decentralized sequencer set.

“Saga’s original design is meant to provide critical infrastructure to the appchain vision, where the Saga protocol abstracts away the creation of a blockchain by leveraging IBC.”

Saga provides easy-to-deploy “chainlets” for any developer to roll out an application without having to care about L1 developments. Although their main focus is to support full appchain formation on top of the Saga Mainnet, the technology can also support the modular thesis. This means that rollup developers can use Saga’s validators to act as sequencers and decentralize their set. In other words, Saga validators can also work in shifts submitting new blocks for Celestia rollups.

https://sagaxyz.cdn.prismic.io/sagaxyz/08e727f2-88a2-4c95-ad17-b0b9579d2b69_saga-litepaper-march-2022.pdf

Saga offers a service that organizes validators into sequencers and punishes misconduct through shared security. Saga’s technology provides functionality to detect invalid block production with fraud proofs and to handle censorship or inactivity: challenges can be issued to force a set of transactions to be processed. This means that Saga can enhance the settlement layer whilst using Celestia for the data needed to generate fraud proofs and censorship challenges. This could even be done for Ethereum, with the additional benefit of having shared security between chainlets and IBC out of the box. To further understand the difference between running a rollup or a chainlet, please refer to this fantastic article.

Conclusion

In such a modular world, developers finally have full customization power. One could choose to build a sovereign rollup or a settlement rollup, or even a hybrid. In our example, it could even be possible to use Saga’s consensus instead of Celestia’s. Referring to our example, we could have an application that decentralizes its execution computing through Blockless whilst programming in any language, decentralizes its sequencer set with Saga and can deploy unlimited chainlets if more block space is required, and has a reliable and decentralized data availability layer with Celestia. Best of all, all these layers are built and optimized as Cosmos SDK chains, meaning they will have out-of-the-box compatibility with IBC and the shared security of chainlets.
