Why Web3 Needs Urbit
Part 1 of our deep-dive on Urbit
September 1, 2022
5 min read

Urbit has gained some renown among crypto enthusiasts in recent years as an ambitious and compelling use case of NFTs to power a novel computing system and network. The technical stack that Urbit has developed is impressive and far-reaching, but some criticize its perceived opacity and lack of a precise use-case. If your first impression of Urbit came from a deep-dive into the intricacies of the OS, network, and identity system, you might be left wondering what Urbit’s specific use case even is. Is there a problem Urbit is trying to solve, or is it all just a severe case of NIH syndrome?

The reality is that Urbit does solve a problem: one complex enough that it isn’t obvious to most people, yet deep and pernicious enough that it affects everyone using the internet. A rudimentary understanding of Urbit’s problem space can be gained from this tweet from Philip Monk, CTO of Tlon, the primary company driving Urbit development. Urbit is a solution to deep technical limitations of the internet that prevent it from being used the way it should be: as a permissionless peer-to-peer network that gives freedom and responsibility to its users.

If this explanation feels under-explored, read on for a deep dive into the core value proposition from Urbit to users and developers alike. But before we begin, we should clarify a basic philosophical understanding of Web3.

Decentralization vs. Ownership

“Decentralization” is a commonly used buzzword in Web3 and elsewhere, with much said about new companies whose core product is decentralizing some aspect of the digital experience. Because of the enormous financial success of Bitcoin and other crypto technologies, a case can be made that merely decentralizing a product is a sufficient advantage that consumers will flock to it. But this is a poor understanding of what consumers value in crypto, and thereby a flawed approach to Web3’s path to victory.

Bitcoin was, of course, not the first attempt at a cryptographic digital currency. E-Cash and Bit Gold preceded it in this domain, each using cryptographic precursors to blockchains in pursuit of permissionless digital payments. What made Bitcoin more successful than its predecessors is not solely that it was more decentralized (although in some respects it was), but that it was much more secure. The combination of decentralization and security gave Bitcoin holders ownership they could rely on, and that is what made it a successful product.

Decentralization is best understood as a special case of ownership, where trusted third parties in central control of a product reduce the user’s intuition that they own the product they use. Merely decentralizing a component of a product does not necessarily compel an end-user to use it, but to some degree, every end-user wants to own their tools if they can.

That’s all to say that Web3’s critics are correct that decentralization itself is not a product. However, decentralization can be a critical component of ownership, and ownership is a critical component of what makes Urbit a compelling product to end-users. Urbit is decentralized, but not for decentralization’s sake. Urbit is “yours forever” and that requires it to have many attributes, including permanence, security, and of course, decentralization.

The Value Proposition of Web3

The story and namesake of “Web3” is perhaps best summarized by this article on Ethereum’s website, which goes through the stages of the internet’s development and shows how a new, blockchain-powered paradigm can shift the balance of power and take ownership from giant tech corporations and give it back to users.

As is well understood by visionaries of a decentralized web, the internet of the early 1990s was idealized as a permissionless space in which everyone had a voice and could make their own mark on the world by learning and using a set of open protocols that did not discriminate on who could operate them. The early internet was a pluralistic “Wild West” of custom-built websites and services, and while the distribution of activity was anything but equal, there was little resembling a monopoly on most use cases. Idealists saw this web as the beginning of a new flowering of culture and technology, in which mass media would become obsolete next to an open field where undiscovered talent could win hearts and minds on its own merits.

As the internet’s ecosystem developed, the idealists only partially got their wish. The internet did become a phenomenal landscape for small contributors to make a big impact, but only under the patronage of monolithic platforms. Somewhere along the way, the expectation that users would actually own their means of communication was subverted. As it turned out, running infrastructure and operating servers is boring and hard. End-users needed powerful platforms to hide the complexity of the tech stack, and were willing to give up ownership in exchange for an approachable user interface.

Detractors and sympathizers alike refer to the early, pluralistic internet as “Web1” and the modern, centralized internet as “Web2”. In accordance with this scheme, the hypothesized successor paradigm of the Internet is called “Web3”.

Proponents of Web3 see in blockchain technology an opportunity for a new phase of development that corrects this flaw by taking the responsibilities of Web2 infrastructure and offloading them to consensus networks that are owned by everyone and no-one. Rather than private infrastructure managed by giant corporations, web services can use public infrastructure managed by the community, and the power structure of the internet can thereby resemble the same fair and open field that the Web1 idealists envisioned, while offering an equal or better user experience to Web2.

The Advantages and Limitations of Blockchains

Blockchains are a promising technology for secure digital ownership because they provide one immeasurably valuable feature to their users: trustless consensus on data. By nature, applications must rely on a single source of truth for a dataset in order to be sensible to the developer and the user. To obviate the need for a trusted third party to secure and manage this data, consensus must be reached across a network on what is true. This problem is best summarized by the famous Byzantine generals problem, to which blockchains offer a reasonable solution.
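To see why such consensus data is tamper-evident, consider the following toy hash-linked chain. This is our own illustrative sketch, not any production blockchain: each block commits to the hash of its predecessor, so rewriting history breaks every later link.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    """Link a new block to the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def verify(chain):
    """Any edit to a non-tip block breaks the link that follows it.
    (The tip itself is pinned by network consensus on the head hash.)"""
    for i in range(1, len(chain)):
        if chain[i]["prev"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")
assert verify(chain)

chain[0]["data"] = "alice pays mallory 500"  # tamper with history
assert not verify(chain)
```

Once many parties hold copies of the chain and agree on its head, no single participant can quietly rewrite an earlier entry, which is the property the Byzantine generals problem demands.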

Blockchains also offer another potential way to revolutionize software by giving developers the ability to create new, scarce assets ex nihilo. By allowing investors to speculate on these spawned assets, free and open-source software finds a new financial model in which code can be given away to the community without developers being left with nothing to show for their contributions. Given the scope of work required to make systems that are sensible to everyday users, this advantage is truly invaluable.

However, the aforementioned Byzantine fault tolerance comes at a cost in blockchains. Consensus over a network offers a better assurance of ownership to users, but duplicates work that, in the centralized case, only needs to be performed once. The inevitable tradeoff between ownership and efficiency in blockchain networks is best summarized by Vitalik Buterin’s scalability trilemma, which shows that decentralization and security, the two most valuable properties of blockchains, are fundamentally at odds with a third property that powerful systems seek to maximize: scalability.

Solutions exist which extend blockchain capabilities in all three domains, so the trilemma is not completely binding. But to the degree that the trilemma remains unsolved, scalability constraints manifest as gas fees, which make it costly to write transactions to any chain that is uncompromising on secure decentralization. Costly writes are an anti-feature that makes it difficult to excite end-users, and so this limitation threatens the ability of blockchains to displace monopolies powered by Web2 infrastructure.

Privacy and latency are also notable challenges in a blockchain environment. Infrastructure that, by default, gives read access to everyone and only adds new data at set intervals is limiting for many applications that are expected to be responsive and permissioned. Like the scalability problem, these problems have prospective solutions, but they still represent technical hurdles for developers to grapple with, hurdles that Web2 solutions can simply centralize away. There are several other obstacles of this type that would deserve exploration in a deeper dive.

These limitations to blockchain-based infrastructure have, to some degree, already been explored in other places, and may one day each find satisfactory solutions. But one under-explored limitation is the repeated reliance of Web3 applications on trust in order to access blockchain data. This isn’t even necessarily a hard limitation in blockchains as a tool, but can be observed as a pattern in the industry.

Uniswap, for example, is served from a specific domain name, and consumers implicitly trust that domain name with their tokens. MetaMask is a ubiquitous non-custodial Ethereum wallet, but uses hard-coded proprietary endpoints to access on-chain data. OpenSea, despite its name, does not even claim to be permissionless — it’s explicitly a custodial service with administrators to intervene if something goes wrong.

These hallmarks of the Web3 ecosystem are all fueled by a cultural environment that eschews centralization and prioritizes ownership, and yet find themselves making similar compromises to Web2 companies that promise to democratize the ability for people to express themselves. Rather than creating a system that is thoroughly trustless, some trust is inserted into the equation in order to iron out the difficulties of operating permissionless systems, whether blockchain networks or other peer-to-peer protocols.

In Web3, as in Web2, complexity is hidden from the user by an interface that achieves human-comprehensibility by offloading user choice to the provider. There are many exceptions, just as in the Web2 era there were alternatives to centralized services that could be used but were not mainstream. But there is a reason why the options that compromise on user ownership tend to win in this environment, and it is clear that the reason is not lack of access to blockchains as a tool.

It’s The Servers, Stupid

Despite countless efforts to make user-owned applications and networks rely solely on peers, the role of always-on nodes is hard to discount in any solution. Systems that give primacy to peers still run nodes to pick up the slack caused by the intermittency of peers. In the Web3 world, offloading all node work to blockchains manifests in the cost of writes and the need to obfuscate the gas expense. The need for servers did not go away with blockchains; it only manifested in new ways. Knowing this, the question of a user-owned internet returns to its old form: how can we create a world where each user runs a node?

The underlying need for user-owned servers is not breaking news to those familiar with the history of the internet. In the idealistic days of the early web, user-owned servers were simply a given — as applications became easier to use, always-connected services would follow suit, and the internet of the future would be a patchwork of independent personal servers hosting whichever services were important to the user. In this way, the developments of both Web2 and Web3 technologies can be seen as an adjustment made in response to the failure of personal servers to thrive in the consumer market.

This creates an ugly combination of two hard facts that are discouraging to reconcile, yet both so powerfully true that all previous attempts to engineer around them have failed:
  1. You cannot have a user-owned internet without personal servers.
  2. Consumers do not want personal servers.

We have made a case for why blockchains cannot provide an answer to the former problem. But do blockchains, and other advancements in computing, have anything to say about the latter?

The Marketplace of User-Owned Infrastructure

One interesting case to consider in the landscape of user-owned servers is the omnipresence of personal routers. A router has much in common with a server from a consumer’s point of view: it’s a black box that sits somewhere out of the way. It needs to always be powered on and connected to a network, and you will find out quickly if it’s been unplugged. If it malfunctions, getting it working again is a top priority. In order to do your business, you have to connect to it — what exactly it’s doing is not always clear to the end-user, but that it’s important is well-understood.

The personal router is a successful consumer product because it satisfies three basic conditions:
  • It’s valuable. It is needed in order to access a network, and that network has value.
  • It’s low maintenance. A reboot is required once or twice a year at most.
  • It’s opaque. A user does not need to understand it in order to use it proficiently.

Find a personal server that meets all three of these conditions, and we can begin to imagine a new computing paradigm. In practice, Unix servers typically fail on all three, and where they succeed in one domain, they typically compromise on at least one of the others.

A general study of successful consumer products is also helpful in understanding how and why personal servers failed in the market. This article by Lane Rettig makes a concise case for the viability of tools in the marketplace:

“What the tools we rely on the most heavily have in common is that they’re all simple, durable, and ours.” ~ Lane Rettig

While personal routers do not always satisfy the property of ownership, one can see how their value proposition fits neatly into this model. Unix servers, on the other hand, have only ownership to offer. While they are arguably simple from a highly technical point of view, none of that simplicity is legible to the non-technical user. And while professionals who rely on Unix servers do not question their durability, non-professionals are almost universally unable to reproduce that experience.

But why single out Unix? The answer is that there is not much else to offer consumers in the way of personal servers. Other solutions exist or have existed, but mostly in the business domain, and mostly targeted at professionals. Servers targeted at tinkerers and privacy advocates have seen some success, but even in that market, Unix is almost always the backbone of the software stack. This may shed significant light on the failure of personal servers in the marketplace: no fully capable operating system has been built with the personal-server use case in mind, except for various implementations of Unix. And Unix, of course, was never designed for everyday consumers.

Enter Urbit

Urbit is a novel software stack, with its own OS, network, and identity system, built ex nihilo from elementary primitives. The OS, as the centerpiece of the system, aims to fulfill the use case of a personal server that is simple, durable, and yours. Urbit draws on several theoretical advancements in software engineering to achieve this outcome, most notably determinism, referential transparency, and cryptography.

At the risk of oversimplifying, these advancements can be said to roughly map to the qualities needed to meet the expectations of consumers:
  • Determinism makes Urbit simple. It is always computing one function, no matter the case.
  • Referential transparency makes Urbit durable. All erroneous behavior can and should be tracked down and eliminated by its developers.
  • Cryptography makes Urbit yours. Your ownership of your Urbit is as secure as your private key, no more and no less.
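To make the determinism claim concrete, a personal server of this kind can be pictured as a pure transition function folded over an event log. The sketch below is our own Python illustration, not Urbit’s actual Nock/Hoon implementation; the point is only that when state is a pure function of the log, replaying the log always reproduces the state.

```python
# A toy event-log state machine (illustration only, not Urbit's real code).

def transition(state, event):
    """Pure transition: the next state depends only on (state, event)."""
    name, payload = event
    new_state = dict(state)
    if name == "set":
        key, value = payload
        new_state[key] = value
    elif name == "delete":
        new_state.pop(payload, None)
    return new_state

def replay(log):
    """The whole server is a fold of the transition over its event log:
    replaying the same log always reproduces the same state."""
    state = {}
    for event in log:
        state = transition(state, event)
    return state

log = [("set", ("name", "~zod")), ("set", ("peers", 3)), ("delete", "peers")]
assert replay(log) == replay(log)      # determinism: same log, same state
assert replay(log) == {"name": "~zod"}
```

This is also why erroneous behavior is tractable for developers: any bug can be reproduced exactly by replaying the log that triggered it.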

Much remains to be said about the innovations made to create a general purpose server that feels more like a mechanical clock than a fighter jet cockpit, and a deep dive into Urbit’s architecture is recommended to engineers who want to understand the system at more than a superficial level. But for our purposes, it’s also worth taking a glance at our earlier example of the personal router to examine how Urbit compares.

Urbit is as valuable as the personal router. The end-user’s access to the internet is mediated by their router, and the internet is an invaluable ecosystem of force-multiplying services. The Urbit network, similarly, can fulfill the same potential. By adding powerful primitives and a unified back-end to the protocol by which individual Urbit nodes communicate, Urbit’s network promises to lay the foundation for networked applications that can compete with, and even exceed, the services provided on the modern internet.

Urbit is as low maintenance as the personal router. It is designed to never reach an unrecoverable state, and even reboots should never be necessary. The commitment to minimalism and determinism at every turn has paid dividends for Urbit’s developers, and while it cannot be called “zero maintenance” yet, the path to that milestone today yields more known unknowns than unknown unknowns.

Urbit is as opaque as the personal router. The underlying architecture never shows itself to the end-user. To the degree that it has an interface, that interface is a friendly webpage resembling the home screen of a mobile OS. Developers can fork its code or play with the internals however they please, but should never need to look at a terminal to use it or its applications. Just as with the router, a connection needs to be established so that services can be made available, and this intuition is all the end-user needs to proficiently use their Urbit.

While serving primarily as a gateway into Urbit’s network, an Urbit server can do much more than merely route packets. As a general-purpose computer on a peer-to-peer network, Urbit can act as a much-needed backbone to user-owned applications that demand nothing more than code from developers. The guarantees of Urbit’s networking primitives, combined with the assumption that all peers run nodes, makes it possible to deliver cutting edge social applications consisting of only two elements: a protocol and an interface. This leads to limitless possibilities for developers, who previously needed to duplicate massive amounts of work and run their own servers in order to deliver software that satisfies users.

Urbit also benefits both users and developers by consolidating data where it belongs: in a unified environment that the user owns. Developers need not assume the liability of user data residing on their own infrastructure, and users need not trust developers with their private information. And when creating integrations between services, there is no chasm of APIs and terms of service to bridge: all of the user’s data is in the same place, speaking the same language. The only barrier between two services is the user’s permission to share data between them.

Prior examples show that this level of added value is necessary to put ownership in the hands of users: a sensible, lightweight product that asks no compromises in terms of UX, while giving full control to the owner.

Web3 Revisited

While it is difficult to overstate the centrality of personal servers to the problems Web3 aims to remediate, there remains a need for applications that interface between end-users and blockchains. Beyond that, a growing industry is responsible for developing Web3 middleware, both between different blockchains and between a given blockchain and the real world. Urbit offers solutions in both of these domains through the bleeding-edge field of blockchain development on Urbit.

Some aspects of Urbit’s natural affinity for blockchains are already well-understood. Azimuth, for example, which serves as Urbit’s identity system and PKI, is implemented as a Solidity contract on Ethereum. Furthermore, the basic problem of association between names and public keys can be considered solved on Urbit, as name-key associations are already an assumed part of the system, and already integrated into Urbit’s Bitcoin application. Already on Urbit, you can natively send and receive BTC with other Urbit users with no need to keep records of their addresses.
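The name-to-key problem described above can be pictured as a simple registry lookup. The sketch below is purely illustrative, with made-up names and placeholder addresses; on the live network, Azimuth, not a local table, is the authority for these bindings.

```python
# Hypothetical registry for illustration only: on the real network, Azimuth
# (a contract on Ethereum) is the source of truth for name-to-key bindings,
# and the addresses below are placeholders, not real ones.
registry = {
    "~zod": "btc-address-of-zod",
    "~sampel-palnet": "btc-address-of-sampel",
}

def resolve(name):
    """Pay a peer by name; the address lookup is handled by the system."""
    try:
        return registry[name]
    except KeyError:
        raise ValueError(f"unknown ship: {name}")

assert resolve("~zod") == "btc-address-of-zod"
```

The user experience this enables is the one described above: you address a payment to a name you already know, and the system resolves the rest.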

Other faults in Web3 are addressed by the mere lack of any need to compromise on user-owned architecture. dApps on Urbit, for example, are truly dApps — they are sent to the user’s server upon installation, and run locally. API layers and trust bottlenecks between Web3 applications and blockchains are not needed if blockchains are built on Urbit, as the network provides a sensible common language for all applications, even if they are hosted on different servers. And above all, the most important factor in keeping blockchains decentralized is user-run validators, which can be considered no different from any other application on a robust and user-friendly personal server.

Beyond this, Urbit promises to add still more value to Web3 in the domain of global integration. The need for middleware to connect components on and off the chain is said by some in the Urbit community to be a symptom of a deeper problem: the lack of a sensible, unified execution environment shared between applications. In short, crypto needs an OS, and Urbit can be that OS.

The accelerating power of crypto on an OS that speaks its native tongue is much discussed and speculated on in the Urbit community. Uqbar, the first blockchain native to Urbit’s network, aims to obviate any need for middleware by using Urbit as a general purpose orchestrator to synchronize data between disparate components, whether blockchains or ordinary local state. Their solution uses zero-knowledge proofs, sharding, and other bleeding edge technologies to create a crypto ecosystem on Urbit that can not only compete with the best of the L1s elsewhere, but add features that prove indisputably that Urbit is the true home of Web3.

Uqbar is hard at work developing their tooling and plans to release a public testnet in the very near future. Will it revolutionize the industry the way its developers claim? In that domain, only theories and speculation can provide an answer. But their argument is worth a glance for anyone interested in emerging technologies in crypto.

Urbit Now and in the Future

Much is promised here and elsewhere about the potential for Urbit to take the world by storm and bring about a new era of user-owned computing. Nevertheless, if you try Urbit today, you will see a friendly, somewhat minimal interface for text chat and an ecosystem of experimental applications. You may find Urbit’s promises wanting in the domains of zero-maintenance servers, competitive UX, and perhaps even the avoidance of sysadminship. Regrettably, it is not yet possible to run a Bitcoin node on Urbit in the one-click way that it should be.

Urbit is exciting to early adopters not because of what you can do with it right now, but because of what it can enable after the necessary steps are taken. And in contrast to the Urbit of even two or three years ago, the necessary steps are well-understood and waiting in queue. The revolution in computing is no longer “how?” but “when?” for the Urbit community.

Today, Urbit is a simple and clean tool for chatting with friends, playing games, and experimenting with new ideas. More than anything else, the Urbit of today is a tool for doing what its users care about most: building Urbit. If you’d like to get involved, the community would love to have you. If you’d rather observe from the outside, keep a keen eye out. Big things are coming in the near future for Urbit and Web3, and you don’t want to miss out.

Conclusion

Urbit’s value proposition is long-winded enough that it won’t fit into a tweet or a TV commercial, but it’s promising enough to excite developers who share our vision of the future and want to play a part in building it. When it matures as a product, rethinking Unix and the internet won’t be included in the pitch. Urbit will be a service you can buy, either as a subscription or a physical product, that enables you to use apps that are just plain better than the ones you used to use.

Much remains unclear about what happens between now and then, but crypto and Web3 enthusiasts will have many reasons to get involved before you start seeing ads on the television to buy an Urbit planet. Urbit offers a comfortable home to idealists who believe in cryptographic ownership and share a concern about the future of humans and technology, and the next generation of early adopters is sure to include a wide cohort from that audience.

Now that you understand Urbit’s core value proposition, stay tuned for an exposition into the details of Urbit’s capabilities as a platform, its integrations with crypto, and a deeper dive into the promise of Uqbar to reshape the landscape of blockchain development.

Chorus One announces MEV-Boost support
The Ethereum Merge is one of crypto’s most anticipated events.
August 12, 2022
5 min read

The Ethereum Merge is one of the most anticipated events in crypto history.

The transition, meant to take Ethereum from its current Proof-of-Work consensus mechanism to a Proof-of-Stake model, has been in the works since Ethereum’s inception. However, it took its first step in December 2020, when the Beacon Chain was successfully launched. And now, with the consensus mechanism running unimpeded for a year and a half and over 13 million staked ETH, developers feel confident enough to move to the second step. This requires joining the consensus layer of the Beacon Chain with the execution state of the main Ethereum chain, the process known as “the Merge”.

This new era of the Ethereum protocol brings better security and greater energy efficiency, and sets the stage for future scaling efforts meant to take Ethereum to the moon.

Chorus One prepares for the Merge

Chorus One has been closely following the development efforts to bring Proof-of-Stake Ethereum to reality. As a trusted staking provider in the ecosystem, we are participating in testing the Merge at this critical point, with our Prater/Goerli nodes ready for the transition. We are particularly aware of the risks associated with such a significant operational change to a blockchain that has captured a major part of the economic activity in the crypto ecosystem. For that reason, our goal remains to support decentralised networks, to promote the security and availability of our services, and to increase the rewards of our clients.

As we think about the future of our operations in the Ethereum ecosystem, and about the existential threats that could compromise the integrity and stability of the network, we have devoted a lot of effort to understanding MEV and clarifying our position on it.

MEV-Boost

On our path to support a more decentralised, democratic and fair distribution of MEV rewards for our stakers, we would like to announce our support for MEV-Boost.

Although MEV continues to be a controversial and cutting-edge space for research, we believe MEV-Boost can serve as an interim solution while we wait for more sophisticated in-protocol upgrades. At a high level, MEV-Boost is an implementation of proposer-builder separation (PBS) built by the Flashbots team for Proof-of-Stake Ethereum. Because it is free, open-source, and neutral software, we believe it embraces the values of the Ethereum community and can be a valuable asset for all validators, big or small.

Why run MEV-Boost

By participating in the fair extraction of MEV, we believe we are unlocking the real value of the networks we support, increasing the value of staking to promote higher rates of participation, and strengthening the security of the PoS protocol.

As staking providers, running MEV-Boost allows us to maximize the staking rewards of our clients while protecting Ethereum decentralization, with an estimated increase of 60% in the rewards we can share.

Unlike previous Flashbots offerings, this software is compatible with all client implementations of the Ethereum protocol, making it a big step towards further client diversity, a topic that has been a subject of research at Chorus One over the past year.

Finally, we are committed to evaluating and monitoring different approaches to MEV extraction, and to the risks posed by single-relay and single-block-producer setups, working with different teams to find the most balanced system. Fair MEV extraction is something we will continue to iterate on going forward.

The way forward

In the coming days we will be getting ready to test MEV-Boost on our Goerli infrastructure to best prepare in time for the mainnet Merge. We have been working closely with Flashbots and collaborating with other node operators to ensure that the product is ready and tested by the time it goes live.

MEV is an inevitable part of participating not only in blockchains, but in all ordered economic systems. Our intent is to be responsible participants in Ethereum and beyond; with MEV research spanning Solana and Cosmos, there is more to come. For the time being, follow our node readiness for MEV-Boost here.

About Chorus One

Chorus One is one of the largest staking providers globally. We provide node infrastructure and closely work with over 30 Proof-of-Stake networks.

Website: https://chorus.one
Twitter: https://twitter.com/chorusone
Telegram: https://t.me/chorusone
Newsletter: https://substack.chorusone.com
YouTube: https://www.youtube.com/c/ChorusOne

Chorus One announces support for MEV-Boost
The Ethereum Merge is one of the most anticipated events in crypto history.
August 10, 2022
5 min read

The Ethereum Merge is one of the most anticipated events in crypto history.

The transition, meant to take Ethereum from its current Proof-of-Work consensus mechanism to a Proof-of-Stake model, has been in the works since Ethereum’s inception. However, it took its first step in December 2020, when the Beacon Chain was successfully launched. And now, with the consensus mechanism running unimpeded for a year and a half and over 13 million staked ETH, developers feel confident enough to move to the second step. This requires joining the consensus layer of the Beacon Chain with the execution state of the main Ethereum chain, the process known as “the Merge”.

This new era to the Ethereum protocol brings better security, greater energy efficiency, and sets the stage for future scaling efforts meant to take Ethereum to the moon.

Chorus One prepares for the Merge

Chorus One has been closely following the development efforts to bring Proof-of-Stake Ethereum to reality. As a trusted staking provider in the ecosystem, we are participating in testing the Merge at this critical point with our Prater/Goerli nodes ready for transition. We are particularly aware of the risks associated with such a significant change of operations in a blockchain that has captured a major part of the economic activity in the crypto ecosystem. For that reason, our goal remains to support decentralised networks to promote the security and availability of our services, and to increase the rewards of our clients under such a standard.

As we think of the future for both our operations in the Ethereum ecosystem and the existential threats that can compromise the integrity and stability of the network, we have devoted a lot of effort into understanding MEV and clarifying our position towards it.

MEV-Boost

On our path to support a more decentralised, democratic and fair distribution of MEV rewards for our stakers, we would like to announce our support for MEV-Boost.

Although MEV continues to be a controversial and cutting-edge area of research, we believe this can serve as an interim solution while we wait for more sophisticated in-protocol upgrades. At a high level, MEV-Boost is an implementation of proposer-builder separation (PBS) built by the Flashbots team for Proof-of-Stake Ethereum. As free, open-source, and neutral software, it embraces the values of the Ethereum community and can be a valuable asset for all validators, big or small.

Why run MEV-Boost

By participating in the fair extraction of MEV, we believe we are unlocking the real value of the networks we support, increasing the value of staking to promote higher rates of participation, and strengthening the security of the PoS protocol.

As a staking provider, running MEV-Boost allows us to maximize our clients’ staking rewards while protecting Ethereum’s decentralization, with an estimated 60% increase in the rewards we can share.

Unlike Flashbots’ previous offerings, this software is compatible with all client implementations of the Ethereum protocol, making it a big step towards further client diversity, a topic that has been a subject of our research at Chorus One over the past year.

Finally, we are committed to evaluating and monitoring different approaches to MEV extraction, as well as the risks of single-relay and single-block-producer setups, working with different teams to find the most balanced system. Fair MEV extraction is something we will continue to iterate on going forward.

The way forward

In the coming days we will be getting ready to test MEV-Boost on our Goerli infrastructure to best prepare in time for the mainnet Merge. We have been working closely with Flashbots and collaborating with other node operators to ensure that the product is ready and tested by the time it goes live.

MEV is an inevitable part of participating not only in blockchains, but in all ordered economic systems. Our intent is to be a responsible participant in Ethereum and beyond: with MEV research spanning Solana and Cosmos, there is more to come. For the time being, follow our node readiness for MEV-Boost here.

About Chorus One

Chorus One is one of the largest staking providers globally. We provide node infrastructure and work closely with over 30 Proof-of-Stake networks.

Website: https://chorus.one
Twitter: https://twitter.com/chorusone
Telegram: https://t.me/chorusone
Newsletter: https://substack.chorusone.com
YouTube: https://www.youtube.com/c/ChorusOne

A CEX vs DEX comparison: Why is dYdX moving to Cosmos?
All the reasons why dYdX has opted to build its own blockchain in the Cosmos.
August 9, 2022
5 min read

This article provides a brief comparative analysis of centralised exchanges (CEXs) and decentralised exchanges (DEXs), followed by a comparison of several DEXs to understand why dYdX has opted to join the Cosmos ecosystem. The primary reasons are increased decentralisation, higher throughput, and a developer-friendly SDK.

PART 1 — A DEEP DIVE INTO EXCHANGES AND HOW TRADES ACTUALLY HAPPEN

To help you grasp the subsequent arguments and comparisons, we’ll first go through the key distinctions between a CEX and a DEX. If you already know the distinctions, skip to Part 2 of this article.

Centralised Exchange (CEX): a type of cryptocurrency exchange operated by the company that owns it, in a centralised manner. Liquidity is supplied by traders in the form of orders (the order-book model), and the exchange keeps all the assets involved in its custody (e.g. Binance, Coinbase). This gives the CEX a significant advantage, since order placement, matching, and settlement can happen immediately off-chain (even if you then have to transact on-chain to move assets to a personal wallet).

Decentralised Exchange (DEX): a platform for exchanging cryptocurrencies based on functionality programmed on the blockchain (i.e., in smart contracts). Trading happens peer-to-peer or against liquidity pools. In pool-based DEXs such as Uniswap, liquidity is supplied by the users themselves, who contribute tokens to a specific pool in return for the fees paid by everyone who swaps against it.

Whereas on a CEX the user trades directly with the platform and can purchase a token of their choice with fiat where markets are available, the scenario is somewhat different on a DEX: to get a token, a user must swap another token for it against a liquidity pool of those two assets.

Centralised exchanges enable buyers and sellers to submit bids and asks for specific assets (e.g. cryptocurrencies) via order books. Order books also exist on some decentralised exchanges, where a user may likewise submit a bid or an ask. More often, however, we see an alternative model in which a user can trade without a direct counterparty via an automated market maker (AMM). An AMM prices an asset at every moment using the “constant product” formula x * y = k, without bids and asks having to be actively placed. This works because market making on a decentralised exchange is done by “liquidity providers” (LPs): an LP deposits assets into a smart contract and authorises the contract to be traded against, receiving fees in proportion to the liquidity they contribute relative to the whole pool. Taking Uniswap as an example, a user holding ETH can swap it for the tokens of Compound, Curve, and many other projects; in doing so, they pay a 0.3 percent fee directly to the pool where the swap occurred, which is then distributed among all liquidity providers.
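The constant-product mechanics can be sketched in a few lines of Python. This is illustrative only: the `swap_out` helper and the reserve numbers are invented for the example, but the x * y = k invariant and the 0.3% input fee follow Uniswap V2’s published design.

```python
def swap_out(x_reserve: float, y_reserve: float, dx: float, fee: float = 0.003) -> float:
    """Amount of token Y received for dx of token X under x * y = k.

    The LP fee (0.3% on Uniswap V2) is deducted from the input before
    the constant-product invariant is applied.
    """
    dx_after_fee = dx * (1 - fee)
    k = x_reserve * y_reserve           # invariant before the trade
    new_x = x_reserve + dx_after_fee    # pool takes in the (fee-adjusted) input
    new_y = k / new_x                   # invariant fixes the new Y reserve
    return y_reserve - new_y            # the difference goes to the trader

# Hypothetical pool: 100 ETH and 150,000 USDC (spot price 1,500 USDC/ETH).
# Selling 1 ETH returns slightly less than 1,500 USDC: the trader pays
# both the fee and the price impact of shifting the pool's ratio.
usdc_out = swap_out(100.0, 150_000.0, 1.0)
```

Note that the output is strictly below the spot price even with the fee set to zero: moving the reserves along the x * y = k curve is what makes large trades progressively more expensive.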

Other significant differences between CEX and DEX

  1. Listed coins/tokens: DEXs offer a significantly wider range of tradeable assets, because listing coins/tokens on such exchanges requires very little verification. Users may trade almost any asset on a DEX, but how can they know which assets are safe to trade? Centralised exchanges, by contrast, generally trade only a restricted variety of assets, as listing an asset on a CEX involves several vetting steps.
  2. Governance: This is where a DEX differs most from a CEX. Governance, and therefore decision-making for most public matters, is in the hands of users and holders, who express their view by voting with the governance token (UNI or DYDX). On a centralised exchange, by contrast, decisions lie with the platform owners and whitelisted parties, who make their own judgments.
  3. KYC: Centralised exchanges are always under the radar of governments and regulators, so their users must pass various kinds of identity verification before using the platform. Decentralised exchanges, as the name suggests, are decentralised: in principle, no entity can monitor their activity, so no KYC or similar verification is needed to use them.
  4. Ownership of assets: Users of centralised exchanges do not own their assets; the exchange is the true holder of the private keys, and users choose to trust it with them. Decentralised exchanges, however, do not keep your funds. Users connect their own wallet and start trading, so they remain the actual owners of their assets.
  5. Availability: Centralised exchanges are run by third parties, and such systems can collapse at any time; it has happened before. Many CEXs, for example, restrict user access during market crashes to reduce their own losses, as we have seen with Celsius and others in the current market conditions. DEXs, by contrast, have no intermediaries and remain open no matter what happens in the market, although their infrastructure and interfaces can still go down.
  6. Ease of use: Centralised exchanges are more user-friendly: users do not need to bother with creating wallets or connecting them to the exchange. Decentralised exchanges’ interfaces, conversely, offer limited (though expanding) options, and trading on a DEX is harder for new traders.
  7. Security of traded assets: Centralised exchanges often have rigorous procedures for adding new assets, which reduces the hazard of risky projects. Decentralised exchanges lack such standards, leaving users with more responsibility to assess the security of the projects they trade.
  8. Security of funds: Centralised exchanges hold users’ private keys and are a target for external hacking, though some of them provide insurance. Decentralised exchanges never take custody of assets, so users cannot lose their funds in this manner.

Orderbook vs Liquidity Pools

Now that we’ve established the primary distinctions between CEXs and DEXs, we’ll look at two sorts of exchange transaction mechanisms that are frequently observed on these exchange platforms.

What is an order book in crypto?

The order-book model is the foundation of many CEXs and of some DEXs, dYdX among them. All orders to buy and sell a token are labelled “bid” and “ask” in the order book. The spread is the difference between the highest bid and the lowest ask at the top of the book. An order to buy or sell immediately at the best available price is a market order, and the buyer and seller are matched from the top of the book. A limit order, on the other hand, buys or sells a token at a specified price, and rests on the order book until it is matched.

Pros:
  • This technique works well in liquid markets with a wide range of buyers, sellers, and market makers.
Cons:
  • It works poorly in illiquid marketplaces, where the book is thin and the gap between the highest bid and the lowest published ask is too wide for trades to clear at a reasonable price.
  • Miners can see your transactions, since you must publish them to the blockchain before the order lands on the DEX. If a miner forecasts that your order will push a token’s price up, they can make an easy profit by inserting their own purchase ahead of yours in the block (MEV frontrunning).
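The bid/ask/spread mechanics above can be made concrete with a toy order book, sketched below in Python. The class and method names are hypothetical, and a real matching engine would also handle time priority, cancellations, and order IDs; this only shows how best bid, best ask, spread, and market-order matching relate.

```python
import heapq

class OrderBook:
    """Toy limit-order book: bids in a max-heap, asks in a min-heap."""

    def __init__(self):
        self._bids = []  # entries: (-price, size), so the heap top is the highest bid
        self._asks = []  # entries: (price, size), so the heap top is the lowest ask

    def limit_order(self, side: str, price: float, size: float) -> None:
        """Post a resting order to the book."""
        if side == "bid":
            heapq.heappush(self._bids, (-price, size))
        else:
            heapq.heappush(self._asks, (price, size))

    def best_bid(self) -> float:
        return -self._bids[0][0]

    def best_ask(self) -> float:
        return self._asks[0][0]

    def spread(self) -> float:
        return self.best_ask() - self.best_bid()

    def market_buy(self, size: float) -> float:
        """Fill immediately against the best asks; return total cost paid."""
        cost = 0.0
        while size > 0:
            price, available = heapq.heappop(self._asks)
            fill = min(size, available)
            cost += fill * price
            if available > fill:  # put the unfilled remainder back on the book
                heapq.heappush(self._asks, (price, available - fill))
            size -= fill
        return cost

# Hypothetical book: one bid at 99, asks at 101 and 102.
book = OrderBook()
book.limit_order("bid", 99.0, 5.0)
book.limit_order("ask", 101.0, 5.0)
book.limit_order("ask", 102.0, 5.0)
```

A market buy of 7 units here would consume the whole 101 level and part of the 102 level, which is exactly the "walking the book" behaviour that makes thin markets expensive to trade in.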

What are Liquidity Pools in crypto?

A liquidity pool is a collection of funds deposited by LPs into a smart contract. AMM transactions let you buy without a matching seller, as long as the pool holds enough liquidity; your trade simply shifts the token ratio, and the algorithm reprices the pair accordingly. This approach needs no order book, although both models ultimately operate on a peer-to-peer basis.

Pros:
  • Liquidity is available from the pool at any time, without waiting for a counterparty’s order to match
  • Automated pricing reduces the need to acquire data from exchanges to calculate asset prices
Cons:
  • High slippage on big orders, which necessitates gigantic pools. Uniswap V3 reduced this problem by implementing concentrated liquidity: liquidity providers concentrate liquidity around the most likely trading prices rather than spreading it across the entire price range.
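The slippage problem, and why big orders need gigantic pools, can be checked numerically with the same constant-product formula (pool sizes below are hypothetical, and fees are ignored for clarity):

```python
def slippage(x_reserve: float, y_reserve: float, dx: float) -> float:
    """Fractional price impact of a constant-product swap, ignoring fees."""
    spot_price = y_reserve / x_reserve
    amount_out = y_reserve - (x_reserve * y_reserve) / (x_reserve + dx)
    execution_price = amount_out / dx
    return 1 - execution_price / spot_price

# The same 10 ETH sale against a small pool and one 100x larger:
small_pool = slippage(100, 150_000, 10)        # roughly 9% worse than spot
large_pool = slippage(10_000, 15_000_000, 10)  # roughly 0.1% worse than spot
```

The 100x-deeper pool cuts the price impact by about two orders of magnitude, which is the effect concentrated liquidity approximates by packing deposits into the actively traded price range instead of requiring a huge pool.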

We are also now starting to see the rise of hybrid initiatives which combine AMMs and orderbook models in an attempt to extract the best of both worlds. The Cosmos ecosystem is beginning to stand out in this area too, with upcoming protocols such as Onomy.

PART 2 — ANALYZING THE TOP DEXES

The cumulative decentralised exchange volume for the past 7 days stands at $10 billion. Uniswap, yet again, led the pack in trading volume.

dYdX’s current trading volume closely resembles Uniswap’s, and it ranks 10th among DeFi projects by token holders.

However, it is worth noting that the ratio of DEX to CEX spot volume stood at a mere 13% in June, down from 16% in January. Binance, with significantly lower fees, still dominates the market ($11bn 24-hour volume). This data clearly highlights that decentralised exchanges are still merely complementing centralised exchanges, which account for the lion’s share of trading volume.

Despite this, Uniswap has repeatedly surpassed Coinbase in trading volume in the past. In terms of token availability, it dominates, with 430 verified coins on V3 and over 8,000 trading pairs on V2.

https://xangle.io/en/research/62c28da8534a07d0b2ffb715

While Binance currently supports trading in more than 600 coins, Uniswap V3 has significantly more liquidity than Coinbase and Binance. However, this liquidity is specific to Ethereum and its many pairs.

https://ambcrypto.com/how-uniswap-uni-dominates-binance-coinbase-in-terms-of-liquidity/

Uniswap provides double the liquidity of Binance and Coinbase for ETH/USD, 3x the liquidity of Binance and 4.5x the liquidity of Coinbase for ETH/BTC, and three times the liquidity of the large centralised exchanges for ETH/mid-cap pairs. NB: decentralised exchanges need deeper liquidity to avoid considerable slippage on big trades.

dYdX vs Uniswap

dYdX and Uniswap are both DEXs that operate on the Ethereum blockchain.

What is Uniswap?

Uniswap is an open-source DeFi platform that employs an automated liquidity protocol instead of an order book. Liquidity providers (LPs) fund its pools, with no listing costs; a market for any ERC-20 token can be created as long as a liquidity pool is available for traders.

Uniswap has two kinds of smart contracts: Factory and Exchange. Factory contracts introduce new tokens to the network, while Exchange contracts handle the token swaps. When a liquidity provider deposits a pair of tokens into a smart contract, other users may buy and sell that trading pair, and the liquidity provider receives a cut of the trading fee.

What is dYdX?

dYdX is a non-custodial decentralised exchange that executes trades via Ethereum smart contracts. This allows traders to trade on margin while simultaneously benefitting from Ethereum’s security.

dYdX teamed up with StarkWare to create a Layer 2 protocol, letting traders deposit funds and trade instantly without incurring transaction costs. After China reiterated its stance on banning cryptocurrency, daily trading volume on dYdX surged to nearly $10 billion, beating Uniswap for the first time in September 2021. dYdX later lost a significant amount of its market share to competition and outage problems that called the protocol’s integrity into question. Despite this, being the first perpetuals DEX protocol to implement a Layer 2 solution has certainly paid off.

Derivatives trading is dYdX’s trademark. Compared with spot trading, derivatives offer more application possibilities: they help customers adapt to changing market trends, increase profits, hedge risks, improve resource allocation, and more. Derivatives trading is projected to bring in new users, inject fresh liquidity into the market, and lay the groundwork for a fresh DeFi breakout.

Recently, dYdX announced that the protocol is moving to Cosmos to build its own native chain on the Cosmos SDK and Tendermint Proof-of-Stake, with the hope of regaining the market dominance it once had.

PART 3 — WHY IS dYdX MOVING TO COSMOS?

Here is how and why the move is set to achieve full decentralisation, seeking to solve the problems dYdX had in the past:

Cosmos makes it easy to establish a blockchain with cross-chain capabilities, leveraging the Tendermint Proof-of-Stake consensus engine. Cosmos is decentralised and customisable, and each Cosmos chain has its own validators and staking token. Alternative L1s and L2s would not suit dYdX because they cannot handle the throughput it requires (10 trades per second and 1,000 order placements/cancellations per second).

Because app-specific chains in Cosmos are not dependent on other protocols in the network, network congestion experienced in Ethereum is not a concern. Projects can also benefit from Interchain Security from the Cosmos Hub to increase stability and security.

dYdX contemplated building an AMM or RFQ system, but realized that an order-book-based protocol is essential for professional traders and institutions. As such, dYdX concluded that any improvement requires a decentralised off-chain network to handle the order book.

While Serum on Solana does put the order-book exchange on-chain, Solana trades decentralisation for greater speed. dYdX wishes to achieve faster transaction processing while maintaining decentralisation, which is a tough task. Enter Cosmos.

Developing its own blockchain for dYdX V4 allows full customization of how the chain functions and of validators’ duties. As indicated, a Cosmos chain can be tailored to the dYdX network’s demands. Traders would pay fees based on trades executed, comparable to dYdX V3 or other centralized exchanges. The move will also bring greater utility to the currently governance-only $DYDX token.

Comparison between Cosmos and StarkWare/L2s

What is Cosmos SDK?

One of the most distinctive aspects of Cosmos is its SDK. The Cosmos SDK is a collection of tools and frameworks created by the Cosmos team; developers use it to build the application-logic layer. Combined with Tendermint Core and ABCI, it gives access to an existing consensus engine and networking layer.

Some of the benefits include the ease with which the essential ABCI methods, the storage layer, cryptographic features, and client apps may be implemented in Go. It also offers on-chain governance and management of user accounts, keys, and transaction balances, among other things.
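The division of labour behind ABCI (the consensus engine orders transactions; the application owns all state transitions behind a narrow interface) can be illustrated with a deliberately simplified sketch. To be clear, these are not the real ABCI method signatures or the Cosmos SDK API; the class, the `key=value` transaction format, and the method shapes are invented for illustration, merely echoing ABCI’s CheckTx/DeliverTx/Commit phases.

```python
class KVStoreApp:
    """Toy application-logic layer in the spirit of ABCI.

    A consensus engine would decide transaction order and call these
    methods; the app itself never worries about networking or consensus.
    """

    def __init__(self):
        self.state = {}      # committed state, visible after commit()
        self._pending = {}   # writes from the block currently being processed

    def check_tx(self, tx: str) -> bool:
        # Mempool-time validation: accept only "key=value" strings.
        return "=" in tx

    def deliver_tx(self, tx: str) -> bool:
        # Execution-time state transition for one ordered transaction.
        if not self.check_tx(tx):
            return False
        key, value = tx.split("=", 1)
        self._pending[key] = value
        return True

    def commit(self) -> None:
        # Finalize the block: make its writes part of committed state.
        self.state.update(self._pending)
        self._pending.clear()

# A "block" of two transactions, applied in consensus order:
app = KVStoreApp()
app.deliver_tx("owner=alice")
app.deliver_tx("limit=100")
app.commit()
```

The point of the separation is that everything above could be swapped for an order-matching engine (as dYdX V4 intends) without touching the consensus machinery underneath.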

The SDK is extremely simple to use, and many of its features can be scaffolded in seconds from GitHub. You may also override existing methods with your own logic, which saves teams and developers a lot of time and energy when building projects. As an example, Kyve Network took less than a week to move over from Ethereum and have a base chain up and running; launching chains on other networks is generally much harder. Read more about why that is, here.

Lately, there have been reports that Cosmos chains incur a significant cost for chain security. This is not entirely correct: with an inflation rate of 8% and an average commission rate of 8%, validators receive about 0.6% of the token supply each year, which is hardly a lot. Furthermore, individuals enjoy staking because it increases their engagement: they lock up tokens while validators run the software and perform other services. It is not a high price to pay.
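That figure follows from simple arithmetic, assuming all staking rewards come from the 8% inflation:

```python
inflation = 0.08   # new tokens minted per year, as a share of total supply
commission = 0.08  # average validator commission taken from those rewards

# Validators keep their commission on the inflationary rewards; the rest
# flows to delegators.
validator_share = inflation * commission
print(f"{validator_share:.2%} of token supply per year")  # prints "0.64% of token supply per year"
```

So the security budget paid to validators is 0.64% of supply annually, which the text rounds down to 0.6%.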

The future of Ethereum Layer 2 and Ethereum 2.0 will increase performance, but the overwhelming assumption is that Ethereum will still prioritize security over speed. Solana, in comparison, is extremely quick, making it ideal for high-frequency trading systems. When it comes to combining performance and flexibility, a sovereign app-chain is the obvious choice.

A win-win move

By moving to Cosmos, dYdX will also bring a new group of customers to the Internet of Blockchains’ ecosystem: its 24-hour trading volume is presently over $2bn, compared with $15m on Osmosis, the network’s largest DEX. Additionally, as Messari’s recent article notes, StarkWare’s latest private-market valuation alone was $8 billion, while Cosmos’ current public-market valuation ($ATOM) is $2.9 billion. This certainly raises the question of a possible mismatch in value, especially if Cosmos starts to attract more L2s taking advantage of Ethereum’s slow-moving development.
