Welcome Avatar!
If you read today’s headline and thought “what the heck does that mean?” you’re in luck. This article explains exactly that.
Proto-Danksharding is the biggest improvement to Ethereum since the transition to Proof of Stake.
By providing a new temporary storage facility, Ethereum will be able to offer higher transaction capacity to Layer 2s built on top of it, while reducing costs.
DeFi hasn’t reached mass adoption yet. A major barrier is transaction capacity.
Whenever there is high demand for blockspace, prices quickly become uneconomic for basic transactions like sending money, taking a loan, or buying a token. Predatory actors like short-term traders and bots bid up the auction price for ‘gas’, the price to include a transaction in the next Ethereum block. And sometimes too many people want to swap NFTs and speculate on the latest meme token all at the same time.
To onboard the next billion users, blockchains need to scale. Layer 2 solutions, including optimistic rollups such as Arbitrum, Optimism, and Base, are part of Ethereum’s scaling roadmap. But there’s a catch.
Rollups still pay massive fees to the Ethereum blockchain to post their transaction batches for settlement on Layer 1. And these fees are passed on to users, although some Layer 2s include a subsidy.
Last week, the top 4 Ethereum rollups paid ~$1.7 million to store their transaction data on Layer 1. That’s nearly $90 million a year - reducing transaction fees by 90% would save around $81 million at *current* (low) levels of blockspace demand.
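The napkin math above, sketched in Python. The dollar figures are the article’s estimates, not live data:

```python
# Back-of-the-envelope savings from cheaper rollup data storage.
weekly_l1_data_fees = 1_700_000         # ~$1.7M paid by top 4 rollups last week
annual_fees = weekly_l1_data_fees * 52  # annualized, ~$88M

fee_reduction = 0.90                    # blobs target ~10x cheaper data
annual_savings = annual_fees * fee_reduction

print(f"Annualized L1 data fees: ${annual_fees / 1e6:.1f}M")
print(f"Savings at 90% reduction: ${annual_savings / 1e6:.1f}M")
```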
If a retail-driven bull market returns, we will likely still see prohibitively high gas fees even on Layer 2 rollups, pricing out ordinary users who don’t see the value in paying $10 to send a $500 payment or being charged $18 in fees to buy $100 of a token. Hundred dollar transaction fees on mainnet were a feature of the 2021 bull market, and being 10x cheaper isn’t much of an advantage if a basic transaction still costs $10.
We need blockchain transactions to cost ten cents at scale. This means being able to process thousands of transactions per second. Danksharding is how we get there.
Proto-Danksharding
Database sharding refers to creating a horizontal partition of a database, with each segment stored on a separate computer. This is a scaling method to allow a network of computers to process data in parallel. Danksharding is an Ethereum sharding proposal named for researcher Dankrad Feist, and Proto-Danksharding refers to an intermediate step proposed by a researcher known as Protolambda.
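A minimal sketch of the database technique the name borrows from: hash-based horizontal sharding, where each key is deterministically assigned to one of N shards so data can be stored and processed on separate machines in parallel. The key-value store here is hypothetical, purely for illustration:

```python
# Toy hash-based sharding: assign each key to one of NUM_SHARDS partitions.
import hashlib

NUM_SHARDS = 4

def shard_for(key: str) -> int:
    # Hash the key and map it to a shard index deterministically.
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# In a real system each shard would live on a different machine.
shards = {i: {} for i in range(NUM_SHARDS)}

def put(key: str, value: str) -> None:
    shards[shard_for(key)][key] = value

def get(key: str) -> str:
    return shards[shard_for(key)][key]

put("alice", "100 ETH")
put("bob", "5 ETH")
```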
Instead of having a single database which all Ethereum nodes store a copy of, the old Ethereum roadmap envisaged splitting Ethereum into 64 linked blockchains to improve scalability. These “shard chains” are no longer part of the Ethereum roadmap - instead data “blobs” will be stored in the consensus layer using distributed data sampling.
Proto-Danksharding is a specification for providing cheap temporary storage for data blobs on Ethereum’s consensus layer. This means lower costs to use Layer 2s. And temporary storage can be implemented more quickly than full sharding, bringing those lower costs to Layer 2s sooner.
Blocks and Blobs
The basic unit of a blockchain is a block, consisting of an index number (slot), the hash of the previous block (creating a block chain), some state information, and an execution payload consisting of transactions (sending Ether, interacting with smart contracts). So far so simple. But what if you need to store data?
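The structure described above can be sketched as a toy block. This is an illustration of the chaining idea, not Ethereum’s actual block encoding:

```python
# A toy block: slot number, parent hash (the "chain" part), and an
# execution payload of transactions.
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class Block:
    slot: int
    parent_hash: str
    transactions: list = field(default_factory=list)

    def hash(self) -> str:
        # Hash the block contents; any tampering changes this value.
        payload = json.dumps(
            {"slot": self.slot, "parent": self.parent_hash, "txs": self.transactions},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

genesis = Block(slot=0, parent_hash="0" * 64)
block1 = Block(slot=1, parent_hash=genesis.hash(),
               transactions=["alice -> bob: 1 ETH"])
```

Because each block commits to its parent’s hash, changing any historical transaction invalidates every block after it.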
As we’ve covered before (decentralized storage), Ethereum wasn’t designed for storing pictures, videos, music, and other large computer files. This is why decentralized storage systems like Arweave and IPFS exist (and also why the images for NFT art are often hosted on a basic web server instead of being stored on-chain).
How does this relate to Layer 2 rollups?
Rollups batch transactions off-chain (currently using a centralized server called a sequencer) and periodically submit these batches of transactions to layer 1 Ethereum. The transaction data needs to be available during the challenge window for fraud proofs, but afterwards only the state change needs to be stored - in other words the addresses which sent and received tokens - not all the other data required to prove fraud. However, Ethereum currently has no temporary storage feature. All the Layer 2 batch data currently needs to be stored by every Ethereum validator…permanently.
Temporary Storage for Blobs
Ethereum Improvement Proposal 4844 (EIP-4844) allows rollups to add data to blocks cheaply by adding a temporary storage facility. Technically this means a new transaction type will be created that holds an additional data field called a blob.
The data type is just a binary large object (blob), up to ~125KB in size. The Ethereum Virtual Machine does not need to process the data at all. This saves on global compute costs.
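Where the ~125KB figure comes from: per the EIP-4844 spec, a blob is 4096 field elements of 32 bytes each. Each element must be a valid BLS12-381 scalar, so slightly less than the raw 128KiB is usable for arbitrary data:

```python
# Blob sizing per EIP-4844.
FIELD_ELEMENTS_PER_BLOB = 4096
BYTES_PER_FIELD_ELEMENT = 32

blob_size = FIELD_ELEMENTS_PER_BLOB * BYTES_PER_FIELD_ELEMENT
print(blob_size)           # 131072 bytes raw
print(blob_size // 1024)   # 128 KiB; usable payload is a bit under this
```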
Autist note: Forcing the EVM to interpret blob data would be equivalent to increasing the block gas limit, a brute force way to allow more transactions into a block. The reason we don’t do this is that it hurts decentralization: validator nodes that can process larger blocks quickly require expensive modern CPUs. Remember that a block needs to be produced every 12 seconds, so there is little time available to check and process transactions. Adding a requirement for more powerful specialist hardware to process more transactions centralizes the network, as ordinary people cannot stake and participate with their consumer grade devices. In our opinion this is why Solana never got significant traction despite boasting tens of thousands of transactions per second (tps) capability.
Blobs don’t live on the execution layer (blockspace); instead, they are shared via the consensus layer and are automatically pruned after 1-3 months. This means the global state of Ethereum will not rapidly bloat with obsolete Layer 2 data dumps, which would otherwise force validator nodes to run specialist storage arrays once the data no longer fit on a single hard drive - harming decentralization.
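The retention-window idea can be sketched in a few lines. The window length below is illustrative (roughly one month of 12-second slots), not the protocol’s actual pruning horizon:

```python
# Toy consensus-layer blob pruning: keep blobs only for a retention
# window, then drop them, so storage doesn't grow forever.
RETENTION_SLOTS = 30 * 24 * 60 * 60 // 12  # ~1 month of 12-second slots

blob_store = {}  # slot -> blob data

def add_blob(slot: int, blob: bytes) -> None:
    blob_store[slot] = blob

def prune(current_slot: int) -> None:
    # Drop every blob older than the retention window.
    cutoff = current_slot - RETENTION_SLOTS
    for slot in [s for s in blob_store if s < cutoff]:
        del blob_store[slot]

add_blob(0, b"old rollup batch")
add_blob(500_000, b"recent rollup batch")
prune(current_slot=500_000)
# slot 0 is far older than the window, so it has been dropped
```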
In short, EIP-4844 means that we can increase the number of transactions per second that Layer 2s are capable of processing, which increases the size of their rollup data dumps, but without increasing the cost to use Ethereum as temporary storage.
EIP-4844 also includes all the execution and consensus layer logic required for full sharding, and a marketplace for pricing blob storage. Full sharding will require additional work including implementing data availability sampling and proposer-builder separation (allowing a specialist participant with powerful hardware to build 32MB blob bundles and the selected proposer to simply pick the bundle with the highest fee).
Implications for DeFi
The main relevance is transaction costs for users. Here is a summary of Ethereum Layer 1 gas (transaction) costs for the last 7 days broken out by app.
Note that these are blockchain fees - they do not include fees to use the app (e.g. liquidity provider fees on DEX transactions, OpenSea fees, etc).
Making these fees 10x cheaper would allow arbitrageurs to correct smaller price discrepancies, making DEXs more liquid and less expensive to trade on.
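Rough intuition for why cheaper gas tightens prices: an arbitrage trade is only profitable if the price gap it closes exceeds the fixed gas cost. The trade size and fees below are hypothetical:

```python
# Smallest price discrepancy (as a fraction of trade size) worth correcting.
def min_profitable_gap(gas_cost_usd: float, trade_size_usd: float) -> float:
    return gas_cost_usd / trade_size_usd

# Hypothetical $10,000 arb trade:
print(min_profitable_gap(20.0, 10_000))  # $20 gas -> 0.2% gap needed
print(min_profitable_gap(2.0, 10_000))   # 10x cheaper gas -> 0.02% gap
```

Cutting gas 10x cuts the minimum correctable gap 10x, so prices across venues stay closer together.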
Token swaps currently cost ~$0.11 on Arbitrum and $0.05 on Optimism. Making these fees 10x cheaper while allowing the network to scale will allow users to use the chain even during busy periods for ‘nearly free’. (Who counts pennies per transaction?)
Arbitrum, Optimism, and Base can benefit from the increased adoption which lower fees should attract.
Currently, profits from the existing centralized sequencer are passed on to the respective foundations of Arbitrum and Optimism. However, both have plans to move towards a decentralized sequencer mechanism. Future value capture to the tokens would be contingent on transaction fees and governance changes.
When decentralized, the sequencers (now potentially multiple and not just a single entity) could capture MEV. This MEV could then be distributed among stakers, adding an additional revenue stream for token holders. Essentially, by staking their tokens, holders could earn a share of the MEV generated on the Layer 2 network in the future. If MEV becomes a significant revenue source, it could drive demand for the Layer 2 tokens. More people would want to stake their tokens to get a share of the MEV, potentially driving up the token's price.
Value capture for crypto tokens also often spawns new layers built on top of those tokens: products that enhance composability and yield.
Optimism x Base
If you haven’t read our post on Base yet, we highly recommend you do so.
Optimism unveiled new details about their collaboration with Base earlier today. Here are our key takeaways from the post.
As part of a fee share agreement, Optimism will earn the greater of (a) 2.5% of Base’s total sequencer revenue, or (b) 15% of Base’s net onchain sequencer revenue (L2 transaction revenue minus L1 data submission costs)
Base can earn up to 118 million tokens over the next six years ($185 million at current price), subject to a cap of 9% of total votable supply for use towards governance
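The fee-share formula can be sketched directly; the revenue numbers below are hypothetical:

```python
# Optimism earns the greater of 2.5% of gross sequencer revenue or
# 15% of net revenue (gross minus L1 data posting costs).
def optimism_fee_share(gross_revenue: float, l1_data_costs: float) -> float:
    net_revenue = gross_revenue - l1_data_costs
    return max(0.025 * gross_revenue, 0.15 * net_revenue)

# Hypothetical: $10M gross revenue, $4M in L1 data costs.
# 2.5% of gross = $250k; 15% of net ($6M) = $900k -> Optimism takes $900k.
share = optimism_fee_share(10_000_000, 4_000_000)
```

Note the gross-revenue floor kicks in when L1 data costs eat most of the margin: with $9.5M of data costs, 15% of the $0.5M net is only $75k, so Optimism would take 2.5% of gross ($250k) instead.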
That means Optimism will directly benefit from higher activity on Base, and Base has even more skin in the game as a result of its OP holdings. Our question is how, if at all, this value would flow through to Coinbase. If the OP token outperforms, does this drop down to Coinbase’s earnings, or does it get captured at the “Base” level and never flow through to Coinbase? A 5-10x for OP over 6 years is certainly within the realm of possibility and would represent a windfall for Coinbase.