For one, Visa just does funds transfers. If Ethereum just transferred ETH, it could do 700 tx/sec, based on the gas limit per block divided by the 21K gas for a simple ETH transfer. A lot of Ethereum transactions are more complex and use more gas.
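For reference, here's the back-of-the-envelope arithmetic behind that figure (the 15M block gas limit is an assumption on my part; a plain ETH transfer costs 21,000 gas):

```python
# Back-of-the-envelope: how many simple ETH transfers fit in one block?
# Assumed block gas limit of 15,000,000; a plain transfer costs 21,000 gas.
GAS_LIMIT_PER_BLOCK = 15_000_000
GAS_PER_SIMPLE_TRANSFER = 21_000

transfers_per_block = GAS_LIMIT_PER_BLOCK // GAS_PER_SIMPLE_TRANSFER
print(transfers_per_block)  # 714 -- the source of the ~700 figure
```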
Second, second-layer "rollups" already support transaction rates up to a couple thousand per second, without losing base-layer security guarantees. Essentially they compress the transactions on chain. (Actual traffic is much lower than that, so far.)
The longer-term plan is to move to rollups in a big way, and add a form of data sharding to multiply the on-chain data storage. That's where the 100K/sec comes from.
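As a rough sketch of where a number like 100K/sec could come from (the per-transaction byte count and the sharded-data throughput below are illustrative assumptions, not official protocol targets):

```python
# Illustrative estimate: with data sharding, throughput is bounded by how much
# rollup call data the chain can make available, not by execution.
# Assumed figures: ~16 bytes per compressed rollup transaction, and roughly
# 16 MB of shard data per 12-second slot (ballpark numbers for illustration).
BYTES_PER_ROLLUP_TX = 16
SHARD_DATA_PER_SLOT = 16 * 1024 * 1024  # bytes
SLOT_SECONDS = 12

tx_per_slot = SHARD_DATA_PER_SLOT // BYTES_PER_ROLLUP_TX
print(f"{tx_per_slot / SLOT_SECONDS:,.0f} tx/sec")  # ~87,381 -- the 100K ballpark
```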
> If Ethereum just transferred ETH, it could do 700 tx/sec
So... Still an order of magnitude less than Visa.
> A lot of Ethereum transactions are more complex and use more gas.
How complex are they, are they a significant number, and does this explain the abysmal transaction rates?
Visa aside, on Singles Day AliPay does 1 to 1.5 billion transactions in one day.
Okay. AliPay is an outlier. Klarna does about 2 million transactions per day, an average of 23 per second (it's significantly higher during peak hours, and lower during off-peak hours; source: worked at Klarna). The transaction includes: client lookup (a complicated affair that differs from country to country and session to session), credit score calculation, order setup, payment setup (out of several dozen different options)... all while conforming to banking and local regulations across 45 countries. Is anything in ETH anywhere close to that, complexity-wise?
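For reference, that average is just the daily figure spread over the 86,400 seconds in a day:

```python
# Sanity check on the quoted Klarna average.
daily_tx = 2_000_000
seconds_per_day = 24 * 60 * 60  # 86,400
print(daily_tx / seconds_per_day)  # ~23.1 tx/sec on average
```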
> Second, second-layer "rollups" already support transaction rates up to a couple thousand per second
So layer 2 batches transactions into a batch that then gets batched on layer 1 into a bigger batch that is then written in one go to the blockchain.
Ethereum runs the EVM, which is a Turing complete execution environment. It can run things of arbitrary complexity.
> Congrats, you've reinvented batch processing.
This is an extreme over-simplification of zero-knowledge proofs. You would do well to dig deeper into what is being achieved instead of dismissing it without any understanding.
Most of the cost comes from the need to maintain consistency in state. This means that even simple database updates require a lot of work updating hash trees and verifying that other nodes reach the same results. The design of these hash trees isn’t terribly efficient, and indeed was in some ways designed to be deliberately inefficient in order to ensure that nodes didn’t fragment into running specific contracts.
The good news is that there's a lot of room for improvement in the design. The bad news is twofold: (1) efficiency improvements will eventually cause transaction processing to be data-bound rather than compute-bound, and (2) these compute improvements may be incompatible with existing Ethereum nodes. ZK and optimistic rollups potentially offer a way out of this update mess that doesn't require dramatic changes to the underlying consensus system. However, they will also inherit some of the limitations above, including tradeoffs between availability and processing speed.
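To make the "simple updates require a lot of hashing" point concrete, here's a minimal sketch with a toy binary Merkle tree (Ethereum's actual structure is a Merkle-Patricia trie, which is more involved; this is just the shape of the problem):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Root of a simple binary Merkle tree over the given leaves."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Four account balances as toy "state". One simple update changes the root,
# so every node must re-hash the whole path from that leaf to the root
# (O(log n) hashes) just to agree that the new state is consistent.
state = [b"alice:100", b"bob:50", b"carol:7", b"dave:0"]
root_before = merkle_root(state)
state[1] = b"bob:49"  # a single "database update"
root_after = merkle_root(state)
print(root_before.hex() != root_after.hex())  # True
```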
This is probably the first reply I've seen that doesn't shy away from saying "yes, we're inefficient", and looking at the solutions with a critical eye.
On Ethereum there are completely automated exchanges (think Nasdaq), lending, escrow, stablecoins, zero-knowledge proof verifiers, games, payments...
Blockchains are designed to meet the most extreme requirements of availability, censorship-resistance and cost of attack. They achieve them by sitting at the extreme-replication end of the distributed-systems design space. For example, Ethereum runs on a Raspberry Pi, and every node runs the exact same things. How fast it goes is therefore limited by the slowest node you still want to be able to run the blockchain.
But what you are dismissing as simple batch processing is the application of very novel techniques (novel as in the math was invented in the last 5 years to make them possible) that allow externalizing most of the heavy computation outside the blockchain and delegating to the blockchain only the verification of a proof, thereby inheriting the security properties of the blockchain while achieving much higher scalability.
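As a loose analogy for that asymmetry (a real ZK rollup uses succinct validity proofs, which are far beyond this toy; a Merkle inclusion proof merely illustrates "check a small proof instead of redoing all the work"):

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

# "Prover" side (off-chain): holds all N leaves and builds the whole tree.
leaves = [h(f"tx-{i}".encode()) for i in range(8)]

def root_and_proof(leaves: list[bytes], index: int):
    """Return the Merkle root plus the sibling path for one leaf."""
    proof, level, idx = [], leaves[:], index
    while len(level) > 1:
        sibling_idx = idx ^ 1
        proof.append((level[sibling_idx], sibling_idx < idx))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2
    return level[0], proof

# "Verifier" side (on-chain): checks only log2(N) hashes, not all N leaves.
def verify(leaf: bytes, proof, root: bytes) -> bool:
    acc = leaf
    for sibling, sibling_is_left in proof:
        acc = h(sibling + acc) if sibling_is_left else h(acc + sibling)
    return acc == root

root, proof = root_and_proof(leaves, 5)
print(verify(leaves[5], proof, root))  # True, after checking just 3 hashes
```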
Ethereum nowadays settles the equivalent of 200 Bitcoin transactions per second, and is targeting 1-10M transactions per second by 2030.
> On Ethereum there are completely automated exchanges (think Nasdaq), lending, escrow, stablecoins, zero-knowledge proof verifiers, games, payments...
All of those "oh my god complex things" on Ethereum are a variation of "look up a single number in one account, look up a second number in the other account, add or subtract the two numbers" (see the sketch below).
Even doing a KYC + proper credit check on a client will cripple Ethereum.
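For concreteness, here's a caricature of that claim (toy code, not an actual contract; a real ERC-20 transfer adds checks, events and gas accounting, but the core state change is about this small):

```python
# The heart of a token transfer: two lookups, one subtraction, one addition.
balances = {"alice": 100, "bob": 50}

def transfer(sender: str, recipient: str, amount: int) -> None:
    if balances.get(sender, 0) < amount:
        raise ValueError("insufficient balance")
    balances[sender] -= amount
    balances[recipient] = balances.get(recipient, 0) + amount

transfer("alice", "bob", 30)
print(balances)  # {'alice': 70, 'bob': 80}
```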
> that allow externalizing most of the heavy computation outside the blockchain and delegating to the blockchain only the verification of a proof
So, batching. But with extremely complex processing that you trust the devs implemented correctly.
> Ethereum settles nowadays 200 bitcoin equivalent transactions per second.
No one cares about "equivalent transactions" to be honest. People care about actual things being handled by actual transactions they engage in.
> So what exactly is so complex that Ethereum runs that it processes stuff so slowly?
Nothing against the talented devs working at Klarna, but if they have trouble handling the load, they can always add a couple TB of memory or a couple hundred CPU cores.
Ethereum needs to solve 100k tps while continuing to run on a Raspberry Pi on a home internet connection.
> Nothing against the talented devs working at Klarna, but if they have trouble handling the load, they can always add a couple TB of memory or a couple hundred CPU cores.
Ouch. Burn :)
A lot of fat has been trimmed from Kred and it's now rammed into an AWS instance, but the pain is real :)
> Ethereum needs to solve 100k tps while continuing to run on a Raspberry Pi on a home internet connection.
To run a node you need a 2 TB disk, at least 16 GB of RAM and a quad-core CPU. So, not on a Raspberry Pi yet (and probably not ever).
And then you actually go into the docs... And: oh, 8 GB of RAM should work, but let's have a 32 GB swap. The CPU should be fine, but let's overclock it. Oh, you shouldn't use Geth, to ensure ecosystem health, but all clients except Geth require at least 16 GB of RAM. And Geth will eat all of your disk space if you're not careful.
So, for a single combination (Geth + Nimbus) it may just be possible to run this Tower of Babel on a Raspberry Pi. And... the only thing it will be doing is... nothing, really.
You are not making an effort to appreciate the opposing point of view. Being within an order of magnitude of Visa, for a young, decentralized, trustless and permissionless network, is quite an engineering feat.
How complex? -> Turing complete.
Also there is more to L2s than batching. Their aim is to maintain the security guarantees of the L1 without sacrificing availability. Plus, Zk proofs and merkle trees allow for great data compression in the rollup.
Turing completeness is actually amazing in its simplicity.
You take a pushdown automaton (a finite state automaton plus a stack) and then add the ability to read and write anywhere in the stack.
The essence of Turing completeness is the ability to take any intermediate result as an input to the computation of the next result.
This is why the halting problem is even a problem, because you need all of the intermediate results to know the end result.
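A concrete illustration: in the toy loop below, each step feeds the previous result back in as the next input, and nobody has proven whether it halts for every starting value (the Collatz conjecture), so there is no known shortcut around computing the intermediate results.

```python
# Collatz: whether this loop terminates for all n > 0 is an open problem.
def collatz_steps(n: int) -> int:
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1  # next input = previous result
        steps += 1
    return steps

print(collatz_steps(27))  # 111 -- no known way to predict this without running it
```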
Ok, now back to the topic at hand. Turing completeness on the L1 layer isn't very impressive, because every node is supposed to execute the code to validate it. The problem with L2, then, is that you must somehow avoid that: the L2 layer executes the smart contract and determines a result, and these results are batched on the L1, where the whole point is that you're not supposed to execute the contract, as otherwise there is no speedup to be gained. It is sort of a paradox.
The end result has to be verified on chain. The most promising strategy so far is to build a zkEVM, which produces a zero-knowledge proof that can be verified on L1 via a smart contract.
I still don't understand how this convinces people to buy Ethereum.
Money is supposed to be a medium of exchange because barter is expensive and time consuming. Instead of trading anything for anything we trade anything for money and money for anything. Competition is good but is Ethereum even competing?
How exactly do random wealth transfers to early adopters serve any purpose that money is supposed to be used for according to classical text books?