The Limits to Blockchain Scalability on vitalik.ca

At the end of the article Vitalik writes this:

Quantifying this risk is easy. Take the blockchain's data capacity in MB/sec, and multiply by ~30 to get the amount of data stored in terabytes per year. The current sharding plan has a data capacity of ~1.3 MB/sec, so about 40 TB/year. If that is increased by 10x, this becomes 400 TB/year. If we want the data to be not just accessible, but accessible conveniently, we would also need metadata (eg. decompressing rollup transactions), so make that 4 petabytes per year, or 40 petabytes after a decade. The Internet Archive uses 50 petabytes. So that's a reasonable upper bound for how large a sharded blockchain can safely get.

Hence, it looks like on both of these dimensions, the Ethereum sharding design is actually already roughly targeted fairly close to reasonable maximum safe values. The constants can be increased a little bit, but not too much.
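To sanity-check the figures in the quote, here is a rough back-of-the-envelope sketch in Python. The 1.3 MB/sec data rate, the 10x capacity increase, and the ~10x metadata overhead all come straight from the quote; nothing else is assumed.

```python
# Back-of-the-envelope check of the storage figures quoted above.
# Inputs taken from the quote: ~1.3 MB/sec data capacity, a possible
# 10x increase, and a further ~10x for convenience metadata.

SECONDS_PER_YEAR = 365 * 24 * 3600  # ~31.5 million seconds, hence the "~30" factor

def tb_per_year(mb_per_sec: float) -> float:
    """Convert a data rate in MB/sec to stored data in TB/year (1 TB = 1e6 MB)."""
    return mb_per_sec * SECONDS_PER_YEAR / 1e6

base = tb_per_year(1.3)                 # ~41 TB/year ("about 40 TB/year")
scaled = base * 10                      # ~410 TB/year after a 10x increase
with_metadata_pb = scaled * 10 / 1000   # ~4 PB/year once metadata is included
after_decade_pb = with_metadata_pb * 10 # ~40 PB accumulated after a decade

print(f"{base:.0f} TB/yr base, {scaled:.0f} TB/yr at 10x, "
      f"{with_metadata_pb:.1f} PB/yr with metadata, "
      f"{after_decade_pb:.0f} PB after a decade")
```

The numbers do line up with the quote, which is what makes the scale of the storage requirement hard for me to square with running a full node at home.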

I don't really understand how this is going to work. If the blockchain gets this big, then only archive.org or other large providers will be able to run full nodes. I'm sure there's something I'm missing, but I'm not sure what it is. Is it actually planned that in Ethereum's future you'll need petabyte-scale storage to run a full node?

submitted by /u/aemmeroli