Tuesday, 12 November 2019

Unlimited max blocksize

Can anyone point to a consensus rule, an unofficial agreement among relevant people, or to anything else that indicates there is consensus on a max block size?

There should not be a consensus rule to be found, because, judging from the inventor's writings, Satoshi clearly thought the potential capacity was more than enough for the purpose: a p2p cash system for everybody.

There is always a limit imposed on us by nature and by the capacity of hardware and software. If a mining node running Bitcoin ABC today, for instance, receives a block larger than 32MB, the code must either discard that block or behave unpredictably (crash). The incentive is to discard it, in the hope that someone produces an alternative block that others also accept. So a test is put in to discard oversized blocks. The acceptance limit could be set lower, and the size of blocks a miner produces could be set lower still, to avoid causing problems for other nodes (an oversized block risks being rejected, wasting the effort spent producing it). So these limits will always exist in any implementation.
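The two limits described above can be sketched in a few lines. This is a hypothetical illustration, not Bitcoin ABC's actual code: the constant mirrors its 32MB default, but all names, the 8MB production limit, and the structure are assumptions made for the example.

```python
# Hypothetical node-local limits, per the argument above. The names and
# values (other than the 32 MB acceptance default) are illustrative only.
MAX_ACCEPTED_BLOCK_SIZE = 32_000_000  # discard anything above this, don't crash
MAX_PRODUCED_BLOCK_SIZE = 8_000_000   # lower self-limit for blocks we mine

def should_accept_block(block_bytes: bytes) -> bool:
    """Discard blocks over our local limit, hoping another miner
    produces a smaller alternative that other nodes also accept."""
    return len(block_bytes) <= MAX_ACCEPTED_BLOCK_SIZE

def fill_block_template(mempool_txs: list) -> list:
    """Pack transactions while staying under the production limit,
    so the work of mining the block is not wasted on rejection."""
    selected, size = [], 0
    for tx in mempool_txs:
        if size + len(tx) > MAX_PRODUCED_BLOCK_SIZE:
            break
        selected.append(tx)
        size += len(tx)
    return selected
```

Note that neither number is a consensus rule in this sketch; each node operator could set them independently, which is the point the post goes on to make.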

This does not mean there has to be a consensus rule on blocksize!

Personally, I believe we don't have one (it only looks like we do, because currently only one implementation is used for mining), and I think we are better off without such a rule. It is more organic, it means quicker progress, block size will better track available hardware, and we avoid the dreaded ossification.

submitted by /u/ErdoganTalk

source https://www.reddit.com/r/btc/comments/dv5lk9/unlimited_max_blocksize/
