Analysis Paralysis

I’ve been accused of being too flippant about increasing the block size limit. This series of blog posts is meant to show that I’m not — that I have carefully thought about the risks and benefits. I stepped back from the role of lead committer exactly so I would have the time to think about bigger-picture issues like this one. Today I’d like to address, head-on, this argument against changing the one-megabyte block size limit:

Larger-than-one-megabyte blocks have had insufficient testing and/or insufficient research into economic implications and/or insufficient security review of the risks versus benefits.

This is tough to respond to, because there can always be more testing or research, especially for a security-critical project like Bitcoin. It is easy to suffer from “analysis paralysis,” and I think the Core Bitcoin project has been suffering from analysis paralysis over the block size issue for at least three years now.

I’m convinced the uncertainty over when, or if, this will be resolved is harming Bitcoin. If somebody can point me at a successful software technology that went through years and years of debate and research and was not deployed until it was perfect, I’d change my mind. The example that immediately comes to my mind is Project Xanadu versus the Internet.

I don’t think we should adopt the Silicon Valley mantra of “move fast and break things.” But I do think we need to move: “stay still and watch things break” is just as bad.
