Pieter Wuille’s Solution to Bitcoin Scaling: Put a Block in a Paper

Scalability is all anyone talks about because it is the biggest open problem, and a solution to it would have considerable ramifications not just in the crypto space but across the entire field of computer science.

Pieter Wuille, Bitcoin Core developer and Blockstream co-founder, is intimately familiar with the topic, having discussed and thought about it for at least five years.

In a wide-ranging conversation on sidechains, sharding, and much else (to be published), he said his original interest in sidechains wasn’t because of scaling “but to enable experimentation with *different* rules without needing to introduce a new currency first.”

“Somehow lots of people instead took it as a mechanism for scaling, which I think is flawed – if that’s desirable, increasing mainchain capacity is a far more trust-minimized solution to achieve the same thing, with the same costs.”

How to increase that capacity is of course the question, with Wuille stating that in his view “at some point” capacity should be increased but “I don’t think there is any urgency.”

“Technology improves. No reason why our capacity to distribute and verify can’t increase as well,” Wuille argued, with his solution to the problem of ever-growing data being “human consensus.” He said:

“At some timescales, human consensus works better than technology.

If you’re able to say, get a good way to verify what the state of the network was 5 years ago (have its hash printed in every newspaper, whatever…), you can bootstrap from that point, without needing to download all history before it.”
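To make that concrete, here is a toy sketch in Python of what bootstrapping off a published hash could look like. The serialization and function names are assumptions for illustration, not any actual Bitcoin Core interface:

```python
import hashlib

def state_hash(serialized_state: bytes) -> str:
    # Hash of the serialized network state at some past height
    # (the exact serialization here is a hypothetical stand-in).
    return hashlib.sha256(serialized_state).hexdigest()

def matches_published(serialized_state: bytes, published_hash: str) -> bool:
    # Compare against a hash the user found out of band, e.g. one
    # printed in a newspaper five years ago.
    return state_hash(serialized_state) == published_hash
```

A node that trusts such an out-of-band hash can start validating from that state onward, rather than replaying all the history before it.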

Wuille previously proposed a 17.7% yearly increase of the block size back in 2015, with his proposal supported by Gregory Maxwell and other prominent developers.
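For a rough sense of what such compounding implies (a back-of-the-envelope calculation, not the exact schedule of the 2015 proposal, which compounded in smaller steps), a 17.7% yearly increase doubles capacity roughly every 4.3 years:

```python
import math

BASE_MB = 1.0   # assumed 1 MB starting block size, for illustration
RATE = 0.177    # 17.7% per year

for years in (1, 5, 10, 20):
    size = BASE_MB * (1 + RATE) ** years
    print(f"after {years:2d} years: {size:6.2f} MB")

# Doubling time: ln(2) / ln(1.177) is about 4.3 years
print("doubling time:", math.log(2) / math.log(1 + RATE), "years")
```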

He still thinks a “couple % per year maybe” is how capacity can be increased, with his response to concerns regarding an increase in storage requirements being:

“If you’re specifically talking about the problem of needing to download and verify an ever-increasing dataset when initially joining the network, I think that one of the solutions is having publicly verifiable commitments to history, and bootstrapping off those. Those commitments can take many forms…

Your software shows you the hash it downloaded off the network, and asks you to compare it with what you found elsewhere. Or it’s even embedded in the software. Look at the assumeutxo project.”

Assumeutxo, simplistically, can be described as follows: those who publish the bitcoin builds (the node software you download unless you want to compile it from source yourself) can hardcode the latest block hash. For assumevalid, “you assume all the blocks in the chain that ends in that hash, that those transactions have valid scripts,” while for assuming utxos “you cache the validity of the particular UTXO set.”
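Here is a minimal sketch of those two assumptions under simplified data structures; the names, hashes, and serialization below are illustrative placeholders, not Bitcoin Core’s actual code:

```python
import hashlib

# Values hardcoded by whoever publishes the build
# (hypothetical placeholders, not real hashes).
ASSUMEVALID_BLOCK_HASH = "0000000000000000000abc"
ASSUMEUTXO_SNAPSHOT_HASH = "9b2a0d4c"

def should_verify_scripts(block_hash: str, chain: list) -> bool:
    """assumevalid: skip expensive script checks for blocks at or
    before the hardcoded hash; fully verify everything after it.
    `chain` is a list of block hashes in height order."""
    if ASSUMEVALID_BLOCK_HASH in chain:
        return chain.index(block_hash) > chain.index(ASSUMEVALID_BLOCK_HASH)
    return True  # hardcoded hash not in this chain: verify everything

def load_utxo_snapshot(snapshot: bytes) -> bytes:
    """assumeutxo: accept a downloaded UTXO set only if it hashes to
    the committed value (real snapshots use a different commitment)."""
    if hashlib.sha256(snapshot).hexdigest() != ASSUMEUTXO_SNAPSHOT_HASH:
        raise ValueError("snapshot does not match the committed hash")
    return snapshot
```

Either way, the node fully validates everything from the trusted point forward; the assumption only replaces the replay of history that the publisher, and anyone checking the published hash, has already verified.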

Anyone can compute these commitments themselves, and if you wanted to you could even publish them in the paper of record, with the rest of history then archived only by those who want to archive it.

That would effectively solve the storage problem, or the node syncing problem, leaving potentially only bandwidth constraints, and download and upload speeds continue to increase considerably.
