The Hunt For a Scalable Blockchain: Algorand, Elrond, Polka, Zilliqa, is the Third Wave Here?

After years of debate, the pen has fallen silent, with only the code now speaking. Wars of words have given way to a general understanding of what the problem is, with all now waiting to see who will provide the solution.

The problem conceptually reaches the highest level of human problems: how to prevent the capture of a decentralized, sovereign people by one man or a group of men.

The standing solution is that all have a say, albeit some do have a slightly bigger say than others, but the current set-up where everyone verifies everything does give the highest say to everyone.

That balance can easily be broken by increasing the resources required for participation, moving the system towards one where the rich have a say rather than quite everyone.

Not that blockchains are communist, although one can see them as such if one wants to. Nor is the reason that blockchains are capitalist, though again one can see them as such if one wants to.

It’s because blockchains are apolitical that this problem is so hard. Apolitical in as far as they pass no judgment on what is the best money – although people may well do – but do pass judgment on whether someone can change this money in a way other people may not like.

We are currently at the end stage of the dollar, and the euro, and perhaps all fiat money. By currently we speak with the perspective of decades, and although the analysis arguing we have entered such a stage can be seen as sound, that does not necessarily mean it is true.

Yet it might be, and if it is, it would be because one man or a very small and generally very rich group of women and men can and have manipulated the dollar and other fiat currencies almost without any constraint.

Some would say preventing the ability of a man or group to manipulate money is an almost impossible task. Yet that is precisely the task here, and a formidable one.

Formidable because while it is easy to give everyone a say, it is not easy to do so while also increasing access so that everyone is included.

That is, ensuring all can run a full node is very easy. Ensuring all can run a full node while all are also able to use the blockchain is, on the other hand, not easy at all.

First by alphabetical order, Algorand boasts of being led by “Turing award-winner Silvio Micali and a renowned team of cryptographers, engineers, and mathematicians that solved the blockchain trilemma.”

With such huge claims you’d hope they’d jump at the opportunity to inform a reporter. Alas their team was not accessible and their community appears somewhat unfriendly.

The reason may be its atrocious distribution, where only 25% of the tokens are circulating. ETH, which is constantly chastised as a pre-mine, kept about 10% of circulating supply, while here they are keeping 75% of the ‘printed’ supply. In bitcoin none was kept; all were free to mine, albeit Nakamoto, quite understandably, was more passionate than others.

Still, these are matters more for potential investors. Our interest was more in their claim that they solved the trilemma. A very big claim requiring very big scrutiny, which they declined.

So from what we can gather on our own, this appears to be a more ‘traditional’ blockchain with a 2/3rds honesty requirement, where there appears to be no sharding as such but a potentially interesting, seemingly insta-pruning technique.

We’re told we’re wrong that this is just an insta-pruning chain, perhaps because they might not appreciate that for this exercise we’re interested only in decentralized scaling. We may well be wrong, but that’s our conclusion, with any rectification of course welcomed either in the (anonymous if desired) comment section below or by email and so on.

We used the word ‘just’ above, but we don’t mean it dismissively. Pruning is a big component of scaling a blockchain and it is something even bitcoin devs are pursuing.

As such the method they are using, or are planning to use, here might be interesting, but the overall vibe is that this is more a centralized approach using traditional methods, with it remaining to be seen whether they can stand scrutiny.

Diverging here from alphabetical order so that you can better understand the next one, Polkadot can perhaps best be described as a super compressed chain.

As you can gather from an extensive interview with Gavin Wood, a former key dev of ethereum and now lead dev of Polka, the design here is what in computer terms can be described as master and slave.

You have the relay chain, or what we’ll call the master chain, where the validators reside and coordinate the communication between slave chains, or what they call parachains.

This simplification does it no justice, but if we’re focusing on scaling it makes the point that “it still falls upon a single set of nodes (the relay chain validators) to handle all the interchain message passing. This is an O(N^2) problem.”

That means the interchain message passing the relay chain validators have to handle grows with the square of the number of chains, so increased usage quickly increases the resource requirements of the network.

To reduce that resource requirement they engage in what can be called compression methods, whereby the slave chain sends only a fraction of its information to the master chain, so reducing the validation overhead.
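
To make the compression idea concrete, here is a minimal sketch in Python, purely illustrative and not Polkadot’s actual protocol, of a parachain collapsing its block into a single Merkle root so that the relay chain receives only a fixed-size commitment per chain rather than every transaction:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(txs) -> bytes:
    """Collapse a block's transactions into a single 32-byte commitment."""
    layer = [h(tx) for tx in txs] or [h(b"")]
    while len(layer) > 1:
        if len(layer) % 2:
            layer.append(layer[-1])  # duplicate the last hash on odd-sized layers
        layer = [h(layer[i] + layer[i + 1]) for i in range(0, len(layer), 2)]
    return layer[0]

# Each 'slave' chain submits only its root to the 'master' (relay) chain,
# so the single validator set checks N fixed-size headers, not N full blocks.
parachain_blocks = {
    "para-1": [b"alice->bob:5", b"bob->carol:2", b"carol->dave:1"],
    "para-2": [b"dave->erin:9"],
}
relay_inbox = {name: merkle_root(txs) for name, txs in parachain_blocks.items()}

for name, root in relay_inbox.items():
    print(name, root.hex())  # 32 bytes per chain, however large the block
```

However many transactions a parachain block holds, the master chain here receives 32 bytes per chain, which is the intuition behind calling these compression methods.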

This makes for a fairly interesting blockchain project, one that arguably quite unnecessarily adds experimental governance methods which introduce committees in a somewhat complex set-up.

If one ignores all other elements however, as of course you can freely do in the open source space since you can just fork, the scaling parts do require considerably more scrutiny if we are right to view them as just compression methods.

Elrond is an interesting new project that tries to combine the approaches of other blockchains by utilizing both pruning and sharding.

On the pruning end, history deletion appears to apply more to running nodes, in as far as a blockchain explorer would still need to be a historic node.

For sharding they take much the same approach as Polkadot, whereby you have mini-blocks from shards and a meta-chain that notarizes these blocks.
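
As a rough illustration only, with the structures below being our own assumptions rather than Elrond’s actual data types, a meta-chain notarizing shard mini-blocks could look something like this:

```python
import hashlib
from dataclasses import dataclass

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

@dataclass
class MiniBlockHeader:
    shard_id: int
    round_num: int
    tx_root: bytes  # commitment to the mini-block's transactions

def notarize(headers) -> bytes:
    """The meta-chain hashes the shard headers together into one meta-block hash."""
    payload = b"".join(
        hdr.shard_id.to_bytes(4, "big") + hdr.round_num.to_bytes(8, "big") + hdr.tx_root
        for hdr in sorted(headers, key=lambda x: x.shard_id)
    )
    return h(payload)

# Three shards produce mini-blocks in the same round; the meta-chain notarizes them
# by committing to the headers only, not to the transactions themselves.
headers = [
    MiniBlockHeader(shard_id=s, round_num=42, tx_root=h(f"shard-{s}-txs".encode()))
    for s in range(3)
]
print("meta-block hash:", notarize(headers).hex())
```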

In a conversation to be published, the Elrond devs implied their focus was max capacity in regards to the number of processable transactions, rather than measuring capacity within decentralization constraints, which can potentially be benchmarked by how many resources a node would require to chain-split as BCH or ETC did.

By that benchmark, this appears to be an interesting project, but not quite the solution. Yet the compression methods with these mini-blocks can have their uses.

For investors only 50% of the supply is circulating here, so you’d need to bear in mind the potential dilution.

Zilliqa is the only project as far as we are aware that attempts sharding in a Proof of Work blockchain, making their experience potentially useful for bitcoin.

Giving a very high level view of this blockchain, which has been running for some months, a Zilliqa rep says:

“The architecture of shard in Zilliqa comprises of a Directory Service (DS) shard and n normal shards.

The DS Committee shard is initially elected by the network, but pushes in and out nodes based on PoW, a node gets elected if it solved the PoW the fastest.

When the DS Committee is elected it will initiate the sharding process to assigns a shard to a node based on PoW submissions and some randomness.

The DS Committee helps to reconcile the states in the various shards in a DS Epoch. So while you have a sharded system, you can see the whole network as one contiguous state organized by DS Epochs.”
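
Translating that description into a toy sketch, with the committee size, shard count and assignment rule being our own assumptions rather than Zilliqa’s code, the DS committee rotation and PoW-based shard assignment could look roughly like this:

```python
import hashlib
import random

NUM_SHARDS = 4   # assumed, for illustration
DS_SIZE = 3      # assumed DS committee size

def pow_hash(node_id: str, nonce: int) -> bytes:
    return hashlib.sha256(f"{node_id}:{nonce}".encode()).digest()

# The fastest PoW solver joins the DS committee, pushing the oldest member out.
ds_committee = ["ds-a", "ds-b", "ds-c"]
fastest_solver = "node-7"
ds_committee = (ds_committee + [fastest_solver])[-DS_SIZE:]

# Remaining nodes are assigned to shards from their PoW submission plus some randomness.
epoch_randomness = random.getrandbits(64)
submissions = {f"node-{i}": pow_hash(f"node-{i}", nonce=i) for i in range(12)}

shards = {s: [] for s in range(NUM_SHARDS)}
for node, digest in submissions.items():
    shard = (int.from_bytes(digest, "big") ^ epoch_randomness) % NUM_SHARDS
    shards[shard].append(node)

print("DS committee:", ds_committee)
for s, members in shards.items():
    print(f"shard {s}:", members)
```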

Bitcoin might move towards implementing transaction parallelization with the experience of Zilliqa potentially being one live datapoint of how sharding could work in a Proof of Work environment.

Bitcoin moves very slowly however, with it still to be seen what they will implement, and when, to compress data and increase capacity.

As can be seen, all of these projects have their tradeoffs in regards to capacity and decentralization, with it being difficult to fully strike the balance, but the good news is that all of them appear to utilize compression methods which can provide more capacity.

Just how much is not clear because their benchmark tends to be transactions per second, when if that were the only requirement a database would do a far better job.

If we had to set up a benchmark, it would be emulating the blockchain at max capacity – or a reasonable capacity, like running at 1k tx/s or even 100 – for five years, to then present the resources required to chain-split it.

That is, how many gigabytes or terabytes of storage you’d need to split the blockchain BCH or ETC style, assuming expert knowledge where the code is concerned.
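
As a back-of-envelope illustration, with the 250 bytes per transaction figure being our own assumption, the raw history alone at such a benchmark would come to something like this:

```python
TX_PER_SECOND = 1_000          # the benchmark load from above
BYTES_PER_TX = 250             # assumed average transaction size
YEARS = 5
SECONDS = YEARS * 365 * 24 * 3600

total_bytes = TX_PER_SECOND * BYTES_PER_TX * SECONDS
print(f"{total_bytes / 1e12:.1f} TB of history after {YEARS} years")
# roughly 39 TB at 1k tx/s, or about 3.9 TB at 100 tx/s, before any pruning
```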

The aim is of course the maximum number of transactions per second at the lowest byte requirement for a chain-split.

And a chain-split in particular because that is the ultimate measure of decentralization, from which then derives much else, like immutability, permissionlessness, no need to trust an intermediary, unchangeable accounts and so on.

Whoever can show the most scalable blockchain then presumably gets the prize of the most people using it, because it can accommodate more functionalities, assuming they get right other far more basic and easier matters, like supply distribution.
