Increase the number of block producers

Background

There is ongoing work on changing the validator selection algorithm to introduce chunk producers, thereby increasing the total number of validators in the network. This post, however, focuses on a somewhat different approach that would allow many more block producers.

Motivation

Today the number of seats for block producers is limited to a rather small number, which is not great for the decentralization of the network. This post presents an idea that, if implemented, would increase the number of possible block producers in the network.

Idea

Today we limit ourselves to a relatively small number of validators mostly because:

  • Reed-Solomon code computation becomes slow with many parts.
  • Distributing chunk parts to thousands of validators may incur too much delay and cause some validators to fall behind.
  • Approvals take time to collect.
  • Verifying a large number of approvals may be slow.

However, we don’t necessarily need to assign a part to every block producer. Instead, for each block we can sample the set of block approvers from the entire block producer set and only assign a part to each sampled approver. We can use the block VRF output as the randomness seed for sampling. The exact algorithm is yet to be determined: for example, we could always fix the validators holding the top 2/3 of the stake and sample a fixed number of additional block approvers from the rest, or we could simply sample a fixed total number of block approvers according to the stake distribution. Notice that the condition for block production does not change: we still need more than 2/3 of approvals (weighted by stake) to produce a block, so consensus is not affected.
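As a concrete illustration of the first variant (fixing the validators holding the top 2/3 of stake and sampling a fixed number of extra approvers from the tail), here is a minimal Python sketch. The function name, the `extra_approvers` parameter, and hashing the VRF output with SHA-256 to seed the sampler are all hypothetical choices for illustration, not the actual protocol.

```python
import hashlib
import random

def select_block_approvers(validators, vrf_output, extra_approvers=20):
    """Sketch only. `validators` is a list of (account_id, stake) pairs.
    Fix every validator in the top 2/3 of total stake, then sample a
    fixed number of extra approvers from the tail, weighted by stake.
    The block VRF output seeds the sampler, so the result is
    deterministic given the block."""
    total = sum(stake for _, stake in validators)
    ordered = sorted(validators, key=lambda v: v[1], reverse=True)

    # Always include the largest validators until 2/3 of stake is covered.
    fixed, covered, i = [], 0, 0
    while i < len(ordered) and covered * 3 < total * 2:
        fixed.append(ordered[i])
        covered += ordered[i][1]
        i += 1
    tail = ordered[i:]

    # Deterministic seed derived from the block VRF output (assumption).
    rng = random.Random(hashlib.sha256(vrf_output).digest())

    # Stake-weighted sampling from the tail (with replacement, deduplicated).
    sampled = set()
    while tail and len(sampled) < min(extra_approvers, len(tail)):
        pick = rng.choices(range(len(tail)), weights=[s for _, s in tail])[0]
        sampled.add(pick)
    return fixed + [tail[j] for j in sorted(sampled)]
```

Because the seed comes from the VRF output, every node computes the same approver set for a given block without extra communication.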

In most existing proof-of-stake blockchains, a few validators own the majority of the stake, so we assume here that the stake distribution has a long tail. Under that assumption, the number of parts a chunk producer needs to compute and the number of approvals required are bounded by a relatively small number, and the issues mentioned above won’t create too much trouble.
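As a rough sanity check of the long-tail assumption (with hypothetical numbers, not real network data), the following sketch counts how many of the largest validators are needed to cover 2/3 of total stake under a Zipf-like distribution:

```python
def top_two_thirds_count(stakes):
    """Count how many of the largest validators cover 2/3 of total stake."""
    stakes = sorted(stakes, reverse=True)
    total = sum(stakes)
    covered, count = 0.0, 0
    for s in stakes:
        covered += s
        count += 1
        if covered * 3 >= total * 2:
            break
    return count

# Zipf-like long tail: validator k holds stake proportional to 1/k.
stakes = [1.0 / k for k in range(1, 1001)]
print(top_two_thirds_count(stakes))
```

Under this distribution, fewer than a hundred of the 1000 validators cover 2/3 of the stake, so the fixed part of the approver set, and with it the number of erasure-coded parts and approvals, stays small.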

Block producers will be selected based on their stake, similar to what is proposed in the new validator selection algorithm. To prevent abuse of proposals, we can set a lower bound, say 10,000 NEAR, on the amount of stake required.

Drawbacks

  • The likelihood of the network stalling increases: it could be that more than 2/3 of the stake is online, yet no block can be produced because some selected block approver is offline. While this is certainly a possibility, if we always fix the top 2/3 of the stake in the sample, it does not have much impact in practice.
  • Validators can split their stake. In the worst case, where everyone stakes the same amount, this scheme does not help. We can set an upper bound on the number of validators per epoch, or even compute the bound dynamically based on the staking proposals received. In addition, since we do not change how validator rewards work, it is not clear why a validator would want to split their stake.

Alternatives

Another idea is to make the data availability providers a subset of the block approvers, since the main bottleneck seems to be erasure code computation and chunk distribution. More concretely, we can require that there be a fixed number of data availability providers for each block and that a sufficient number of them approve the block to guarantee data availability for chunks. Since data availability for each block is independent and the providers are sampled randomly, even though the total stake backing data availability will be smaller, it is still difficult for an adversary to corrupt enough block producers to undermine the data availability guarantees.
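Under this alternative, block production would check two conditions: the usual stake-weighted >2/3 approval threshold, plus a minimum count of approvals from the sampled data availability providers. A sketch of that combined check (the function name and the DA threshold are hypothetical):

```python
def can_produce_block(approvals, stakes, da_providers, da_threshold):
    """Sketch only. approvals: set of approver ids for this block;
    stakes: id -> stake; da_providers: the data availability providers
    sampled for this block; da_threshold: minimum DA approvals needed."""
    total = sum(stakes.values())
    approved = sum(stakes[a] for a in approvals)
    da_approved = sum(1 for a in approvals if a in da_providers)
    # The consensus condition is unchanged (>2/3 of stake);
    # data availability only adds a count check on the sampled providers.
    return approved * 3 > total * 2 and da_approved >= da_threshold
```

Note that the stake condition is exactly today's rule; only the DA count check is layered on top, which is why this variant leaves consensus untouched.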


A summary of the discussion today on this topic:

  • @alexatnear suggested an idea to circumvent the problem of newcomers having no visibility: we could enlarge the validator set to include people who make a proposal but do not get selected, so that delegators could know whom they could delegate to outside of the active block producer set. We could even let people who make a proposal receive rewards for some number of epochs before they get kicked out, so that the initial delegators are incentivized. Of course, we need to set some minimal threshold to avoid spamming with proposals. As @nearmax pointed out, this may reduce the total stake that secures the network, but it is not clear whether it will make a material difference.
  • @eatmore mentioned that we could change the staking pool contract to allow “fallbacks”, i.e., a delegator could specify the primary staking pool they want to delegate to along with some fallback pool to use if the primary one is not a validator; when the primary one gathers enough stake to become a validator, the stake could be moved back to it. However, this is not easy to implement, and it does not solve the discoverability issue: new pools that are not validators cannot be effectively displayed to delegators.
  • Regarding what is proposed in this thread, I think the second option is more appealing, as it is less intrusive and entirely preserves the consensus properties we have today. It also somewhat decouples the data availability layer from the consensus layer, which may be beneficial for future changes as well.

The Metapool project is a superset of the “fallback” approach, allowing delegators’ funds to be spread among whitelisted or governed pools.

I don’t think the direct fallback approach makes that much sense: users already struggle with choosing one validator, so choosing two or three fallback validators is even harder.

Generally, I think allowing anyone with at least 5k $NEAR to join the active set and start receiving rewards should be sufficient given the current NEAR price (as the price appreciates, we can consider lowering this threshold via a protocol upgrade). Others can then delegate to these validators as soon as they are slotted to become active validators.

The Metapool approach could then help decentralize delegations further toward these new validators on behalf of users.

There will also soon be more ways to come by $NEAR, e.g. via lending, or a validator bootstrapping DAO could vote to stake the initial 5k for an applicant and hold it there until they receive their own delegations.

Ideally, with this approach we should aim for at least 1k active validators. Obviously the majority of stake will still be parked with the top validators, but a low threshold will allow projects to become validators and operate on the rewards (see discussion here: NEAR Ecosystem Treasury DAO - #39 by galaxyonline).


Are you referring to the approach proposed by @alexatnear here?


I wasn’t referring to any specific approach, just the general point that a 5k $NEAR minimum is a good start for onboarding a large number of community members who will be able to actively participate as validators.


I also think that NEAR needs more validators, which would greatly contribute to the decentralization of the network.
I would love to run a validator node in Turkey if I had enough assets.
