EVM Runtime base token

I hear both sides of this argument, but I think we could rely on some very interesting token-engineering magic to align $NEAR and $ETH.

For example, the simplest thing could be buying $NEAR with $ETH (from gas fees) on AMMs, and then burning it or using it for ecosystem funding.

This simple mechanism aligns $ETH and $NEAR in a very simple way: Ethereum devs/folks buy and use $ETH, which indirectly adds value to $NEAR …

1 Like

At yesterday’s weekly EVM Working Group meeting, we outlined our Q1/2021 roadmap and discussed the rationale for ETH as the NEAR EVM’s base token:

5 Likes

@Arto, @alex.shevchenko, @illia. @eatmore told me that the EVM is interacting natively with NEP-21 wETH, which means the EVM interacts with the contract through JSON. JSON parsing is not well defined in practice: it depends on the JSON parser implementation, and maybe even on the specific version of serde that we import into nearcore for JSON parsing. Unfortunately, this means a not-well-defined standard is leaking into our protocol, since the EVM, being a native VM, is part of the protocol. It seems we risk breaking consensus if something in serde changes the way it parses JSON. I suggest we avoid depending on JSON in the protocol entirely and communicate with wETH using Borsh.
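To illustrate the contrast, here is a minimal, stdlib-only sketch of a Borsh-*like* fixed binary layout (this is not the actual `borsh` crate, and the transfer fields are illustrative): each value has exactly one valid byte encoding, so independent node implementations have no room to disagree on how to decode it.

```rust
// Illustrative, Borsh-like fixed-layout serialization (NOT the real `borsh`
// crate): every field has exactly one valid byte encoding, so independent
// implementations cannot disagree on how to decode it.
use std::convert::TryInto;

fn serialize_transfer(new_owner_id: &str, amount: u128) -> Vec<u8> {
    let mut out = Vec::new();
    // Strings: 4-byte little-endian length prefix, then raw UTF-8 bytes.
    out.extend_from_slice(&(new_owner_id.len() as u32).to_le_bytes());
    out.extend_from_slice(new_owner_id.as_bytes());
    // u128: fixed 16 bytes, little-endian. No "number as string" ambiguity.
    out.extend_from_slice(&amount.to_le_bytes());
    out
}

fn deserialize_transfer(bytes: &[u8]) -> Option<(String, u128)> {
    let len = u32::from_le_bytes(bytes.get(0..4)?.try_into().ok()?) as usize;
    let id = String::from_utf8(bytes.get(4..4 + len)?.to_vec()).ok()?;
    let amount = u128::from_le_bytes(bytes.get(4 + len..4 + len + 16)?.try_into().ok()?);
    // Trailing bytes are an error: exactly one byte string maps to each value.
    if bytes.len() != 4 + len + 16 {
        return None;
    }
    Some((id, amount))
}

fn main() {
    let encoded = serialize_transfer("alice.near", 100);
    // 4 (length prefix) + 10 ("alice.near") + 16 (u128) = 30 bytes, always.
    assert_eq!(encoded.len(), 30);
    assert_eq!(
        deserialize_transfer(&encoded),
        Some(("alice.near".to_string(), 100))
    );
}
```

There is no notion of "slightly invalid but recoverable" input here: a payload either matches the layout byte-for-byte or is rejected by every implementation.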

3 Likes

Sounds good to me, though I am not sure how easy it would be to switch to Borsh on the contract side.

In order to bridge ETH to NEAR, we need to create a separate connector (not the existing fungible token connector), which was discussed here.
The general idea is that the EVM precompile contract will implement the current fungible token interface (more likely NEP-141 than NEP-21), with the token connector being a single contract able to mint and burn nETH tokens. This token connector implements a direct transfer of ETH to the NEAR EVM for Ethereum users; for NEAR users, the NEAR EVM destination is keccak(accountID). This approach can then be used by any NEAR user to interact with the NEAR EVM.
I’m not sure this fully answers your question, but I’m pretty sure the scheme above can be implemented using strict serialisation protocols.

1 Like

The Fungible Token standard is what will define the protocol.
I think we have already decided that JSON will be used as the interface, so going forward, consider the Fungible Token standard part of the NEAR Protocol (this is why it lives in nomicon). Even though it is theoretically possible to change the interface later, in reality, once we have tons of various assets and value bridged, it will become as hard to update as any other protocol change (e.g. it will require sweeping upgrades across the stack).

Hence using this standard to define the ETH token inside the EVM seems reasonable.

To my understanding there are two proposed designs:

  1. EVM implements NEP-21/NEP-141 interface;
  2. ETHMinter is extracted as a Wasm contract and EVM operates directly with the state of ETHMinter. EVM itself does not implement NEP-21/NEP-141 interface. ETHMinter implements NEP-21/NEP-141 interface.

I don’t think (1) is feasible. We cannot have a JSON interface as part of the EVM, because JSON parsing is not well defined; if we make it part of the EVM, we will have a not-well-defined protocol. This is not the same as a Wasm contract having a JSON interface: in the contract’s case, the specific rules for parsing JSON are compiled into the Wasm, and all NEAR nodes, independently of how they were implemented, agree on how that Wasm should be executed.

Imagine the following scenario.

  • We are running a network, and someone does a contract call to an EVM that implements NEP-21/NEP-141 using a JSON interface. However, they make a typo in the arguments and add an extra trailing comma, e.g. { "new_owner_id": "alice.near", "amount": "100", };
  • Suppose we have three different versions of nodes running in the network: Rust implementation that uses serde_json X, slightly older Rust implementation that uses serde_json X-1, Go implementation;
  • All three versions of the node process the arguments differently and disagree on the outcome. This happens because JSON parsers handle parsing differently, and some of them even have extra logic (sometimes a lot of it) for handling slightly invalid input, like missing closing brackets. Note that if these arguments were calling a Wasm contract, the nodes wouldn’t disagree, because the JSON parser would have been compiled into the Wasm, and nodes do not disagree on how to execute Wasm.
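A toy sketch of the divergence (these are hand-rolled stand-ins, not serde_json or any real parser): two "parsers" that differ only in whether they tolerate a trailing comma, which is exactly the kind of implementation-specific leniency that would split consensus if different node versions shipped different ones.

```rust
// Toy illustration (NOT serde_json): two stand-in "parsers" that differ only
// in how they treat a trailing comma before a closing brace.
fn strict_accepts(json: &str) -> bool {
    // A strict parser rejects `,}`, per RFC 8259.
    !json.contains(",}") && !json.contains(", }")
}

fn lenient_accepts(json: &str) -> bool {
    // A "forgiving" parser silently drops the trailing comma and succeeds.
    strict_accepts(&json.replace(",}", "}").replace(", }", "}"))
}

fn main() {
    // The typo'd arguments from the scenario above.
    let args = r#"{"new_owner_id": "alice.near", "amount": "100",}"#;
    // Node A (strict) fails the call; node B (lenient) executes the transfer:
    // the two nodes now disagree on the state transition.
    assert!(!strict_accepts(args));
    assert!(lenient_accepts(args));
    println!("nodes disagree: {}", strict_accepts(args) != lenient_accepts(args));
}
```

With a fixed binary encoding there is no such discretionary "repair" step, so this class of divergence cannot arise.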

The security consequences of such disagreement:

  • Right now, since we don’t have slashing, the network would split, and the split would not be easy to recover from. We are not going to ask the maintainers of a Go NEAR node to quickly rewrite their JSON parsing library to perfectly mimic serde_json, and even if they do, it will take days;
  • If we introduce challenges later, the validators will be slashed.

The long-term maintenance consequence is that we would have to make the exact implementation of serde_json version X part of our protocol spec and make all alternative implementations perfectly mimic its JSON parsing logic, which is not going to be easy. serde_json has a lot of logic that attempts to make it robust and forgiving, and that logic might change from version to version.

Even if we don’t have alternative implementations, it is a maintenance nightmare.
serde is 32k LOC; serde_json is 15k LOC. Imagine trying to keep an eye on every single change that tweaks its robustness in minor ways. Imagine someone bumps the minor version of serde_json and accidentally alters the robustness logic. Also, suppose that at some point we want to bump the serde_json version: that will be a protocol change, and we would have to import both versions of serde_json into the NEAR node code to make sure we can validate old blocks. It took @birchmd weeks to introduce versioning into some chain structures, because we did not do it before Mainnet launch, and now we have to maintain the old unversioned legacy code, even though the difference is just a single byte. With serde_json in our protocol, its maintenance will constantly consume our time, and we will be spending energy on creating sophisticated ways to work around it.

Besides, it is almost certain that we would never have a fully exhaustive protocol specification. CC @alexatnear , @Bowen

1 Like

I’m not deep into the Runtime. How do smart contract calls work at the API level? How is the transaction serialized over the wire? Are you using a JSON API for communicating with a node? I think the JSON API is not a problem, because it is not part of the blockchain; it’s just communication with a dapp / wallet. In the worst case, a node fails to decode a transaction and does not include it in the block. So, more importantly, the transaction serialization in the block must be well defined.

As for the smart-contract, the current approach is:

  1. serialize numbers as strings;
  2. serialize bytes as base64 strings (or whatever the smart contract expects);
  3. serialize complex objects using JSON, keeping the two conditions above.

The FT smart contract (NEP-141) doesn’t take complex objects as arguments.
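The conventions above can be sketched in a few lines (the field names `receiver_id` and `msg` are illustrative NEP-141-style argument names, and the base64 encoder is a minimal hand-rolled one, shown only to make the example self-contained):

```rust
// Sketch of the argument-serialization conventions: u128 amounts as decimal
// strings (too big for f64/JS numbers), raw bytes as base64.
const TABLE: &[u8; 64] =
    b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

// Minimal standard base64 encoder, included only to keep the sketch
// self-contained; a contract would use a library implementation.
fn base64_encode(data: &[u8]) -> String {
    let mut out = String::new();
    for chunk in data.chunks(3) {
        let b = [chunk[0], *chunk.get(1).unwrap_or(&0), *chunk.get(2).unwrap_or(&0)];
        let n = ((b[0] as u32) << 16) | ((b[1] as u32) << 8) | b[2] as u32;
        out.push(TABLE[(n >> 18) as usize & 63] as char);
        out.push(TABLE[(n >> 12) as usize & 63] as char);
        out.push(if chunk.len() > 1 { TABLE[(n >> 6) as usize & 63] as char } else { '=' });
        out.push(if chunk.len() > 2 { TABLE[n as usize & 63] as char } else { '=' });
    }
    out
}

fn main() {
    // 10^24: a yoctoNEAR-scale amount that cannot be represented exactly
    // as a JSON/JS number, hence convention 1 (numbers as strings).
    let amount: u128 = 1_000_000_000_000_000_000_000_000;
    let msg_bytes = b"hi"; // convention 2: arbitrary bytes go through base64
    let args = format!(
        r#"{{"receiver_id": "alice.near", "amount": "{}", "msg": "{}"}}"#,
        amount,
        base64_encode(msg_bytes)
    );
    println!("{}", args);
}
```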

That’s a very valid point, @nearmax; thanks for bringing this up.

Though, we don’t need the super complex argument parser that serde offers (for example, here is a 300-line JSON parser, which we should be using for our arg parsing anyway), so we can define a specific NEP-141 argument parser for this case. Again, Fungible Token is the protocol standard and pretty much defines how all the tooling should work. Having inconsistent parsers across contracts would not be as terrible as a network split, but from a user-experience perspective it would be just as bad.

I’m not sure what you mean by “EVM operates directly with the state of ETHMinter”. Does this mean we guarantee these contracts are always on the same shard? I don’t think we have such facilities at this point, or were planning to build them.

An alternative design that I would love to think through and measure the performance of is to move all the EVM glue code back into “contract” land and keep just an execute_evm(bytecode) env function. This would allow pushing almost everything into “developer” land and supporting various versions of contract upgradability.
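As a rough sketch of what that split could look like (all names and signatures here are hypothetical, not the actual nearcore API): the protocol exposes a single low-level EVM primitive, and all the glue lives in ordinary, upgradable contract code on top of it.

```rust
// Hypothetical sketch of the "EVM glue in contract land" design: the protocol
// exposes only one low-level host function, and everything else (ABI parsing,
// token accounting, upgrades) lives in ordinary Wasm contract code.
// Names and signatures are illustrative, NOT the actual nearcore API.
trait EvmHost {
    /// The single protocol-level primitive: run EVM bytecode against some
    /// input and return the raw output. Gas accounting omitted for brevity.
    fn execute_evm(&mut self, bytecode: &[u8], input: &[u8]) -> Vec<u8>;
}

/// A trivial mock host, just to show the call shape.
struct MockHost;

impl EvmHost for MockHost {
    fn execute_evm(&mut self, _bytecode: &[u8], input: &[u8]) -> Vec<u8> {
        input.to_vec() // a real host would run an EVM interpreter here
    }
}

/// "Contract land": glue that would otherwise sit in the protocol now lives
/// in upgradable Wasm, calling down through the one host function.
fn contract_entry_point(host: &mut impl EvmHost, call_data: &[u8]) -> Vec<u8> {
    let bytecode = b"\x60\x00"; // placeholder contract bytecode
    host.execute_evm(bytecode, call_data)
}

fn main() {
    let mut host = MockHost;
    let out = contract_entry_point(&mut host, b"transfer-args");
    assert_eq!(out, b"transfer-args");
}
```

The appeal of this shape is that upgrading the glue (including any argument-parsing rules) is a contract deployment, not a protocol change.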

One of the main issues with this approach is how to handle storage staking. Relayers will need to charge users back for the storage they are paying for, which is currently substantially more expensive than on Ethereum and may be hard for a frontend to display.

Let me and @Arto think through this more.

3 Likes

@illia Thank you for slowing down and considering alternatives.

That would be best; I strongly support this option.

We can even make the evm account special in that it has some EVM-related host functions exposed, while everything else about it behaves just like a regular account, including the ability to deploy any Wasm code. Then we don’t need separate evm and EthMinter accounts.

I’m not sure what you mean “EVM operates directly with the state of ETHMinter”? Does this mean that we guarantee these contracts are always on the same shard? I don’t think that at this point we have such facilities or were planning to build them.

Yes, if we can special-case our runtime for the EVM, then we can also special-case some subtree in the MPT as non-divisible. We have to declare the entire subtree that corresponds to a single account as non-divisible anyway, so it is not really a kludge, just a special case similar to what we do with implicit accounts.

Though, we don’t need super complex arguments parser that serde offers (for example here is 300 line JSON parser, which we should be using for our arg parsing anyway), so we can do define a specific NEP-141 argument parser in this case.

In the past I tried to simplify serde_json to reduce its size, and it’s really not that easy. Even tiny_json is 16k LOC. Writing a correct JSON parser is not that easy.

Again, Fungible Token is the protocol standard and pretty much will define how all the tooling should work.

I would disagree: Fungible Token is a standard, but not a protocol. When a standard becomes more socially accepted, or even vouched for, promoted, recommended, or made official by the NEAR Collective, it still does not become the protocol. Standards (token standards, RPC standards, even network message standards) live in user-land; the protocol lives in system-land.

I would also like to point out that this is an instance of us trying to move too fast. Even if we somehow prevent future issues with network splits or slashing, it will be a maintenance hurdle. We are already at the point where some people spend a good portion of their time working around things that exist because we moved too fast at some stage. One of the reasons we’ve been able to keep building on top of the runtime code without significant slowdown is that we keep the runtime code clean :slight_smile: Let’s keep it this way so that we can continue moving fast.

3 Likes

@robert that’s the problem: the proposal in question suggests making a specific JSON parser implementation part of the blockchain protocol.

I linked to a 300 LOC parser. We have a 300 LOC parser in AS (AssemblyScript) as well.

I would argue that it makes sense to define a specific set of rules for the JSON that works inside smart contracts, to avoid potential problems down the line where UIs work for some contracts and fail with completely unreadable JSON errors for others.

If you want to argue about the semantics of wording, here is a definition of “protocol” that totally matches the Fungible Token standard or ERC-20:

To be clear, ERC-20 is an even more important “protocol” in Ethereum than, for example, the definition of network messages. Changing network messages is reasonably doable. Changing ERC-20 is impossible: you can only build a superseding version (like NEAR superseding it with its own token protocol), but the old ERC-20 will continue to exist for decades to come.

It is the same with RPC: there are thousands of applications now using these specific RPC endpoints in Ethereum. Changing them is possible but would require a long deprecation and coordination process.

IMO, dividing “user-land” and “system-land” like that is the easiest way to build something that no one wants to use :frowning: The fact is that the things users actually interact with are more important than the things under the hood.

7 Likes

I accidentally deleted @Bowen 's post where he mentions that he also thinks having JSON in the protocol is questionable. Sorry @Bowen

1 Like

JSON is a standard: RFC 8259.

While not the focus of the conversation here, it would be incredibly useful to support JSON for other reasons beyond ERC token support.

JSON is the basis of Verifiable Credentials and confidential storage which are emerging data standards. The ability to leverage this off-chain JSON data for inputs into smart contracts will be incredibly powerful.

1 Like

I would like to make a short summary here:

  1. The NEAR EVM Runtime will use bridged ETH (nETH) as its base token: this decision has been made.
  2. Implementation details are unclear at this point: there are concerns regarding JSON parsing at the protocol level, the necessity of protocol upgrades for all EVM upgrades, and performance.
  3. To decide on the implementation, additional research and discussions have started; see here
6 Likes